Precompiled Headers. What is Best Practice?

John Carter john.carter@tait.co.nz
Thu Sep 6 00:06:00 GMT 2007


Precompiled headers sound really tasty... I've read the docs three
times, but I'm not sure I've digested them enough to attempt a large
implementation. So I'm looking for some practical / standard-practice
guidance here.

I write the build scripts for a very large body of C and C++ code
being worked on by about 20 blokes.

I'm always trying to trim a few seconds off the 2-hour compile time
it takes to compile & link all variants...

I build about 6 product variants off the same code line. Thus I'd
estimate that around 75% of the time the .gch files for different
product variants _should_ differ. (Differently defined macros, #if's,
etc.)

Thus storing them in the same directory as the .h files seems a
non-starter.
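
If I've read the docs right, that means building one .gch per variant,
along these lines (the paths and macro names here are invented purely
for illustration):

    # one precompiled copy of the same header per product variant
    g++ -DVARIANT_A -x c++-header src/inc/Blah.h -o gch_dir/VARIANT_A/Blah.h.gch
    g++ -DVARIANT_B -x c++-header src/inc/Blah.h -o gch_dir/VARIANT_B/Blah.h.gch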


The docs say... "If the precompiled header file can't
be used, it is ignored."

Hmm. What does that mean? Does it mean...
  * If using it would produce incorrect results, it isn't used.

     or

  * A single macro differs, so there is potentially a problem, so it
    isn't used.

Again the docs say...

    Alternatively, you might decide to put the precompiled header file
    in a directory and use `-I' to ensure that directory is searched
    before (or instead of) the directory containing the original
    header.  Then, if you want to check that the precompiled header
    file is always used, you can put a file of the same name as the
    original header in this directory containing an `#error' command.

Hmm. Has anybody come up with a standard pattern for doing that? e.g.
  gch_dir/VARIANT_N/Path/To/Inc/Blah.h.gch
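
Concretely, the layout I have in mind would look something like this
(a sketch only; the #error stub trick is straight from the docs
passage above):

    gch_dir/VARIANT_A/Blah.h        stub containing only: #error "PCH for Blah.h not used"
    gch_dir/VARIANT_A/Blah.h.gch    the precompiled header built for this variant

    g++ -DVARIANT_A -Igch_dir/VARIANT_A -Isrc/inc -c foo.cpp

The compiler searches gch_dir/VARIANT_A first, so it takes the .gch
when it is usable; when it isn't, it falls back to reading the stub,
and the #error fails the build loudly rather than silently compiling
slowly.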


This rule also takes some digesting...
    * Only one precompiled header can be used in a particular
      compilation.

    * A precompiled header can't be used once the first C token is seen.
      You can have preprocessor directives before a precompiled header;
      you can even include a precompiled header from inside another
      header, so long as there are no C tokens before the `#include'.
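
If I've understood that correctly, only an arrangement like this
actually pays off (Everything.h being a hypothetical fat common
header):

    // foo.cpp -- the precompiled candidate must come first
    #include "Everything.h"  // eligible: no C tokens seen yet
    #include "Local.h"       // too late for a .gch: only one PCH is
                             // allowed per compilation

    int answer() { return 42; }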

Do you go around making sure that the (recursively) fattest .h is at
the start?  Or do you just hope that some speed-up is better than
nothing?

Does this mean that, since there is only ever any point in
precompiling the first #include, to actually speed things up your
dependency-scanning magic needs to work out which headers appear as
the first #include in more than N places, and only precompile those
where N is greater than 2? 3? ...?
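
As a first stab at that scanning, I suppose even something this crude
would produce the candidate list (GNU grep's -m1 stops after the first
match in each file; comments or conditionals before the first #include
will fool it):

    grep -h -m1 '^[[:space:]]*#include' src/*.cpp | sort | uniq -c | sort -rn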

I'm starting to wonder whether it is worth the effort of doing this....

Does anybody have any benchmarks available? e.g.
  * Speedup per line of C/C++ header code.
  * System wide speedup if I didn't alter a line of C/C++ code.
  * System wide speedup if I moved the recursively fattest #include to the start of every file.

I once wrote a script that worked out the minimal set of #include's
needed to successfully compile.  As a result I deleted about 2000
#includes, sped up compilation by a factor of about 1.5-2, and
substantially reduced the accidental coupling that was causing pain in
refactoring.
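
The core of that script was no fancier than this sort of loop
(reconstructed from memory, so the details are invented; note it tests
one removal at a time and so misses interactions between #includes):

    # for each #include line, try compiling without it
    for n in $(grep -n '^[[:space:]]*#include' foo.cpp | cut -d: -f1); do
      sed "${n}d" foo.cpp > /tmp/candidate.cpp
      if g++ -fsyntax-only -Isrc/inc /tmp/candidate.cpp 2>/dev/null; then
        echo "foo.cpp line $n: #include appears unnecessary"
      fi
    done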





John Carter                             Phone : (64)(3) 358 6639
Tait Electronics                        Fax   : (64)(3) 359 4632
PO Box 1645 Christchurch                Email : john.carter@tait.co.nz
New Zealand


