This is the mail archive of the mailing list for the GCC project.
Re: Problem with PFE approach [Was: Faster compilation speed]
- From: "Timothy J. Wood" <tjw at omnigroup dot com>
- To: Stan Shebs <shebs at apple dot com>
- Cc: Devang Patel <dpatel at apple dot com>, Mike Stump <mrs at apple dot com>, gcc at gcc dot gnu dot org
- Date: Mon, 19 Aug 2002 08:52:22 -0700
- Subject: Re: Problem with PFE approach [Was: Faster compilation speed]
On Monday, August 19, 2002, at 07:05 AM, Stan Shebs wrote:
It sounds to me like you're favoring a revival of the old NeXT system
of precomps that were systemwide and per-include file. That system, even
after years of tuning and tweaking, tended to top out at about a 4X
speedup, while PFE precomps are at 6X after just a few months of work.
(Admittedly, the NeXT scheme is based on a separate preprocessor, which
limits its effectiveness.)
I'm not really in favor of exactly the approach NeXT took. Instead,
I'm in favor of the compiler and IDE precompiling EVERY header that
gets seen (ok, maybe not every one, but all the ones that make sense
for some reasonable value of 'sense'). Also, these headers should be
cached in the person's build directory.
So, just like you have a .o cached for a .c file, you should have a
.p file cached for a .h file (or whatever suffix).
Hopefully this would be faster than the NeXT implementation due to
(a) hitting more headers and (b) being integrated into the compiler
proper rather than split into a separate preprocessor, as you say.
Additionally, this fixes the invalid precomps problem that NeXT had
(since each user will end up caching their own precomp for the system
headers when they build their project).
Of course, the big problem I see with my suggestion is that you'd
potentially end up with a huge number of precompiled header files. You
could limit this by only precompiling headers that were included in an
actual source file. But this would mean that if A.h and B.h both
include C.h, and [AB].h are included in some source file, the precomps
for both would contain a copy of C.h. This would use extra space on disk
(and you'd have to be able to skip spurious duplicate includes).
Yes, I'm amazingly happy with PFE so far.
But as a practical thing, a 6X speedup in the compiler so radically
changes what you can do day-to-day, that it's worth some effort and
some process change to accommodate it. CW precomps have all the flaws
you're pointing out, and yet CW users are pretty happy with it; by
editing their prefix file, they can adjust their one precomp to include
more or fewer of their own headers, depending on whether a header is
stable or not, and can do this at any point during development.
Sometimes the compiler will do too much recompiling, but who cares if
it only takes a minute to completely rebuild a big project?
I wonder if the CW users have just gotten used to the broken
dependency checking -- I'm guessing they are more 'resigned and mostly
content' than 'happy' :)
At any rate, I'll be most happy with PFE if the dependency checking
in ProjectBuilder remains as it is (i.e., the PFE is assumed to be a
#included file for the purposes of dependency checking). Again, PFE is a
major
improvement (as I'm sure you know better than most).
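The dependency treatment Tim asks for can be sketched as a Makefile fragment (the prefix.h name and rules are hypothetical, not ProjectBuilder's actual machinery): the precompiled prefix is listed as an ordinary prerequisite, so every object is rebuilt whenever it changes, exactly as if it were a #included file.

```make
# Sketch only; names are hypothetical.  The precomp is a normal
# prerequisite, so touching prefix.h invalidates every object.
prefix.h.gch: prefix.h
	gcc -x c-header $< -o $@

%.o: %.c prefix.h.gch
	gcc -include prefix.h -c $< -o $@
```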