Re: Problem with PFE approach [Was: Faster compilation speed]
- From: "Timothy J. Wood" <tjw at omnigroup dot com>
- To: Devang Patel <dpatel at apple dot com>
- Cc: Mike Stump <mrs at apple dot com>, gcc at gcc dot gnu dot org
- Date: Sat, 17 Aug 2002 15:30:56 -0700
- Subject: Re: Problem with PFE approach [Was: Faster compilation speed]
So, another problem with PFE that I've noticed after working with it
for a while...
If you put all your commonly used headers in a PFE, then changing any
of these headers causes the PFE header to be considered changed. And,
since this header is imported into every single file in your project,
you end up in a situation where changing any header causes the entire
project to be rebuilt. This is clearly not good for day-to-day
development.
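Roughly, the dependency logic ends up looking like this (a toy Python
sketch; all of the file names here are made up):

    # Toy model of make-style invalidation with a single monolithic PFE.
    # All file names are hypothetical.

    # The PFE bundles the project's commonly used headers.
    PFE_CONTENTS = {"Foundation.h", "AppKit.h", "OmniBase.h", "project-prefix.h"}

    # Every source file in the project is compiled against the PFE.
    SOURCES = ["Document.m", "Inspector.m", "Preferences.m"]

    def files_to_rebuild(changed_header: str) -> list[str]:
        # If the changed header is bundled into the PFE, the PFE itself is
        # out of date, and so is every file compiled against it.
        if changed_header in PFE_CONTENTS:
            return SOURCES[:]   # the entire project
        return []               # nothing is invalidated via the PFE

    print(files_to_rebuild("OmniBase.h"))
    # ['Document.m', 'Inspector.m', 'Preferences.m'], even for files that
    # never use anything declared in OmniBase.h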
A PCH approach that was automatic and didn't have a single monolithic
file would avoid the artificial tying together of all the headers in
the world and would thus lead to faster incremental builds due to fewer
files being rebuilt.
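For comparison, with per-header precompiled headers each source file
only depends on the headers it actually includes (again a toy sketch
with made-up names; transitive includes are ignored for brevity):

    INCLUDES = {
        "Document.m":    {"Foundation.h", "DocumentModel.h"},
        "Inspector.m":   {"Foundation.h", "AppKit.h"},
        "Preferences.m": {"Foundation.h", "AppKit.h", "Prefs.h"},
    }

    def files_to_rebuild(changed_header: str) -> list[str]:
        # Only sources whose own include set contains the changed header
        # are out of date.
        return [src for src, headers in INCLUDES.items()
                if changed_header in headers]

    print(files_to_rebuild("Prefs.h"))   # ['Preferences.m'], not the whole project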
Another approach that would work with a monolithic file would be some
sort of fact database that would allow the build system to decide early
on that the change in question didn't affect some subset of files.
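As a sketch of what such a fact database might record (all of the
declaration and file names here are invented), the build system could
note which declarations each translation unit actually uses and rebuild
only the files that reference a declaration the edit touched:

    FACTS = {
        "Document.m":    {"NSString", "OFDocument"},
        "Inspector.m":   {"NSView", "NSColor"},
        "Preferences.m": {"NSUserDefaults"},
    }

    def files_affected(changed_decls: set[str]) -> list[str]:
        # Skip translation units whose used-declaration set is disjoint
        # from the set of declarations the header change actually touched.
        return [src for src, used in FACTS.items() if used & changed_decls]

    # Suppose the header edit only changed the declaration of NSColor:
    print(files_affected({"NSColor"}))   # ['Inspector.m']; everything else is skipped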
-tim