This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.


Re: gcc compile-time performance




--On Saturday, May 18, 2002 03:59:40 PM -0400 Robert Dewar <dewar@gnat.com> 
wrote:

> <<This is a common misconception.  The problem is that sometimes (often)
> the headers are the *vast* majority of the code in a single translation
> unit.  (Often, more than 95%).  In order to be conformant, you must
> not only lex and parse all of that code -- you must perform significant
> semantic analysis.
>>>
>
> Well of course that's true. It's true in Ada too.

Not in nearly the same way.  Remember that people can -- and do -- encode
Turing-complete programs in C++ headers via template metaprogramming, and
the compiler must execute those programs as it reads the header.  The
compiler must act as an interpreter.  That can be very expensive.
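
To give the flavor (a toy example, nothing taken from any real library):

    // A header containing only this forces the front end to carry out
    // the whole recursion during semantic analysis, before any "real"
    // code has been compiled.
    template <unsigned N>
    struct Factorial {
        enum { value = N * Factorial<N - 1>::value };
    };

    template <>
    struct Factorial<0> {
        enum { value = 1 };
    };

    // Merely mentioning Factorial<10>::value makes the compiler evaluate
    // the whole chain of instantiations at parse time.
    enum { f10 = Factorial<10>::value };   // 3628800

Scale that up to the recursion depths real template libraries reach and
the cost is anything but trivial.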

> the back end for almost all programs. Precompiled headers (in Ada
> precompiling specs) will typically only be able to help the front end
> time.

Not true; in C++, function bodies are present in the headers, and the
precompiled header may store the already-optimized form of the code.
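
For instance (a sketch, with made-up names), a typical template header
carries full function bodies:

    // vec.h: everything needed for inlining lives in the header, so the
    // front end must parse and analyze these bodies in every translation
    // unit that includes the file -- unless a precompiled header already
    // holds the processed (and possibly optimized) form.
    #ifndef VEC_H
    #define VEC_H

    template <typename T>
    class Vec3 {
    public:
        Vec3(T x, T y, T z) : x_(x), y_(y), z_(z) {}

        T dot(const Vec3& o) const {
            return x_ * o.x_ + y_ * o.y_ + z_ * o.z_;
        }

    private:
        T x_, y_, z_;
    };

    #endif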

> comments (perhaps half that without comments). The tree file (which is
> what would correspond to a precompiled header -- it contains a full
> symbol table and fully decorated semantic tree) is about 5 megabytes.

In C++, for real programs, I've seen tree images as large as 5 gigabytes.

> I do agree that if you don't have a really fast front end, then
> precompiled headers can be a big win, I am just not convinced that it is
> necessarily a win for a front end written for maximum speed.

As I've said, there are ways to get significant speedups out of G++; it's
not tuned for maximum speed.  On the other hand, I'm intimately familiar
with several C++ compilers, and in all of them the front end requires a
very significant amount of computation; all -- except G++ -- use
precompiled headers as a way to cut that down substantially.
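
The usual arrangement (file names here are made up) is to funnel every
translation unit through one common prefix header, so the compiler can
digest the bulk of the declarations once and reuse the saved result:

    // common.h: the one header every source file includes first.  A
    // compiler with precompiled-header support typically processes this
    // once, saves the resulting symbol table and trees, and reloads them
    // for each subsequent translation unit.
    #ifndef COMMON_H
    #define COMMON_H
    #include <vector>
    #include <map>
    #include <string>
    #include <algorithm>
    #endif

    // foo.cpp: starts with exactly the same include, so the saved image
    // applies; only the code after it is new work for the front end.
    #include "common.h"
    void foo() { std::vector<int> v; v.push_back(42); }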

I think you're just not familiar with the kinds of things people are doing
with C++.  POOMA is a good example; you may in the end get generated code
totaling only a few kilobytes: a bunch of loops doing little scientific
kernels.  But you parse and analyze and inline -- all in the front end --
literally hundreds of thousands of functions.
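
Roughly -- this is a crude sketch in my own notation, not POOMA's actual
machinery -- the expression-template style looks like this:

    #include <cstddef>

    // One node type per operator; an expression like b + c + d builds a
    // nested type such as Add<Add<Array, Array>, Array> at compile time.
    template <typename L, typename R>
    struct Add {
        const L& l;
        const R& r;
        Add(const L& a, const R& b) : l(a), r(b) {}
        double operator[](std::size_t i) const { return l[i] + r[i]; }
    };

    struct Array {
        double data[1000];
        double operator[](std::size_t i) const { return data[i]; }

        // Every distinct expression type instantiates another copy of
        // this assignment; the front end must parse, check, and inline
        // all of them to reach the one small loop the back end sees.
        template <typename E>
        Array& operator=(const E& e) {
            for (std::size_t i = 0; i < 1000; ++i) data[i] = e[i];
            return *this;
        }
    };

    // Deliberately left unconstrained here, just to keep the sketch short.
    template <typename L, typename R>
    Add<L, R> operator+(const L& l, const R& r) { return Add<L, R>(l, r); }

With that in place, a statement like a = b + c + d instantiates a distinct
Add<> type for every subexpression and a distinct assignment loop for the
expression as a whole; multiply by the number of operators and array types
in a real library and you quickly reach the function counts above.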

--
Mark Mitchell                   mark@codesourcery.com
CodeSourcery, LLC               http://www.codesourcery.com

