This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.


Re: gcc compile-time performance


> The slowness of gcc when compiling is why I haven't been writing anything
> using the gtkmm and gnomemm libraries (C++ GTK wrappers,
> http://gtkmm.sourceforge.net/).  It can take minutes to compile an
> example program on my (old) 400MHz PII. The library has a lot of template
> usage, so I suspect this is what causes the problems.

It is certainly apparent that a lot of the problems with speed come from
people using old, slow machines. After all, for $1000 you can get a machine
that will be four times faster than this 400MHz PII, and no amount of
technical work will speed up GCC by that factor.

But I quite understand that for many volunteer developers, who are not paid
for their time, the usual cost/benefit equation that immediately says such an
expenditure is worthwhile may not apply (certainly no serious commercial
users are going to mess with such machines).

I certainly agree that GCC is slow as compilers go, and in fact I find very
few compilers fast by the standards which I know are possible (from SPITBOL
and Realia COBOL). But in practice I think it will be difficult to make any
huge gains in speed here. Speed is something that has to be built in from
the start; it can't easily be retrofitted. Furthermore, the fundamental
approach of the GCC code generator is definitely not designed for maximum
speed of compilation.

In practice, machines do get faster more quickly than programs get bigger, so
I suspect that the complaints about speed will disappear over time. I know
that in the COBOL world, Realia COBOL was ten times faster than its competitor
(Microfocus), and when you compared 10,000 lines per minute on a PC1 with
1,000 lines per minute or less, that really made a difference; but by the time
you were comparing 1,000,000 lines per minute with 100,000, it was not such a
critical difference.

It certainly seems a good idea to me that any significant change to GCC should
involve comparative timings of compilation on some standard g++ test suite,
so that we know whether a given patch really slows things down. It is much
easier to catch such incremental pessimizations when they occur than after
the fact.
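
As a very rough illustration of what I mean (this is not an existing GCC
script; the compiler path, flags, and source file names below are just
placeholders), a small harness along these lines could be run against an
unpatched and a patched compiler and the totals compared:

#!/usr/bin/env python
# Rough sketch of a before/after compile-timing harness.
# The compiler, flags, and sources are placeholders for whatever
# reference test suite is agreed on.

import subprocess
import time

GXX = "g++"                           # compiler under test
FLAGS = ["-O2", "-c", "-o", "/dev/null"]
SOURCES = ["widget.cc", "parser.cc"]  # hypothetical reference test cases

def time_compile(source):
    # Wall-clock seconds to compile one translation unit.
    start = time.time()
    subprocess.check_call([GXX] + FLAGS + [source])
    return time.time() - start

if __name__ == "__main__":
    total = 0.0
    for src in SOURCES:
        elapsed = time_compile(src)
        total += elapsed
        print("%-15s %6.2f s" % (src, elapsed))
    print("%-15s %6.2f s" % ("total", total))

Averaging a few runs would smooth out timing noise, but even a single run per
build would be enough to flag a gross regression.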

I must say it is amazing how quickly machines get faster: my old notebook is
less than two years old and cost $6000 fully loaded at the time, and now no
one in the company is even vaguely interested in inheriting it; its value on
eBay seems to be about $300, if that :-)

