This is the mail archive of the mailing list for the GCC project.


Re: Faster compilation speed

<<We should see any speed improvement as an opportunity to add
more functionality to the compiler without much changing the
speed increase the user expects to see. Even though for the time
being (and given the current state of gcc compared to the competition),
it looks like a lot of people just want to see the compiler go faster...

But remember that work you put in on speeding up the compiler is work
that you do not put in on improving the compiler. As time goes on, the
quality of generated code continues to be critical, while compiler speed
becomes less so.

<<Now, you may well be right in this case; you certainly know more
than I do. Are you sure, though, that the quality of the code generated
by these compilers was equal? I suppose so, but just asking...

Well, Philippe Kahn, in the keynote address at one big PC meeting, asked
the audience if they knew which compiler for any language on the PC
generated the best code for the popular sieve benchmark. He surprised
the audience by telling them it was Realia COBOL. Now I don't know if
the guys at Computer Associates have kept up, but certainly that data
point shows that fast compilers can generate efficient code.

<<File size is not the only parameter. Modern languages do more
complicated things than the average COBOL compiler deals with, I suppose...

You suppose dramatically wrong (it is amazing how little people know about
COBOL and how much they are willing to guess). Modern COBOL is an extremely
complex language, certainly more complex than Ada, and probably more complex
than C++.

The point is that GCC has a really terrible time if you throw a single
procedure with tens of thousands of lines of code in it at the compiler.
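You can observe this behavior for yourself. Here is a minimal sketch of a generator for that kind of pathological input: a single C function tens of thousands of lines long. The function name `big_proc` and the statement shape are illustrative assumptions, not taken from any real benchmark; the idea is just to produce a one-procedure file whose compile time you can then measure as the size grows.

```python
def make_big_function(n_statements: int) -> str:
    """Return C source for one function containing n_statements assignments."""
    lines = ["int big_proc(int x) {", "    int a = x;"]
    for i in range(n_statements):
        # Each statement depends on the previous one, so the optimizer
        # cannot simply delete the body as dead code.
        lines.append(f"    a = a * 3 + {i};")
    lines.append("    return a;")
    lines.append("}")
    return "\n".join(lines)

# Write a single procedure tens of thousands of lines long, then time
# the compile (e.g. `time gcc -O2 -c big.c`) for growing sizes.
with open("big.c", "w") as f:
    f.write(make_big_function(50_000))
```

Comparing the wall-clock time for, say, 5,000 versus 50,000 statements shows whether compile time grows linearly or worse with procedure size.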

<<At the same time, people are getting new machines and expect their
programs to compile faster... and that's not to mention that the "average
source code" (to be defined by someone ;-) ) is also probably growing
in size and complexity...

Actually, compilers have in general gotten slower with time (see my SIGPLAN
compiler tutorial of many years ago, where I talked about the spectacular
advances in the technology of slow compilers :-) ). Few modern compilers
can match Fastran on the IBM 7094.

<<And, it also depends on what the nine minutes you gained allow you
to do on your computer.... If the nine minutes can be used to do what
the average user considers to be a very important task, then nine
minutes is a lot !!!

Very little in practice. You do not rebuild a million-line system every
two minutes, after all, and in practice once the build time for a large
system is down in the ten-minute range, the gains from making it faster
diminish rapidly. This is not a guess; as I say, it is an observation
of market forces over a period of years in the competition between
Realia COBOL and Microfocus COBOL, where Realia always had a factor of
ten or more in compile speed to compete with.
