
Re: gcc 3.5 integration branch proposal


Gabriel Dos Reis wrote:

I suggest you spend some time in the bugzilla database, triaging bugs
and explaining to people who say that the compiler segfaulted when
compiling their programs -- and where you have determined that GCC was
consuming huge amounts of memory -- that their cases are marginal.
Until then, I guess we're just going through an empty discussion.

You are confusing apples and oranges.


What I was talking about here was increases in memory requirements
which mean that gcc no longer operates well on outdated "small RAM"
machines, so that programs which would once have compiled fine on such
machines no longer do.

It is of course a totally different matter if algorithms are introduced
which use absurd amounts of memory so that programs cannot even be
compiled on machines with a gigabyte of memory. There are most definitely such cases, and to me such cases are plain bugs. There is
really no excuse for a compiler requiring huge amounts of room. In fact
I don't really like the style of relying on virtual memory to guarantee
that huge data structures can be held in memory, but I am afraid we are
pretty much stuck with that.


It's somewhat the style in the C and C++ world to assume that each
individual compiled file should be small, and that it's not so terrible
if a compiler can't handle really large source files. I find this quite
unacceptable. For one thing, you can legitimately get huge files if they
are the output of code generators, and compilers should be able to handle such cases fine.


We certainly have bumped up against several cases in which gcc algorithms blew up horribly in space and time (the two usually go
together in these cases of serious performance bugs).


To put it concretely, it would not worry me if gcc could compile all sorts of giant programs comfortably in 256 megabytes, but was hopeless
on smaller machines. A machine with 256 megs is not what I would call
a "small RAM" machine.


Part of the trouble with GCC, it seems to me, is that there have never
been any concrete performance requirements for compile speed and memory use.


That's a matter of emphasis. In the case of Realia COBOL, for instance,
compile speed was a primary functional requirement, and space was
constrained to 640 kilobytes by the hardware. We could still compile
million-line files, and did so routinely to check that this kept working
(on a PC1 at 5 MHz, a million-line file could take a couple of hours
to compile, but we made sure it did not suddenly get worse than that).


