This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.



Re: Mainline space problem


On Tue, 2004-08-31 at 11:36, Daniel Berlin wrote:
> You honestly believe it is not much better to use half as much memory?
> That 2 gigabytes and 1 gigabyte aren't that different?

This is an oversimplification, and an insult.  Yes, of course, using
half the memory is better.  But it doesn't fix the underlying bug: we
still need more memory than is available for this testcase, and
mainline still uses far more memory than gcc-3.4.

Also, we need to keep in mind that any change to data structures may add
overhead that makes the compiler slower.  The speed of the compiler is
already a common complaint.

The underlying problem here, as I have said twice already, is that we
have too many pseudo regs.  Gcc-3.4 needs only about 9K pseudos;
mainline needs about 129K.  Because the conflict matrix grows
quadratically in the number of pseudos, that is roughly a 200 times
increase in memory usage (129K/9K is about 14, and 14 squared is about
200).  We can cut that in half, to about 100 times, by using a
lower-triangular bit matrix as you mention, but I think this is the
wrong approach, as it doesn't fix the underlying problem, which is too
many pseudos.  I believe we should instead be looking at why we have so
many pseudos for this testcase.

We can reduce the memory usage even further by using sparse matrices,
but that will add still more overhead.  If we replace array references
with function calls, then the common case will likely take a
performance hit.  It would eliminate the quadratic factor, though,
which would let us compile large functions.  So there is a trade-off
here: we can make the compiler slower in the common case in order to
increase the size of functions that we can compile.  It is far from
obvious that this is a good trade-off.

> What makes you think that the common case is the dense case?

This is a false assumption, and an insult.  I never said the common
case was dense.  What I said, in both previous messages, is that we
have too many pseudo regs.  In the common case the number of pseudos is
small enough that the quadratic memory allocation isn't a problem.

Obviously the matrix is sparse, particularly in this case.

Apparently you think I'm stupid.  I'm not.  I don't understand why you
feel the need to insult me so.
-- 
Jim Wilson, GNU Tools Support, http://www.SpecifixInc.com


