Reducing compilation memory usage

Tom St Denis tstdenis@ellipticsemi.com
Sun Jan 20 19:08:00 GMT 2008


Andrew Haley wrote:
> Tony Wetmore wrote:
>> Alejandro Pulver wrote:
>>  >> RAM!  Of course we could make gcc more economical, and
>>  >> we could somewhat reduce memory usage, but you're asking
>>  >> for something really hard.
>>  >
>>  > I wasn't asking to change the program, I was just asking
>>  > if there is an already existing option.
>>
>> I think Andrew may have meant that you are asking GCC to do something 
>> really hard, to optimize a single function that is so large.  And 
>> asking the compiler to do something that hard has a price -- it 
>> requires lots of memory, as you have discovered.
>
> I was saying exactly that, thank you for clarifying.
>
> The thing to realize is that it is really hard to do a great job of 
> optimizing
> huge functions.  So, it's quite likely that gcc will do a better job of
> optimizing a bunch of small functions, with well-contained locality of 
> variables,
> than one huge function.
>
> OK, so you will have the overhead of a function call.  But on most 
> architectures
> that isn't enormous.

Dropping some Friday afternoon pair of cents ...

You shouldn't really have large functions unless they're machine-generated 
anyway.  And even then it's best to try to factor them as much as 
possible.  In one of my math libraries I have huge unrolled multipliers, 
for things like 1024x1024 bit multiplications (on a 32-bit platform that 
means 1024 MULADD macros).  Originally, I had all of the multipliers 
[different functions] in a single machine-generated .c file.  
Later I found GCC performs much better on the same code if I keep it to 
one huge function per file.

Anyways ... on code that's human-written, you should never really run 
into any of these sorts of limits.  Otherwise, you're not thinking about 
your design properly.

Tom



More information about the Gcc-help mailing list