This is the mail archive of the mailing list for the GCC project.

Index Nav: [Date Index] [Subject Index] [Author Index] [Thread Index]
Message Nav: [Date Prev] [Date Next] [Thread Prev] [Thread Next]
Other format: [Raw text]

Re: [tree-ssa PATCH] Pick memory consumption low hanging fruit

> >
> > So it seems that tree-pre is doing something that keeps a lot of
> > memory from being gc-ed.
> First, as I've pointed out, PRE doesn't do any ggc_allocs during PRE
> anymore; the only thing it ggc_allocs is a <100 byte structure (it's
> probably 20 or 40) to store some info about the expression.  It may
> inadvertently cause ggc_allocs because of how tree-ssa works (i.e.,
> resizing and clearing ggc_alloc'd varrays), but it's not in my code.
> :P
> Second, I tried it, and at -O2, I can't get memory to go above 250
> meg WITH PRE on.
> I can at -O3, with or without PRE.
> This is on Fedora Core 1.  Maybe something changed in glibc's memory
> allocation?
> I have no local changes save a two-line fix for Jeff Law's problem.
> My mem-report looks like this, *with* PRE, at -O2:

In more oddness, I updated the CVS tree (it was last updated when I
committed my last PRE patches) to today, and the numbers changed.

It now hits 330 meg, but the mem report doesn't show much difference (a
13 meg increase):
Size     Allocated       Used    Overhead
8           1856k        313k         41k
16          3032k        688k         44k
32            18M       2984k        202k
64            11M       9057k        106k
128         4096        1024          32
256         6960k       6535k         47k
512         4640k       4456k         31k
1024        8960k       8698k         61k
2048          12k       6144          84
4096        8192        8192          56
8192          24k         24k         84
16384         16k         16k         28
32768        320k        320k        280
65536         64k         64k         28
131072        384k        384k         84
262144        256k        256k         28
116           41M         29M        331k
24            32M       7329k        391k
12          4612k        508k         76k
40            10M       4025k        107k
Total        145M         74M       1443k

I really hope our GC isn't so variable that we range from a high-water
mark of 250 to 750 meg depending on random factors.

