maximum memory for gcj-compiled executable?

Boehm, Hans
Tue Apr 1 00:54:00 GMT 2003

There should not be any unbounded memory growth.  In my experience, the most common reasons for such growth are, roughly in decreasing frequency of occurrence:

1) Objects that remain unintentionally referenced, caches that are never cleaned out, or other data structures that really do grow without bound.  This can clearly happen either in client code or due to a bug in libgcj.  There are some tools in the garbage collector to help track these down, but some work is needed to make them usable with gcj.  There is nothing the collector can do about these other than providing tools for examining the heap and reachability within the heap.

2) Allocation of many very large objects.  You should be getting warnings out of the collector in this case.

3) Collector bugs.

4) Client data structures that grow without bound, but only a finite section of which is reachable at any point.  This issue is specific to conservative and certain other garbage collectors.  Linked queues are the canonical example.  (There are also tools in the collector to identify these, but again they're not yet usable from gcj.)  The solution usually is to clear link fields when you know that an object won't be needed further.  This is only necessary for a data structure that looks a lot like a linked queue.  (See my POPL 2002 paper for details.)

5) Too many pointers misidentified by the collector.  This is unlikely with gcj, since the collector only scans the stacks and static data conservatively.  (I'm not sure that unbounded heap growth is even possible this way with libgcj, though it's probably possible to get close enough that it doesn't matter.)
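A minimal sketch of case (1), with hypothetical names: a static cache that is never pruned keeps every entry strongly reachable, so no collector can reclaim them, and the heap grows with each distinct key.  The fix is for the client code to evict entries it no longer needs.

```java
import java.util.HashMap;
import java.util.Map;

public class CacheLeak {
    // Grows without bound unless entries are explicitly removed:
    // everything in a static map stays strongly reachable forever.
    private static final Map<String, byte[]> cache = new HashMap<>();

    static byte[] lookup(String key) {
        // Every miss inserts and permanently retains a new 1 KB buffer.
        return cache.computeIfAbsent(key, k -> new byte[1024]);
    }

    // One possible fix: drop entries once they are no longer needed.
    static void evict(String key) {
        cache.remove(key);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1000; i++) lookup("req-" + i);
        System.out.println(cache.size());   // 1000
        for (int i = 0; i < 1000; i++) evict("req-" + i);
        System.out.println(cache.size());   // 0
    }
}
```

No garbage collector, conservative or precise, can help here; from the collector's point of view the cached data is live.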
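The linked-queue pitfall in case (4) can be sketched as follows, assuming a hand-rolled singly linked queue (the names are illustrative, not from libgcj).  If a conservative collector misidentifies some value as a pointer to a dequeued node, it retains that node and, through its link field, everything the node still points at.  Clearing the link field on dequeue bounds the damage to a single node.

```java
public class LinkedQueue {
    static final class Node {
        Object value;
        Node next;
        Node(Object v) { value = v; }
    }

    private Node head, tail;

    void enqueue(Object v) {
        Node n = new Node(v);
        if (tail == null) { head = n; } else { tail.next = n; }
        tail = n;
    }

    // Assumes the queue is non-empty.
    Object dequeue() {
        Node n = head;
        head = n.next;
        if (head == null) tail = null;
        Object v = n.value;
        // Break the chain: a node retained by a misidentified pointer
        // now pins at most itself, not the rest of the queue.
        n.next = null;
        n.value = null;
        return v;
    }

    public static void main(String[] args) {
        LinkedQueue q = new LinkedQueue();
        q.enqueue("a"); q.enqueue("b"); q.enqueue("c");
        System.out.println(q.dequeue()); // a
        System.out.println(q.dequeue()); // b
    }
}
```

Without the two clearing assignments, one stale-looking bit pattern on the stack could keep the entire history of the queue reachable.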

By design, tracing garbage collectors do not reclaim memory until they believe that a certain fraction of the heap contains garbage.  Otherwise the garbage collection time overhead becomes excessive.  If you need to use 100% of the heap for application data, a tracing GC won't give you acceptable performance.  With the current libgcj and default parameters, you might hope for something like 60-70%.
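A back-of-the-envelope model (my own simplification, not libgcj's actual policy) shows why headroom matters: each full collection traces roughly the live data, but reclaims only the free fraction of the heap, so the tracing cost per allocated byte is about live / (heap - live), which diverges as occupancy approaches 100%.

```java
public class GcOverhead {
    // Relative tracing cost per allocated byte under the simplified
    // model: work ~ live data, yield ~ free space (heap - live).
    static double costPerByte(double heap, double live) {
        return live / (heap - live);
    }

    public static void main(String[] args) {
        for (double occ : new double[]{0.5, 0.7, 0.9, 0.99}) {
            System.out.printf("occupancy %.0f%% -> relative cost %.1f%n",
                              occ * 100, costPerByte(1.0, occ));
        }
    }
}
```

Under this model, 70% occupancy costs about 2.3 units of tracing work per byte allocated, 90% costs 9, and 99% costs 99, which is why running a tracing GC near a full heap is impractical.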


> -----Original Message-----
> From: Erik Poupaert []
> Sent: Tuesday, March 25, 2003 6:28 AM
> To: GCJ Java
> Subject: RE: maximum memory for gcj-compiled executable?
> I've just found that JDBC hangs on to old result sets unless I close the
> result set, statement, connection, and so on.  When I close the whole lot
> explicitly, the growth in memory consumption is much slower.  I guess these
> data structures are much more difficult to collect if I don't null them
> explicitly.
> Limiting heap size, with GC_MAXIMUM_HEAP_SIZE, or even with ulimit, is
> really nice to have, but it doesn't solve the fundamental problem.  For
> server processes, the fundamental problem is that the GC must aggressively
> re-use memory instead of requesting additional memory, whenever possible.
> It should also quickly release memory it doesn't need any longer.  Causing
> a server process under exceptional load to crash because of these
> limitations, while memory is still available, is not an attractive
> scenario.
> There is still some slow residual growth in the memory consumption of my
> server process that worries me.  I wonder how long the process can run,
> and how many requests it can handle, before there are new issues?  Linux
> has a reputation for running for years without rebooting ... what about
> GCJ processes?
