Jacks, gij and the GC

Boehm, Hans <hans.boehm@hp.com>
Thu Apr 22 18:29:00 GMT 2004


We should already be throwing OutOfMemoryError when we run out of heap
memory.  Whether it's possible to throw an exception once memory is
exhausted is probably platform-dependent.  But certainly if you use the
interpreter's -mx option or the GC_MAXIMUM_HEAP_SIZE environment
variable, this should work.
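
For example, something along these lines; sizes are in bytes, and I'm
going from memory on the exact gij spelling, so check the documentation:

    $ GC_MAXIMUM_HEAP_SIZE=67108864 gij MyClass
    $ gij -mx=67108864 MyClass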

(The exception is actually thrown by the handle_out_of_memory() callback
that is registered with the GC.  This avoids having to test the result
of every allocation for NULL.)
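
In outline, the registration looks something like this.  This is only a
sketch; the names are approximate and the init hook is hypothetical, so
don't take it as the literal libgcj code:

    #include <gc.h>
    #include <java/lang/OutOfMemoryError.h>

    // Pre-allocated at startup: we can't allocate the exception
    // object once the heap is already exhausted.
    static java::lang::OutOfMemoryError *no_memory;

    // The collector calls this instead of returning NULL from an
    // allocation, so ordinary allocation sites need no NULL checks.
    static void *handle_out_of_memory (size_t /* bytes_requested */)
    {
      throw no_memory;
    }

    void init_oom_handler ()    // hypothetical init hook
    {
      no_memory = new java::lang::OutOfMemoryError ();
      // Newer collectors spell this GC_set_oom_fn().
      GC_oom_fn = handle_out_of_memory;
    }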

I think gcj differs from Sun's JDK in several ways here:

1. The default is to not impose a hard heap limit, since the heap is
neither preallocated nor contiguous.  I think this is the right default
for the same reasons that it's the default for every other language
implementation on Linux.  It will sometimes need to be overridden.

2. It issues a warning message in addition to throwing the exception.
This is a minor bug.  The easiest way out is probably for libgcj to
register a warning procedure with the collector and do some filtering
on the warnings there (see the sketch right after this item).  We could
also add collector options, but the (admittedly clumsy) filtering
mechanism is already there.
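
Something like the following is what I have in mind.  Again a sketch:
the exact warning text matched here is an assumption, as is the init
hook, so check the collector sources before copying it:

    #include <gc.h>
    #include <cstdio>
    #include <cstring>

    // Drop the heap-exhaustion warning, since that condition is
    // already reported by throwing OutOfMemoryError; print anything
    // else the way the default warning procedure does.
    static void filtered_warn_proc (char *msg, GC_word arg)
    {
      if (std::strstr (msg, "Out of Memory"))
        return;
      // msg is a printf-style format taking one argument.
      std::fprintf (stderr, msg, arg);
    }

    void init_warn_filter ()    // hypothetical init hook
    {
      GC_set_warn_proc (filtered_warn_proc);
    }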

3. Gcj handles running out of stack space poorly.  We had a long discussion
about fixing this.  I think the conclusion was that correctly and
reliably throwing an exception in that case was very hard.  I think it
would probably require adding explicit tests to the generated code to
make sure that a minimum amount of space is left at function entry.  A
major problem is that JNI or CNI code is generally not prepared to recover
if it runs out of stack space.  On the other hand, I think we probably
could do better at reporting the problem before the process dies.
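
To make the idea concrete: the compiler would have to emit something
morally equivalent to the following at each method entry.  Everything
here is hypothetical, and it assumes a downward-growing stack with a
per-thread limit recorded by the runtime:

    #include <cstddef>
    #include <java/lang/StackOverflowError.h>

    // Hypothetical: the runtime would record this per thread when
    // the thread starts.
    static char *stack_limit;

    static const std::ptrdiff_t MIN_STACK_SLACK = 16 * 1024;

    void some_method ()
    {
      char probe;
      // The stack grows downward, so &probe - stack_limit is
      // (roughly) the space that is left.
      if (&probe - stack_limit < MIN_STACK_SLACK)
        throw new java::lang::StackOverflowError ();
      // ... method body ...
    }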

It's very hard to make any sort of out-of-memory handling 100% reliable.
And I suspect all Java implementations fail now and then.  On Linux,
you can fail with a SIGSEGV while trying to grow a thread stack or writing
to a COW page containing VM internal static data.  I suspect in both cases
all bets are off for pretty much all Java (and C, C++, Fortran, ...)
implementations.  For such reasons, I would generally prefer something
like GC_MAXIMUM_HEAP_SIZE or the interpreter -mx option to the limit
or ulimit command.

Hans

> -----Original Message-----
> From: java-owner@gcc.gnu.org 
> [mailto:java-owner@gcc.gnu.org] On Behalf Of
> Andrew Haley
> Sent: Thursday, April 22, 2004 5:31 AM
> To: Ranjit Mathew
> Cc: java@gcc.gnu.org
> Subject: Re: Jacks, gij and the GC
> 
> 
> Ranjit Mathew writes:
>  > Andrew Haley wrote:
>  > > Ranjit Mathew writes:
>  > >  > 
>  > >  > In a well-behaving JVM this should print out
>  > >  > "false" after some time.
>  > > 
>  > > Yes, but the amount of time that it might take is unbounded.
>  > > 
>  > >  > 1. Doesn't anyone else see this behaviour on Linux?
>  > > 
>  > > Yes.  Linux can protect against this possibility by using
>  > > "ulimit".
>  > > 
>  > > This is a longstanding gcj bug that can be fixed, but with some
>  > > difficulty.
>  > 
>  > Thanks a lot, but can you please elaborate on that?
>  > 
>  > I mean, is this a known behaviour of the Boehm-GC or
>  > some other problem within GCJ?
> 
> The default limit is "no limit".  So, your process expands to fill all
> virtual memory, and that necessarily causes a great deal of thrashing.
> While it's thrashing you can't do anything else.
> 
> For this to work well would require gcj to be changed so that when it
> runs out of memory we cleanly throw an exception.  This isn't
> impossible, but it is hard.
> 
> Andrew.
> 


