This is the mail archive of the java@gcc.gnu.org mailing list for the Java project.



GC Trouble on Embedded System


Hello Everyone,

We are working on a port of GCJ/libgcj to the arm-wince-pe platform.  We
have the port working very well and it is now very close to being able
to run our application.

However, we have run into big problems with the Boehm GC.  We have
created a very small but very nasty test of garbage collection that
isolates the problem we're having in the application to a half-page of
very simple Java code.  This code is the GCTest class presented below.
What it does is make allocations of random length, keep references to
the allocations in a Vector, and release references as needed to keep
the grand total of live allocations below a command-line-defined
limit.  Here is the GCTest:

import java.util.Vector;

public class GCTest {

  public static int maxNumAllocations = 10000;
  public static int maxAllocationSize = 1000000;
  public static int maxTotalAllocation = 10000000;

  public static void main(String[] argv) {

    Vector allocations = new Vector();
    int    grandTotalAllocated = 0;

    if (argv.length > 0) {
      maxNumAllocations = Integer.parseInt(argv[0]);
      if (argv.length > 1) {
        maxAllocationSize = Integer.parseInt(argv[1]);
        if (argv.length > 2) maxTotalAllocation = Integer.parseInt(argv[2]);
      }
    }

    int numAllocations = 0;
    int totalAllocation = 0;
    while (numAllocations < maxNumAllocations) {
      // Pick a random size; it must be an int to be used as an array length.
      int allocationSize = (int) Math.round(Math.random() * maxAllocationSize);
      totalAllocation += allocationSize;
      while (totalAllocation > maxTotalAllocation) {
        int index = (int)(allocations.size() * Math.random());
        byte[] deallocation = (byte[])allocations.elementAt(index);
        totalAllocation -= deallocation.length;
        allocations.removeElementAt(index);
        System.out.println("Deallocated " + deallocation.length + " bytes, leaving " + totalAllocation + " bytes allocated in " + allocations.size() + " blocks");
      }
      byte[] allocation = new byte[allocationSize];
      allocations.addElement(allocation);
      numAllocations++;
      System.out.println("[" + numAllocations + "]: Allocated " + allocationSize + " bytes, for a running total of " + grandTotalAllocated + " bytes");
      grandTotalAllocated += allocationSize;
    }
  }
}
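
For reference, a plain desktop gcj build of the test would be the
following (our actual arm-wince-pe cross build uses additional flags
not shown here):

gcj --main=GCTest -o GCTest GCTest.java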

For the current discussion, I am running the test as:

GCTest 100000 1500000 6000000

The 3 arguments are:
1. number of allocations to perform (maxNumAllocations)
2. maximum size of an individual allocation (maxAllocationSize)
3. limit on the total of live allocations (maxTotalAllocation)

Running on Windows CE 3.0 on my iPAQ 3765, this test fails gracefully
after about 65K iterations with a java.lang.OutOfMemoryError.  Leading
up to this failure, I can see that the Java heap is growing without
bound.  Why would the above use of Java memory allocation cause
unbounded heap growth?  This seems wrong.
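
One way to watch the heap directly is the standard java.lang.Runtime
API.  A minimal sketch (the HeapWatch/printHeap names are just for
illustration; call printHeap once per iteration of the loop above):

public class HeapWatch {

  // Print the current heap size, free space, and used space.
  public static void printHeap(int iteration) {
    Runtime rt = Runtime.getRuntime();
    long total = rt.totalMemory();   // current size of the Java heap
    long free  = rt.freeMemory();    // unused space within that heap
    System.out.println("[" + iteration + "]: heap total=" + total +
                       " free=" + free + " used=" + (total - free));
  }
}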

On other platforms, notably XScale-based CE machines (instead of
StrongARM), the above test produces "prefetch aborts" and "data
aborts": low-level memory-management faults that kill the process or,
in many cases, crash the whole system.

We are embarking on a project to debug the Boehm GC with a view toward
making it robust enough to handle a large application like ours.  Any
ideas on how to proceed would be greatly appreciated.  The first
question is obviously: should the above GCTest run without heap
growth?  And if not, why would there be the kind of heap growth we're
experiencing?  Fragmentation?
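
If fragmentation is the culprit, one quick experiment would be a
fixed-size variant of the same loop: if the heap stays bounded when
every block is the same size, fragmentation in the non-moving
collector is strongly implicated.  A sketch (FixedSizeGCTest and its
constants are just illustrative):

import java.util.Vector;

public class FixedSizeGCTest {

  public static void main(String[] argv) {
    int blockSize     = 500000;  // every allocation is the same size
    int maxLiveBlocks = 12;      // roughly 6MB live, as in the run above
    Vector allocations = new Vector();

    for (int i = 0; i < 100000; i++) {
      // Evict a random block once the live set reaches the cap.
      if (allocations.size() >= maxLiveBlocks) {
        int index = (int)(allocations.size() * Math.random());
        allocations.removeElementAt(index);
      }
      allocations.addElement(new byte[blockSize]);
    }
    System.out.println("Completed 100000 fixed-size allocations");
  }
}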

TIA,
craig vanderborgh
voxware incorporated

