maximum memory for gcj-compiled executable?

Erik Poupaert erik.poupaert@chello.be
Tue Mar 25 14:30:00 GMT 2003


I've just found that JDBC hangs on to old ResultSets unless I close the
ResultSet, Statement, Connection, and so on. When I close the whole lot
explicitly, memory consumption grows much more slowly. I guess these data
structures are much harder for the collector to reclaim if I don't close
and null them explicitly.
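
Something like the following try/finally pattern keeps the cleanup
explicit. This is only a minimal sketch: the driver URL, credentials,
and query are made-up placeholders, but the close-in-reverse-order
idea is the point.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class QueryExample {
        public static void main(String[] args) throws Exception {
            Connection con = null;
            Statement stmt = null;
            ResultSet rs = null;
            try {
                // placeholder URL/user/password; any JDBC driver will do
                con = DriverManager.getConnection(
                        "jdbc:postgresql:mydb", "user", "password");
                stmt = con.createStatement();
                rs = stmt.executeQuery("SELECT name FROM customers");
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            } finally {
                // close in reverse order of creation;
                // swallow exceptions so one failed close doesn't skip the rest
                if (rs != null)   try { rs.close(); }   catch (SQLException e) {}
                if (stmt != null) try { stmt.close(); } catch (SQLException e) {}
                if (con != null)  try { con.close(); }  catch (SQLException e) {}
                // drop the references as well, so nothing keeps them reachable
                rs = null;
                stmt = null;
                con = null;
            }
        }
    }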

Limiting the heap size, with GC_MAXIMUM_HEAP_SIZE or even with ulimit, is
really nice to have, but it doesn't solve the fundamental problem. For
server processes, the fundamental problem is that the GC must aggressively
re-use memory instead of requesting additional memory, whenever possible.
It should also quickly release memory it no longer needs. Having a server
process crash under exceptional load because of these limitations, while
memory is still available, is not an attractive scenario.

There is still some slow residual growth in the memory consumption of my
server process that worries me. I wonder how long the process can run, and
how many requests it can handle, before new issues appear. Linux has a
reputation for running for years without rebooting ... what about GCJ
processes?
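
To keep an eye on that residual growth, one option is to log the heap
statistics the runtime itself reports. A minimal sketch (the one-minute
interval is arbitrary, and in a real server this would run as a daemon
thread rather than in main):

    public class HeapWatcher {
        public static void main(String[] args) throws InterruptedException {
            Runtime rt = Runtime.getRuntime();
            while (true) {
                // used = memory claimed from the OS minus what is still free
                long used = rt.totalMemory() - rt.freeMemory();
                System.err.println("heap used: " + used
                        + " bytes, total: " + rt.totalMemory() + " bytes");
                Thread.sleep(60 * 1000); // log once a minute
            }
        }
    }

Plotted over a few days of load, that should show whether the growth
levels off or keeps climbing.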


