Thu May 17 20:36:00 GMT 2001
I was somewhat astonished to run out of space on my disk while
building GCC today.
I found that the libgcj build took up 7GB on my x86. (Yes, that's GB,
not MB.)
The cause is the following wonderful N^2 algorithm used to produce the
library:
ld -r -o .libs/libgcj.la-1.o .libs/prims.o
ld -r -o .libs/libgcj.la-2.o .libs/posix.o .libs/libgcj.la-1.o
ld -r -o .libs/libgcj.la-3.o .libs/jni.o .libs/libgcj.la-2.o
ld -r -o .libs/libgcj.la-4.o .libs/exception.o .libs/libgcj.la-3.o
ld -r -o .libs/libgcj.la-5.o .libs/resolve.o .libs/libgcj.la-4.o
ld -r -o .libs/libgcj.la-6.o .libs/defineclass.o .libs/libgcj.la-5.o
ld -r -o .libs/libgcj.la-7.o .libs/interpret.o .libs/libgcj.la-6.o
ld -r -o .libs/libgcj.la-8.o .libs/name-finder.o .libs/libgcj.la-7.o
ld -r -o .libs/libgcj.la-9.o gnu/gcj/convert/.libs/JIS0208_to_Unicode.o .libs/libgcj.la-8.o
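A back-of-the-envelope sketch of why this blows up, assuming 735 `ld -r`
steps and an average object size of 100K (the real sizes vary, so the
exact figure differs, but the quadratic shape is the point):

```shell
# Back-of-the-envelope only: intermediate file i holds all i objects
# linked so far, so keeping every intermediate costs ~N^2/2 objects
# worth of disk.  100K per object is an assumed average, not measured.
n=735           # number of incremental ld -r steps
kb=100          # assumed average object size, in KB
total_kb=$(( n * (n + 1) / 2 * kb ))
echo "disk for all intermediates: $(( total_kb / 1024 / 1024 )) GB"
```

So a build that should need well under 100MB of .o files wants tens of
gigabytes of scratch space if nothing removes the intermediates.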
Hmm. Do that 735 times with 100K files, and you've got yourself a real
humdinger of a problem.
It looks like this is some wackiness in libtool, presumably to deal
with the fact that otherwise the command line would be too long. If
we do the `rm' after each link, we will still be N^2 in time, but at
least not in space. It would be better if we could do more than one
file at a time.
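One possible shape for that — a hypothetical sketch, not libtool's
actual code; the batch size, directory, and part-file names are all
made up — is to feed objects to `ld -r` in fixed-size batches and then
combine the batch outputs once:

```shell
# Hypothetical sketch: link objects in batches of 20 per `ld -r`, then
# combine the batch results with one final link.  Commands are echoed
# rather than run, and the file names are stand-ins for illustration.
dir=$(mktemp -d)
for i in $(seq 1 45); do : > "$dir/obj$i.o"; done   # 45 stand-in objects

batch=20 count=0 n=0 args=""
for obj in "$dir"/*.o; do
  args="$args $obj"
  count=$((count + 1))
  if [ "$count" -eq "$batch" ]; then
    n=$((n + 1))
    echo ld -r -o "$dir/libgcj-part-$n.o" $args     # one full batch
    count=0 args=""
  fi
done
if [ "$count" -gt 0 ]; then                         # leftover partial batch
  n=$((n + 1))
  echo ld -r -o "$dir/libgcj-part-$n.o" $args
fi
echo ld -r -o libgcj.o "$dir"/libgcj-part-*.o       # one final combine
rm -rf "$dir"
```

Each object gets copied roughly twice (once into its batch, once into
the final link) instead of ~N/2 times, the command lines stay bounded
by the batch size, and the intermediates can be removed as soon as the
final link is done.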
There are systems where running out of disk like this causes crashes
and corruption, and we shouldn't be putting our users at risk when
they start the day with 7GB of free space. So, we need a fix for this
ASAP, or we're going to have to choose between violating the
no-hacking-libtool-in-the-gcc-tree rule and temporarily disabling Java
builds, neither of which is cool.
Mark Mitchell email@example.com
CodeSourcery, LLC http://www.codesourcery.com