hektor monteiro x285 <hmonteiro@ctio.noao.edu> writes:
Hi everyone! I am working with an astrophysical code that uses a lot
of memory (RAM), basically because of a 3D matrix that I have to
define. I was working with 1 GB of RAM before and could run the code
with a cube of 70^3 cells. Recently I bought a new super machine with
4 GB of RAM, and I am using the gcc 3.2 compiler on Linux Red Hat 7.3,
but I am still not able to run more than 70^3 cells. Can anyone give me
a clue? I would think I should be able to do better, no? Is this a
Linux kernel limitation, or gcc, or something else?
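For a rough sense of scale: a 70^3 cube of doubles is only about 2.6 MB
per field, so the footprint grows as N^3 times whatever you store in each
cell. Here is a minimal sketch of sizing and allocating such a cube on the
heap, assuming one double per cell (your real code presumably stores more):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        size_t n = 200;                               /* cells per side; 70 was the old limit */
        size_t bytes = n * n * n * sizeof(double);    /* ~61 MB for 200^3 doubles */
        double *cube;

        /* One contiguous heap block for the whole cube,
           indexed as cube[(i*n + j)*n + k]. */
        cube = malloc(bytes);
        if (cube == NULL) {
            fprintf(stderr, "could not allocate %lu MB\n",
                    (unsigned long)(bytes / (1024 * 1024)));
            return 1;
        }
        printf("allocated %lu MB for a %lu^3 cube\n",
               (unsigned long)(bytes / (1024 * 1024)), (unsigned long)n);
        free(cube);
        return 0;
    }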
By default, a 32-bit Linux kernel only handles about 1 GB of physical
RAM. If you rebuild your kernel with high-memory support (CONFIG_HIGHMEM),
you can raise that to 4 GB, or up to 64 GB with PAE. I think this still
leaves each individual process limited to roughly 3 GB of virtual address
space, but at least other programs running at the same time will not be
biting into that.
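If you want to check whether the per-process limit is really what is
stopping you, a quick probe like this sketch keeps asking malloc for
blocks until it refuses. Because Linux overcommits, this measures the
virtual address space a single process can reserve, which is the number
that matters for a big in-core array (exact results depend on your kernel
build and ulimit settings):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        const size_t chunk = 64 * 1024 * 1024;   /* probe in 64 MB steps */
        unsigned long total_mb = 0;

        /* Keep allocating until malloc fails; the blocks are deliberately
           leaked since the process exits immediately afterwards. */
        while (malloc(chunk) != NULL)
            total_mb += 64;

        printf("this process could reserve about %lu MB\n", total_mb);
        return 0;
    }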