Compiling gcc with -mcmodel=large fails
Thu Jul 5 13:59:00 GMT 2012
On 3 July 2012 17:28, Andrew Haley <firstname.lastname@example.org> wrote:
> It certainly looks like there are a lot of problems with the large
> memory model. These can be fixed, but right now I'm not sure I'd
> use it.
> Is it really really necessary to have such a large executable?
OK thanks for the advice.
"These can be fixed" — do you mean that, with the right options etc.,
I can get it to work? Or do you mean that one would have to modify gcc
itself to make it work? As a software developer (though not familiar
with the internals of gcc), is that something I could realistically
do, or not?
"Is it really really necessary to have such a large executable?" Hehe,
that is the million-dollar question. Basically I've already written an
extensive program which generates code, and it works fine. The
performance is also totally fine.
Alas the following lines in the doc made me assume that, if the code
was larger than 2GB, it wouldn't be a problem:
    Generate code for the large model: This model makes no
    assumptions about addresses and sizes of sections.
Perhaps one could add a note such as "(experimental)" or something? As
it stands, one assumes that it will just work.
Although I will probably have no option other than to rewrite the code
generator to use a different strategy, it feels wrong to be limited in
this way on a machine with 128GB of RAM, a 64-bit operating system,
and many cores.