Compiling gcc with -mcmodel=large fails

Andrew Haley
Thu Jul 5 15:23:00 GMT 2012

On 07/05/2012 02:59 PM, Adrian Smith wrote:
> On 3 July 2012 17:28, Andrew Haley <> wrote:
>> [...]
>> It certainly looks like there are a lot of problems with the large
>> memory model.  These can be fixed, but right now I'm not sure I'd
>> use it.
>> Is it really really necessary to have such a large executable?
>> Andrew.
> OK thanks for the advice.
> "These can be fixed", do you mean, in the sense that, if I use the
> right options etc., I can get it to work?  Or do you mean, in the
> sense, that one would have to modify gcc in order to make it work?  I
> mean, as a software developer (but not familiar with the internals of
> gcc), is it something I can realistically do, or not?

Probably not.  I got some internal compiler errors when building the
runtime library for the large model.

> "Is it really really necessary to have such a large executable" Hehe
> that is the million dollar question. Basically I've already written an
> extensive program which generates code, and it works fine. The
> performance is also totally fine.
> Alas the following lines in the doc made me assume that, if the code
> was larger than 2GB, it wouldn't be a problem:
>    -mcmodel=large
>       Generate code for the large model: This model makes no
>       assumptions about addresses and sizes of sections.
> Perhaps one could add a note such as "(experimental)" or something? As
> it stands, one assumes that it will just work.

Good point.  I don't know if anyone is working on this.

> Although I will probably have no option other than to re-write the
> code to use a different strategy for code generation, it feels wrong,
> if I've got a computer with 128GB RAM, a 64-bit operating system, many
> cores, to be limited in this way.

Huge executables are a pretty bad idea if you look at the architecture.
It makes far more sense to split a program up into a bunch of shared
libraries.  It's not that hard to do.
