


Re: Compiling gcc with -mcmodel=large fails


On 3 July 2012 17:28, Andrew Haley <aph@redhat.com> wrote:
> [...]
> It certainly looks like there are a lot of problems with the large
> memory model.  These can be fixed, but right now I'm not sure I'd
> use it.
>
> Is it really really necessary to have such a large executable?
>
> Andrew.

OK, thanks for the advice.

"These can be fixed", do you mean, in the sense that, if I use the
right options etc., I can get it to work? Or do you mean, in the
sense, that one would have to modify gcc in order to make it work? I
mean, as a software developer (but not familiar with the internals of
gcc), is it something I can realistically do, or not?

"Is it really really necessary to have such a large executable" Hehe
that is the million dollar question. Basically I've already written an
extensive program which generates code, and it works fine. The
performance is also totally fine.

Alas, the following lines in the documentation led me to assume that
code larger than 2GB would not be a problem:

   -mcmodel=large
      Generate code for the large model: This model makes no
      assumptions about addresses and sizes of sections.

Perhaps one could add a note such as "(experimental)" or something? As
it stands, one assumes that it will just work.
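
For what it's worth, here is a minimal sketch of the kind of program I
expected to just work (the file name is made up, and it uses a big
static array where my real program has generated code, but the 2GB
addressing limit is the same idea):

   /* bigdata.c -- hypothetical example: 3 GB of static data is more
      than the +/-2 GB range that the default x86-64 "small" code model
      assumes for statically allocated objects. */
   #include <stdio.h>

   static char big[3ULL * 1024 * 1024 * 1024];

   int main(void)
   {
       big[sizeof big - 1] = 42;            /* touch the far end */
       printf("%d\n", big[sizeof big - 1]);
       return 0;
   }

   $ gcc bigdata.c -o bigdata                  # typically fails to link:
                                               # "relocation truncated to fit"
   $ gcc -mcmodel=medium bigdata.c -o bigdata  # large data, code stays < 2GB
   $ gcc -mcmodel=large bigdata.c -o bigdata   # no assumptions about sizes

In my case it is the generated code (the text section) rather than the
data that goes past 2GB, so as far as I can tell only the large model
would apply.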

Although I will probably have no option but to rewrite the code to use
a different code-generation strategy, it feels wrong to be limited in
this way on a machine with 128GB of RAM, a 64-bit operating system, and
many cores.

Cheers, Adrian

