gcc reports Internal Error

Stuart MacDonald stuartm@connecttech.com
Thu Jul 11 13:59:00 GMT 2002

Here's the error text:

gcc: Internal error: Terminated (program cpp0)
Please submit a full bug report.
See URL:http://bugzilla.redhat.com/bugzilla/ for instructions.

Here's the story:

A little while ago I needed to find out what all the predefined macros
were for a compiler (not gcc). After some web research I came across
a program called 'defines' that did a quick search. Its main technique
is to build a C file with probable macros, preprocess it, and see
what happens.
It occurred to me that a brute-force approach could be employed: just
build a file with all possible macros, preprocess it, and see what
happens.

I whipped up a little app that writes to standard out all possible
macros in the following format:

#ifdef A
char foo[] = "A"    = A;
#endif

starting at "A" and running to a specified end-length. So

# bfdefines 4

produces

#ifdef A
char foo[] = "A"    = A;
#endif
#ifdef B
char foo[] = "B"    = B;
#endif
...
#ifdef zzzy
char foo[] = "zzzy"    = zzzy;
#endif
#ifdef zzzz
char foo[] = "zzzz"    = zzzz;
#endif

This turns out to be a fair amount of data. Rather than write it all
to a file, I ran

# bfdefines 4 | gcc -E - > results

This is when the error happens. It seems (from watching sar) that gcc
runs out of memory as well.

I'm guessing that gcc has a built-in limit on how big a file it can
compile/preprocess, but nowhere can I find what that limit is. Is this
what is happening, or is this an actual bug in the compiler?

I did check the bugs at the URL referenced in the error text, but
nothing relevant turned up, and I think that gcc stuff should be
handled in gcc forums, not RedHat forums.

Why is gcc using up memory? The output isn't creating anything for the
preprocessor to store; it merely makes a large number of queries about
what's already stored.

A related question that I couldn't find an answer for: what's the
largest macro name size (in terms of string length) supported by gcc?

