gcc reports Internal Error

Neil Booth neil@daikokuya.co.uk
Sat Jul 13 02:16:00 GMT 2002

Zack Weinberg wrote:-

> On Fri, Jul 12, 2002 at 07:50:06AM +0100, Neil Booth wrote:
> > 
> > I've often wondered whether it might be worth reading chunks at a time,
> > or at least supporting this behaviour.
> > 
> > Suppose we read in a chunk of size N.  If we can find the last newline,
> > we can put a NUL after it (remembering what was there originally).
> > If we can't find a newline, keep reading chunks until we do.  Because
> > of the range of newline sequences we accept, it's best to replace
> > the first newline character in the final run of newline characters,
> > if you see what I mean.
> ...
> Right, that was roughly what I had imagined doing too.

On further reflection, I don't think it's a great idea.  It's not
multibyte safe, for example.

> But it occurs to me that the real reason we slurp the file all at once
> is not to simplify the lexer - it's because we want to cache it in
> memory in case we have to process it again.  Of course, that basically
> never happens for the primary source file, but I like the way we can
> use the same code for the primary source file and all #included
> files...

Maybe we're better off leaving the caching to the O/S?  I imagine the
number of headers that we want to process twice is quite small (2%?).


More information about the Gcc-bugs mailing list