This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.
Re: [PATCH] Fix a fallout of PR14179 fix (3.3/3.4/4.0 regression)
- From: kenner at vlsi1 dot ultra dot nyu dot edu (Richard Kenner)
- To: roger at eyesopen dot com
- Cc: gcc-patches at gcc dot gnu dot org
- Date: Wed, 29 Dec 04 13:15:50 EST
- Subject: Re: [PATCH] Fix a fallout of PR14179 fix (3.3/3.4/4.0 regression)
> But that's also what int_fits_type_p is for, to allow the middle-end
> to detect the overflow itself, at those places where it's relevant,
> and respond accordingly.
Right, but that's what we're talking about.
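Concretely, I take that to mean something like the following sketch,
where a caller asks int_fits_type_p directly instead of trusting that
TREE_OVERFLOW was set when the constant was built (size_fits_type_p is
just an illustrative name, not an existing function):

static bool
size_fits_type_p (tree size, tree type)
{
  /* int_fits_type_p is only meaningful for integer constants.  */
  if (TREE_CODE (size) != INTEGER_CST)
    return false;

  /* True iff the constant's value is representable in TYPE, independent
     of any TREE_OVERFLOW flag already carried by SIZE.  */
  return int_fits_type_p (size, type) != 0;
}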
> The middle-end has no reason to set TREE_OVERFLOW for itself, which
> ignores it other than to disable transformations and optimizations
> that then have to be performed in the overflow-agnostic RTL optimizers.
I'm not sure what you're saying; we may well be agreeing.
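As I read the quoted paragraph, the disabling it describes amounts to a
guard of roughly this shape (simplify_if_no_overflow is hypothetical,
for illustration only):

static tree
simplify_if_no_overflow (tree expr)
{
  tree op = TREE_OPERAND (expr, 1);

  /* A constant operand carrying TREE_OVERFLOW disables the
     transformation; the tree is left for the RTL optimizers.  */
  if (TREE_CODE (op) == INTEGER_CST && TREE_OVERFLOW (op))
    return NULL_TREE;

  /* ... otherwise perform the simplification ... */
  return expr;
}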
The question in my mind is: how does the fact that a TYPE_SIZE has
overflowed get detected? Can the front end assume that if it overflowed,
TREE_OVERFLOW will have gotten set? Are the middle end and back end
allowed to consult TREE_OVERFLOW of a size when doing memory allocation?
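In code, that last question is whether a check of roughly this shape is
guaranteed to be meaningful (size_known_valid_p is a hypothetical name;
whether such a test can be relied upon is precisely what I'm asking):

static bool
size_known_valid_p (tree type)
{
  tree size = TYPE_SIZE (type);

  /* Allocation would be safe only if the size is a constant whose
     TREE_OVERFLOW flag is clear.  */
  return (size != NULL_TREE
          && TREE_CODE (size) == INTEGER_CST
          && !TREE_OVERFLOW (size));
}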