This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.



Re: [PATCH] Fix a fallout of PR14179 fix (3.3/3.4/4.0 regression)


    But that's also what int_fits_type_p is for, to allow the middle-end
    to detect the overflow itself, at those places where it's relevant,
    and respond accordingly.  

Right, but that's what we're talking about.
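To make that concrete, here is a minimal sketch, against GCC's internal
tree API, of the on-demand check being described: the middle end asks
int_fits_type_p at the point where the answer matters instead of relying
on a flag set elsewhere.  fold_if_in_range is a hypothetical helper for
illustration, not an existing GCC function.

/* Hypothetical helper (not in GCC): convert CST to TYPE only when
   int_fits_type_p confirms the value is representable in TYPE, so
   overflow is detected at the point of use rather than read back
   from a sticky TREE_OVERFLOW flag.  */
static tree
fold_if_in_range (tree cst, tree type)
{
  if (TREE_CODE (cst) == INTEGER_CST && int_fits_type_p (cst, type))
    return fold_convert (type, cst);

  /* Out of range: return NULL_TREE and let the caller respond.  */
  return NULL_TREE;
}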

    The middle-end has no reason to set TREE_OVERFLOW for its own use: it
    ignores the flag except to disable transformations and optimizations,
    which then have to be performed in the overflow-agnostic RTL optimizers.

I'm not sure what you're saying; we may well be agreeing.

The question in my mind is how the fact that a TYPE_SIZE has
overflowed gets detected.  Can the front end assume that if it overflowed,
TREE_OVERFLOW will have been set?  Are the middle end and back end
allowed to consult TREE_OVERFLOW of a size when doing memory allocation?
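For the last question, a sketch of what consulting the flag would look
like, assuming the TYPE_SIZE of a constant-sized type is an INTEGER_CST
carrying TREE_OVERFLOW; type_size_overflowed_p is a hypothetical
predicate, not part of GCC.

/* Hypothetical predicate (not in GCC): true if TYPE's computed size
   overflowed.  TYPE_SIZE is the size in bits as a tree; for a
   constant-sized type it is an INTEGER_CST, and TREE_OVERFLOW on
   that constant records whether the size computation wrapped.  */
static bool
type_size_overflowed_p (tree type)
{
  tree size = TYPE_SIZE (type);
  return (size != NULL_TREE
          && TREE_CODE (size) == INTEGER_CST
          && TREE_OVERFLOW (size));
}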

