[Bug tree-optimization/84416] internal compiler error: in int_cst_value, at tree.c:11089

rguenth at gcc dot gnu.org gcc-bugzilla@gcc.gnu.org
Fri Feb 16 09:39:00 GMT 2018


https://gcc.gnu.org/bugzilla/show_bug.cgi?id=84416

Richard Biener <rguenth at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|UNCONFIRMED                 |NEW
   Last reconfirmed|                            |2018-02-16
                 CC|                            |jsm28 at gcc dot gnu.org,
                   |                            |rguenth at gcc dot gnu.org
     Ever confirmed|0                           |1

--- Comment #1 from Richard Biener <rguenth at gcc dot gnu.org> ---
This is a very old, known issue with the representation of the
dependence-analysis lambda vectors.  It is triggered here because we
automatically use __int128 for the literal in
a[b - 18446744073709551607], which in this context really doesn't make
sense.
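
As a sketch of the kind of test case involved (the declarations of 'a'
and 'b' below are an assumption, not quoted from the PR), the unsuffixed
decimal constant does not fit in long long, so it gets type __int128 and
the whole index expression is computed in __int128:

  /* Hypothetical reproducer sketch; declarations are assumed.  */
  int a[1];
  unsigned long long b;

  void
  f (void)
  {
    /* 18446744073709551607 == 2**64 - 9; unsuffixed, the constant is
       given type __int128, so 'b' is converted to __int128 and the
       subtraction is carried out in that type.  */
    a[b] = a[b - 18446744073709551607];
  }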

Joseph, the C FE produces

  a[b] = a[(__int128) b + -0xfffffffffffffff7];

but shouldn't array indices be restricted to at most ptrdiff_t / size_t?
Is the FE the appropriate place to do that?
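
For reference, 18446744073709551607 is 0xfffffffffffffff7 (2**64 - 9),
which is why the gimple above shows the subtraction canonicalized as an
addition of -0xfffffffffffffff7.  Purely as an illustration of such a
restriction (an assumption, not a proposed fix), computing the index in
a 64-bit type on an LP64 target would reduce the constant modulo 2**64
to a small offset:

  /* Hypothetical equivalent with the index kept in a 64-bit type:
     subtracting 2**64 - 9 is, modulo 2**64, the same as adding 9.  */
  a[b] = a[(unsigned long long) b + 9];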
