[Bug ada/26348] [4.2 Regression] ICE compiling a-textio.adb at -O1 -ftree-vrp
rguenth at gcc dot gnu dot org
gcc-bugzilla@gcc.gnu.org
Tue Feb 28 17:29:00 GMT 2006
------- Comment #13 from rguenth at gcc dot gnu dot org 2006-02-28 17:21 -------
The following
-----------
Only the edge from block #14 is executable, so the result of the
PHI will be equivalent to the current range for last_15. The
range we record is [0, 0x7fffffff].
We then proceed to the uses of last_148, one of which is:
lastD.2483_86 = lastD.2483_148 + 1;
Now for the first "oddity". If we look at the underlying type
for last, we have a type "natural___XDLU_0__2147483647". What's
interesting about it is that it has a 32-bit precision, but the
min/max values only specify 31 bits, i.e. the min/max values
are 0 and 0x7fffffff.
So anyway, we proceed to add the constant 1 to the current range
for last_148, [0, 0x7fffffff]. This results in [1, 0x80000000].
Second oddity: this result is clearly outside the type's min/max
values, but because the value is still inside the type's
precision, no overflow is signaled and no bits are zeroed out.
-----------
sounds odd - adding 1 to the range [0, 0x7fffffff] for a type
with those min/max values should result in [1, 0x7fffffff],
not [1, 0x80000000]. Why is the resulting range not clipped to
the type's min/max values? I see no difference between this
"special" case, whose min/max span only 31 bits of a 32-bit
precision type, and any other "non-special" min/max range like
[5, 8]. But I must be missing something.
--
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=26348