This is the mail archive of the
gcc@gcc.gnu.org
mailing list for the GCC project.
Pathological divides
- To: gcc at gcc dot gnu dot org
- Subject: Pathological divides
- From: kenner at vlsi1 dot ultra dot nyu dot edu (Richard Kenner)
- Date: Thu, 21 Sep 00 15:26:36 EDT
Consider the following program on x86:
#include <stdio.h>

int rem (int a, int b) { return a % b; }

int
main ()
{
  printf ("%d\n", rem (0x80000000, -1));
  return 0;
}
When run, rather than producing zero as expected, it gets a SIGFPE.
This is because dividing the largest negative integer by negative one
overflows: the x86 idiv instruction computes quotient and remainder
together, and the quotient 2^31 is not representable in a 32-bit int,
so the instruction traps.
So the first question is whether this is valid C behavior.
Next, compile the above with -O3 on an x86 and notice that GCC gets a
SIGFPE when constant-folding.
Finally, consider:
int
foo (int a, int b)
{
return (a - ((a == 0x80000000 && b == -1) ? 0 : a % b)) / b;
}
When passed "normal" arguments, this function does not overflow.
But GCC pulls the conditional out of the subtraction and division,
which makes the compiler run into the SIGFPE above.
I think the compiler crash needs to be fixed. We can do that either by
protecting the integer part of simplify_binary_operation against SIGFPE,
just as the FP part is protected, or by explicitly testing for this
case, just as we already check for division by zero.
Any thoughts about whether we need a run-time test for this case
in the "%" operator?