This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.



20040309-1.c vs overflow being undefined


If we look at this testcase, we have a function like:
int foo(unsigned short x)
{
  unsigned short y;
  y = x > 32767 ? x - 32768 : 0;
  return y;
}


x is promoted to a signed int by the front-end, because the constant
32768 has type signed int.  So when we pass 65535 to foo (as the testcase
does), we can end up with some large negative number for (signed int)x,
and then we subtract more from that number, which causes a signed overflow.

Does this sound right?  Should the testcase use -fwrapv, or should 32768 be
changed to 32768u?  (The same issue might also exist in SPEC and in gzip;
I have not checked yet.)

Thanks,
Andrew Pinski

