[Bug middle-end/19606] wrong code for arith.expr: (((unsigned int)(signed int) a ) / 2LL) with signed char a=-4

steven at gcc dot gnu dot org <gcc-bugzilla@gcc.gnu.org>
Sat Feb 12 13:50:00 GMT 2005


------- Additional Comments From steven at gcc dot gnu dot org  2005-02-11 22:50 -------
And there was great rejoicing... 
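 
For reference, a self-contained testcase along the lines of the summary 
(my own sketch, assuming 32-bit int/unsigned int and 64-bit long long; 
the expected value 2147483646 follows from those widths): 
 
#include <stdio.h> 
 
signed char a = -4; 
 
int main (void) 
{ 
  long long r = (((unsigned int)(signed int) a) / 2LL); 
  /* (signed int) a    -> -4 
     (unsigned int) -4 -> 4294967292 
     4294967292 / 2LL  -> 2147483646  */ 
  printf ("%lld\n", r); 
  return r != 2147483646LL; 
} 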
 
In c-typeck.c: 
3253          ovalue = value; 
(gdb) p debug_generic_expr (value) 
(intD.0) aD.1454  // Good. 
(gdb) next 
3254          value = convert (type, value); 
(gdb) p debug_generic_expr (ovalue) 
(intD.0) aD.1454 
(gdb) next 
3257          if (TREE_CODE (value) == INTEGER_CST) 
(gdb) p debug_generic_expr (value) 
(unsigned intD.3) aD.1454 
(gdb) p debug_generic_expr(type) 
unsigned intD.3 
 
So "convert (<unsigned int>, <(int) a>)" results in "<(unsigned int) a>" which 
is wrong. 
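 
For concreteness (my own illustration, not taken from the dumps for this 
PR): in plain C the single cast still yields the same value for this 
operand, so the visible damage presumably shows up later.  Here is how the 
value of the full expression flips if the pass through unsigned int is 
effectively lost and a gets sign-extended straight to the wider signed 
type for the division by 2LL: 
 
  signed char a = -4; 
 
  /* conversion chain as written in the source */ 
  long long good = (unsigned int)(signed int) a;   /* 4294967292 */ 
 
  /* hypothetical mis-conversion: a sign-extended directly */ 
  long long bad  = (long long) a;                  /* -4 */ 
 
  /* good / 2LL == 2147483646,   bad / 2LL == -2 */ 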
 

-- 


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=19606