This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.
[Bug tree-optimization/25985] [4.2 Regression] with optimization integer math fails
- From: "pinskia at gcc dot gnu dot org" <gcc-bugzilla at gcc dot gnu dot org>
- To: gcc-bugs at gcc dot gnu dot org
- Date: 15 Mar 2006 18:40:05 -0000
- Subject: [Bug tree-optimization/25985] [4.2 Regression] with optimization integer math fails
- References: <bug-25985-9034@http.gcc.gnu.org/bugzilla/>
- Reply-to: gcc-bugzilla at gcc dot gnu dot org
------- Comment #3 from pinskia at gcc dot gnu dot org 2006-03-15 18:40 -------
Hmm, if I change the function to be:
#include <stdio.h>
int main(void)
{
  int bits = 25;
  while (bits) {
    printf("bits=%d\n", bits);
    if (bits >= 8) {
      bits -= 8;
    } else {
      break;
    }
  }
  return 0;
}
and then compare 4.1 vs 4.2, I get the following observation:
4.1 changed the bits > 7 test to use the iv, while 4.2 changed the
bits != 0 test to use the iv.
--
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=25985