[Bug other/23469] Behaviour of built-in __signbitf(x) differs with optimization
ddneilson at gmail dot com
gcc-bugzilla@gcc.gnu.org
Sun Aug 21 20:28:00 GMT 2005
------- Additional Comments From ddneilson at gmail dot com 2005-08-21 20:28 -------
One of the really odd things that makes me still think this is a bug in some
way is that the definition of mysignbit() I gave in signbit.cpp is identical
to the definition of __signbitf(x) in bits/mathinline.h. Yet, when
optimizations are not turned on, the two functions have differing return
values: mysignbit() returns 1 iff the sign bit is set, whereas __signbitf()
returns 0x80000000 iff the sign bit is set.
Furthermore, when all documented -O1 flags are turned on manually (without
actually using -O1), the output of __signbitf(x) is the same as in the
unoptimized version; it differs only when one of the -On flags is used.
--
What       |Removed   |Added
----------------------------------------------------------------------------
Status     |RESOLVED  |UNCONFIRMED
Resolution |INVALID   |
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=23469