This is the mail archive of the
gcc-bugs@gcc.gnu.org
mailing list for the GCC project.
[Bug other/23469] New: Behaviour of built-in __signbitf(x) differs with optimization
- From: "ddneilson at gmail dot com" <gcc-bugzilla at gcc dot gnu dot org>
- To: gcc-bugs at gcc dot gnu dot org
- Date: 18 Aug 2005 20:43:07 -0000
- Subject: [Bug other/23469] New: Behaviour of built-in __signbitf(x) differs with optimization
- Reply-to: gcc-bugzilla at gcc dot gnu dot org
The return value of the built-in signbit(x) macro (called with a float argument)
differs depending on whether optimizations are enabled.
When optimizations are off, the return value is 0x8000 or 0 (sign bit set and
not set, respectively).
When optimizations are on (-O1), the return value is 1 or 0 (sign bit set and
not set, respectively).
I'll attach a sample .cpp file, .ii file, and the output of -save-temps to this
report.
The sample .cpp file shows a simple example. The function mysignbit() is
defined exactly like the __signbitf(x) function in include/bits/mathinline.h,
yet it returns a different value than __signbitf(x) when compiling without
optimizations, and the same value when compiling with optimizations (-O1 and
above).
--
Summary: Behaviour of built-in __signbitf(x) differs with
optimization
Product: gcc
Version: 3.4.4
Status: UNCONFIRMED
Severity: normal
Priority: P2
Component: other
AssignedTo: unassigned at gcc dot gnu dot org
ReportedBy: ddneilson at gmail dot com
CC: gcc-bugs at gcc dot gnu dot org
GCC build triplet: i686-pc-linux-gnu
GCC host triplet: i686-pc-linux-gnu
GCC target triplet: i686-pc-linux-gnu
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=23469