This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.



[Bug c/48874] Sign of zeros sometimes lost in literals


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=48874

--- Comment #1 from joseph at codesourcery dot com <joseph at codesourcery dot com> 2011-05-04 16:17:12 UTC ---
On Wed, 4 May 2011, jb at gcc dot gnu.org wrote:

> #include <stdio.h>
> #include <complex.h>
> 
> int main()
> {
>   double _Complex a = 0.0 + I*0.0;
>   double _Complex b = 0.0 - I*0.0;
>   double _Complex c = -0.0 + I*0.0;
>   double _Complex d = -0.0 - I*0.0;
>   printf("a= (%g,%g)\n", creal(a), cimag(a));
>   printf("b= (%g,%g)\n", creal(b), cimag(b));
>   printf("c= (%g,%g)\n", creal(c), cimag(c));
>   printf("d= (%g,%g)\n", creal(d), cimag(d));
> }
> 
> This program, compiled with "gcc zero1.c -O2 -pedantic -Wall -std=c99" (or
> -std=gnu99) prints
> 
> a= (0,0)
> b= (0,-0)
> c= (0,0)
> d= (-0,-0)
> 
> That is, the sign of the real part of "c" is lost. Adding
> -fdump-tree-original to the compile flags shows the dump as

That output appears correct to me.  Each initializer is a real+complex 
addition, and the sum of -0.0 and +0.0 is +0.0 except when rounding 
towards negative infinity.  So the real part of "c" is 
(-0.0) + (+0.0), which evaluates to +0.0, while the real part of "d" 
is (-0.0) - (+0.0), which evaluates to -0.0; only "c" loses the sign.
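
A minimal sketch (not from the original report) illustrating the 
rounding-mode dependence with fesetround from <fenv.h>; the volatile 
qualifiers are there only to keep the additions from being 
constant-folded at compile time, so the run-time rounding mode 
actually applies:

#include <fenv.h>
#include <math.h>
#include <stdio.h>

int main(void)
{
  /* volatile prevents compile-time folding of the sums below.  */
  volatile double neg_zero = -0.0, pos_zero = 0.0;

  /* Default round-to-nearest: (-0.0) + (+0.0) gives +0.0.  */
  double x = neg_zero + pos_zero;
  printf("to-nearest: %g (signbit %d)\n", x, signbit(x) ? 1 : 0);

  /* Rounding towards negative infinity: the same sum gives -0.0.  */
  fesetround(FE_DOWNWARD);
  double y = neg_zero + pos_zero;
  printf("downward:   %g (signbit %d)\n", y, signbit(y) ? 1 : 0);

  return 0;
}

Compiled with "gcc -std=c99 sketch.c -lm" (fesetround lives in libm 
on glibc), this should print "0" for the first sum and "-0" for the 
second.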

