[Bug c/59128] New: I use #define to set ALPHA to a constant and then (for convenience) define ALPHA2 = ALPHA*ALPHA

jpmct01 at gmail dot com gcc-bugzilla@gcc.gnu.org
Thu Nov 14 10:46:00 GMT 2013


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59128

            Bug ID: 59128
           Summary: I use #define to set ALPHA to a constant and then (for
                    convenience) define ALPHA2 = ALPHA*ALPHA
           Product: gcc
           Version: 4.8.3
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: c
          Assignee: unassigned at gcc dot gnu.org
          Reporter: jpmct01 at gmail dot com

Created attachment 31215
  --> http://gcc.gnu.org/bugzilla/attachment.cgi?id=31215&action=edit
zip file containing code and results (with line ordered file)

I use #define as follows:

#define ALPHA 10.
#define ALPHA2 ALPHA*ALPHA


In my code I assign
f47 = ALPHA2;
which gives
ALPHA2 = 100.000000

but when I calculate f47/ALPHA2 I get the result
 f47/ALPHA2 = 100.000000

whereas when I use the form f47/(1.*ALPHA2) I get
 f47/(1.*ALPHA2) = 1.000000

See the attached code and results.
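
For reference, a minimal standalone sketch of the scenario above (the
attached zip is not reproduced here; the variable name f47 and the two
macros are taken from this report, everything else is illustrative):

#include <stdio.h>

#define ALPHA  10.
#define ALPHA2 ALPHA*ALPHA          /* expands textually, without parentheses */

int main(void)
{
    double f47 = ALPHA2;            /* 10.*10. = 100.0 */

    printf("ALPHA2 = %f\n", ALPHA2);
    /* f47/ALPHA2 expands to f47/10.*10., i.e. (f47/10.)*10. = 100.0 */
    printf("f47/ALPHA2 = %f\n", f47/ALPHA2);
    /* f47/(1.*ALPHA2) expands to f47/(1.*10.*10.) = 1.0 */
    printf("f47/(1.*ALPHA2) = %f\n", f47/(1.*ALPHA2));
    return 0;
}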


