This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.



[Bug c/59128] I use #define to set ALPHA to a constant and then (for convenience) define ALPHA2 = ALPHA*ALPHA


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59128

Marc Glisse <glisse at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|UNCONFIRMED                 |RESOLVED
                 CC|                            |ktkachov at gcc dot gnu.org
         Resolution|---                         |INVALID

--- Comment #1 from ktkachov at gcc dot gnu.org ---
That's expected behaviour.
ALPHA2 expands to 10.*10.

With f47 equal to 100.0, f47/ALPHA2 is then 100.0 / 10.0 * 10.0

The * and / operators have the same precedence and associate left to right, so this is evaluated as
(100.0 / 10.0) * 10.0 = 100.0, not as 100.0 / (10.0 * 10.0) = 1.0

That's why it's usually good practice to put parentheses in your #defines:

#define ALPHA2 ((ALPHA) * (ALPHA))
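
For comparison, a minimal sketch (the _BAD/_GOOD macro names and the value of f47 are
hypothetical, chosen to match the numbers above) showing how the two forms evaluate:

#include <stdio.h>

#define ALPHA 10.
#define ALPHA2_BAD  ALPHA*ALPHA          /* expands textually to 10.*10. */
#define ALPHA2_GOOD ((ALPHA) * (ALPHA))  /* expands to ((10.) * (10.)) */

int main(void)
{
    double f47 = 100.0;
    printf("%f\n", f47/ALPHA2_BAD);   /* (100.0/10.)*10.  -> 100.000000 */
    printf("%f\n", f47/ALPHA2_GOOD);  /* 100.0/(10.*10.)  -> 1.000000   */
    return 0;
}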

--- Comment #2 from Marc Glisse <glisse at gcc dot gnu.org> ---
> #define ALPHA = 10.

No = there.

> #define ALPHA2 ALPHA*ALPHA

You forgot parentheses.

This has nothing to do with gcc. Look at the output of gcc ZED3.c -E and try to
understand why your code is wrong.
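
For illustration, running only the preprocessor makes the textual substitution visible. Assuming
the source contains a statement using f47/ALPHA2 (the surrounding code and variable name below are
hypothetical), the output of

gcc -E ZED3.c

would contain a line along the lines of

result = f47/10.*10.;

showing that ALPHA2 was substituted as the token sequence 10.*10., not as (10.*10.).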

