[Bug c/59128] I use #define to set ALPHA to a constant and then (for convenience) define ALPHA2 = ALPHA*ALPHA
- From: "glisse at gcc dot gnu.org" <gcc-bugzilla at gcc dot gnu dot org>
- To: gcc-bugs at gcc dot gnu dot org
- Date: Thu, 14 Nov 2013 11:00:42 +0000
- Subject: [Bug c/59128] I use #define to set ALPHA to a constant and then (for convenience) define ALPHA2 = ALPHA*ALPHA
- Auto-submitted: auto-generated
- References: <bug-59128-4 at http dot gcc dot gnu dot org/bugzilla/>
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59128
Marc Glisse <glisse at gcc dot gnu.org> changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|UNCONFIRMED                 |RESOLVED
                 CC|                            |ktkachov at gcc dot gnu.org
         Resolution|---                         |INVALID
--- Comment #1 from ktkachov at gcc dot gnu.org ---
That's expected behaviour.
ALPHA2 expands to 10.*10.
f47/ALPHA2 is then 100.0 / 10.0 * 10.0
The * and / operators have equal precedence and associate left to right, so this is
evaluated as (100.0 / 10.0) * 10.0 = 100.0, not as 100.0 / (10.0 * 10.0) = 1.0.
That's why it's usually good practice to put parentheses in your #defines:
#define ALPHA2 ((ALPHA) * (ALPHA))
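A minimal self-contained sketch of the difference (the ALPHA2_BAD/ALPHA2_GOOD names,
the main() wrapper and the value 100.0 for f47 are assumed here for illustration; only
ALPHA itself comes from the report):

#include <stdio.h>

#define ALPHA 10.
#define ALPHA2_BAD  ALPHA*ALPHA          /* expands to 10.*10. */
#define ALPHA2_GOOD ((ALPHA) * (ALPHA))  /* expands to ((10.) * (10.)) */

int main(void)
{
    double f47 = 100.0;
    printf("%f\n", f47/ALPHA2_BAD);   /* f47/10.*10.        -> prints 100.000000 */
    printf("%f\n", f47/ALPHA2_GOOD);  /* f47/((10.)*(10.))  -> prints 1.000000 */
    return 0;
}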
--- Comment #2 from Marc Glisse <glisse at gcc dot gnu.org> ---
> #define ALPHA = 10.
No = there.
> #define ALPHA2 ALPHA*ALPHA
You forgot parentheses.
This has nothing to do with gcc. Look at the output of gcc ZED3.c -E and try to
understand why your code is wrong.
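For example, assuming the (corrected) definitions from the report

#define ALPHA 10.
#define ALPHA2 ALPHA*ALPHA

a statement such as

x = f47/ALPHA2;

shows up in the preprocessed output (gcc -E) as

x = f47/10.*10.;

which makes it obvious that the division happens before the multiplication.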