gcc 3.3: long long bug?
Lev Assinovsky
LAssinovsky@algorithm.aelita.com
Mon Apr 7 14:41:00 GMT 2003
100000000 * 1000000000 is calculated during compilation, right?
So the compiler is able to manage long long constants internally.
I guess this issue occurs because of the new gcc 3.3 parser,
where long long literals are not recognized properly unless
the "LL" suffix is present.
Maybe I am wrong. Maybe there are other explanations
for the different long long handling in 3.3 vs. 3.2.
----
Lev Assinovsky
Aelita Software Corporation
O&S Core Division, Programmer
ICQ# 165072909
> -----Original Message-----
> From: Andreas Schwab [mailto:schwab@suse.de]
> Sent: Monday, April 07, 2003 6:13 PM
> To: Lev Assinovsky
> Cc: A.R. Ashok Kumar; Eric Botcazou; gcc-help@gcc.gnu.org
> Subject: Re: gcc 3.3: long long bug?
>
>
> "Lev Assinovsky" <LAssinovsky@algorithm.aelita.com> writes:
>
> |> No I meant MSVC 6.2 on Windows.
> |> Yes, it works without any suffixes
> |> with 3.2 on Unix.
> |> Also this:
> |> const long long n = 100000000 * 1000000000;
> |>
> |> works with 3.3 on Unix either.
>
> Whatever you mean by "works". This invokes undefined behaviour if
> LONG_MAX < 100000000000000000, or if INT_MAX >= 1000000000 and
> INT_MAX < 100000000000000000. The limits of long long are
> irrelevant here.
>
> Andreas.
>
> --
> Andreas Schwab, SuSE Labs, schwab@suse.de
> SuSE Linux AG, Deutschherrnstr. 15-19, D-90429 Nürnberg
> Key fingerprint = 58CA 54C7 6D53 942B 1756 01D3 44D5 214B 8276 4ED5
> "And now for something completely different."
>