This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.
[Bug preprocessor/27195] hex and oct constants are converted to wrong type
- From: "lukew at radterm dot com dot au" <gcc-bugzilla at gcc dot gnu dot org>
- To: gcc-bugs at gcc dot gnu dot org
- Date: 26 Oct 2006 06:45:48 -0000
- Subject: [Bug preprocessor/27195] hex and oct constants are converted to wrong type
- References: <bug-27195-7606@http.gcc.gnu.org/bugzilla/>
- Reply-to: gcc-bugzilla at gcc dot gnu dot org
------- Comment #5 from lukew at radterm dot com dot au 2006-10-26 06:45 -------
(In reply to comment #3)
> Subject: Re: hex and oct constants are converted to
> wrong type
> The resulting tokens compose the controlling constant expression which
> is evaluated according to the rules of 6.6. For the purposes of this
> token conversion and evaluation, all signed integer types and all
> unsigned integer types act as if they have the same representation as,
> respectively, the types intmax_t and uintmax_t defined in the header
> <stdint.h>.142)
>
> 142) Thus, on an implementation where INT_MAX is 0x7FFF and UINT_MAX is
> 0xFFFF, the constant 0x8000 is signed and positive within a #if
> expression even though it would be unsigned in translation phase 7.
>
I don't get it.
How can 0x8000 be signed AND positive when INT_MAX is only 0x7FFF (i.e. int is a 16-bit type)?
--
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=27195