This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.



[Bug preprocessor/27195] hex and oct constants are converted to wrong type



------- Comment #5 from lukew at radterm dot com dot au  2006-10-26 06:45 -------
(In reply to comment #3)
> Subject: Re:  hex and oct constants are converted to
>  wrong type

>   The resulting tokens compose the controlling constant expression which 
>   is evaluated according to the rules of 6.6. For the purposes of this 
>   token conversion and evaluation, all signed integer types and all 
>   unsigned integer types act as if they have the same representation as, 
>   respectively, the types intmax_t and uintmax_t defined in the header 
>   <stdint.h>.142)
> 
>   142) Thus, on an implementation where INT_MAX is 0x7FFF and UINT_MAX is 
>   0xFFFF, the constant 0x8000 is signed and positive within a #if 
>   expression even though it would be unsigned in translation phase 7.
> 

I don't get it.

How can 0x8000 be signed AND positive when INT_MAX is 0x7FFF, i.e. when int is only 16 bits wide?


-- 


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=27195

