Help working out where GCC decides the size of 0xFFFFFFFF


Hi,

I am running GCC 5 on an SH-2 (7058) simulator that I wrote myself.
The simulator works well enough to (mostly) run GCC; however, I have
encountered what is probably a bug in my simulator that GCC triggers,
and I am looking for help working out where in GCC this bug is
triggered.

Running GCC as a cross compiler targeting my platform seems to
produce the correct results.

The bug is that when I run GCC natively on the simulated target, it
reports sizeof(0xFFFFFFFF) as 8 bytes, whereas when I run GCC as a
cross compiler targeting my simulator it correctly reports 4 bytes.
sizeof(0x7FFFFFFF) is reported correctly as 4 bytes by both.
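
For an unsuffixed hexadecimal constant, C gives the literal the first
of int, unsigned int, long, unsigned long, long long and unsigned
long long that can represent its value, so with 32-bit int and long
the expected type for 0xFFFFFFFF is unsigned int (4 bytes); a result
of 8 suggests the native compiler fell through to (unsigned) long
long. A minimal test along these lines shows the difference (just a
sketch; it assumes a hosted printf on the target):

  #include <stdio.h>

  int main(void)
  {
      /* Expected on a 32-bit int/long target: both lines print 4.
         Natively on my simulator the second line prints 8.        */
      printf("sizeof(0x7FFFFFFF) = %zu\n", sizeof(0x7FFFFFFF));
      printf("sizeof(0xFFFFFFFF) = %zu\n", sizeof(0xFFFFFFFF));
      return 0;
  }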

I am looking for help narrowing down where GCC decides the type (and
hence the size) of the 0xFFFFFFFF literal so I can investigate
further.
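
In case it helps narrow things down, a probe along these lines (just
a sketch; it relies on C11 _Generic, which GCC 5 accepts by default)
would show exactly which type the front end gave the literal:

  #include <stdio.h>

  /* Map the type of the controlling expression to a printable name. */
  #define TYPE_NAME(x) _Generic((x),            \
      int:                "int",                \
      unsigned int:       "unsigned int",       \
      long:               "long",               \
      unsigned long:      "unsigned long",      \
      long long:          "long long",          \
      unsigned long long: "unsigned long long", \
      default:            "other")

  int main(void)
  {
      /* Expected: "unsigned int" on a 32-bit target; "long long"
         or "unsigned long long" would confirm the mis-typing.     */
      printf("0xFFFFFFFF has type %s\n", TYPE_NAME(0xFFFFFFFF));
      return 0;
  }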

So any pointers as to where to start looking for the code in which
GCC determines the type of an integer literal would help me greatly.

Thanks in advance!

Alex

