This is the mail archive of the gcc-help@gcc.gnu.org mailing list for the GCC project.



Re: Alignment issue on x86_64?


Hi everyone,

[Off topic rant]

>> I may have missed something in the discussion, but be aware that longs
>> in x86 are 32-bit, but in x86_64 they are 64-bit.
> 
> *on most architectures

#define RANT

From my understanding of C (and C++, which inherits this from C), the data
types were originally intended to map to the architecture's natural sizes.

byte        --> char
half-word   --> short
word        --> int
double-word --> long

On a 32-bit architecture:
short 16-bit, int 32-bit, long 64-bit.

On a 36-bit architecture:
short 18-bit, int 36-bit, long 72-bit.

So on a 64-bit architecture, shouldn't that be:
short 32-bit, int 64-bit, long 128-bit?

But instead we have IL32/P64 (a.k.a. LLP64, where int and long are 32-bit and
pointers are 64-bit) and I32/LP64 (a.k.a. LP64, where long and pointers are
64-bit) on 64-bit architectures.  Just seems... incongruous.  Seems to me
C/C++ should have IP64/L128 on 64-bit architectures.

The whole thing sort of smells like a certain other platform whose WORD
typedef is 16-bit, even though that typedef is an anachronism on 32-bit and
64-bit architectures.  The same platform uses type-based Hungarian notation
such as dwParam, where the Param is no longer a dw, which rather undermines
the entire basis for using type-based Hungarian notation in the first place.
(It makes the notational convention just noise, since it cannot be trusted.)

#undef RANT

Sorry.  Pet peeve.

Ventilatory,
--Eljay

