
RE: `quad' printf format specifier ?


On Mon, 8 Feb 1999, Guillermo A. Loyola wrote:

> From the list of gcc ports that never saw the light of day,
> Microunity had one that defined:
> 
>       long long : 128 bits
>            long : 64 bits
>             int : 64 bits
>           short : 32 bits
>     short short : 16 bits
>            char : 8 bits

OK, so an int is always the "native" machine's natural bit-length? I.e., on
IA32 it's 32 bits, and on some 64-bit architectures it's 64 bits?
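
Something like this throwaway sketch would show what a given compiler
actually uses (nothing here is specific to any port):

    /* Sketch: print the bit width of each integer type on whatever
       target this is compiled for. */
    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        printf("char:      %d bits\n", (int)(sizeof(char)  * CHAR_BIT));
        printf("short:     %d bits\n", (int)(sizeof(short) * CHAR_BIT));
        printf("int:       %d bits\n", (int)(sizeof(int)   * CHAR_BIT));
        printf("long:      %d bits\n", (int)(sizeof(long)  * CHAR_BIT));
        /* long long is a gcc extension, not (yet) standard C. */
        printf("long long: %d bits\n", (int)(sizeof(long long) * CHAR_BIT));
        return 0;
    }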

In one of my C++ books, it mentions that a short int is always 16 bits
and a long int is always 32 bits, but that the size of an int depends on
the architecture in use. Does this hold true for most C/C++
implementations?
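
At least the claim is easy to check at compile time; here's a rough
sketch (the typedef names are made up, and the negative array size just
forces a compile error if an assumption is wrong on a given target):

    /* Sketch: each typedef fails to compile if the assumed width
       is wrong for this target. */
    #include <limits.h>
    typedef char short_is_16_bits[(sizeof(short) * CHAR_BIT == 16) ? 1 : -1];
    typedef char long_is_32_bits [(sizeof(long)  * CHAR_BIT == 32) ? 1 : -1];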

And what will we do if 128-bit architectures come into wide use? Will we
have to use 'long long long'? Or 'bloody long int'? =)

I'm just concerned because I have some software that defines its own
datatypes (e.g. typedef short int uint16; typedef long int uint32), and
I'd hate to see those break on 64-bit or 128-bit systems.
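
What I'd probably have to do instead is pick each typedef by testing the
<limits.h> constants rather than hard-coding short/long; a minimal
sketch (the names are illustrative only):

    /* Sketch: select typedefs from <limits.h> constants instead of
       assuming the widths of short/long. */
    #include <limits.h>

    #if USHRT_MAX == 0xffff
    typedef unsigned short uint16;
    #else
    #error "no 16-bit unsigned type found"
    #endif

    #if UINT_MAX == 0xffffffff
    typedef unsigned int uint32;
    #elif ULONG_MAX == 0xffffffff
    typedef unsigned long uint32;
    #else
    #error "no 32-bit unsigned type found"
    #endif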

Is 'short short' really a valid C/C++ data type? And does it give you an
eight-bit type, assuming that 'short int' is 16 bits?

Cheers,
Alex.

---
I want 'long long long long' for 256-bit data elements!


