Re: [patch 1/4] change specific int128 -> generic intN
- From: DJ Delorie <dj at redhat dot com>
- To: Bernd Schmidt <bernds at codesourcery dot com>
- Cc: ebotcazou at adacore dot com, gcc-patches at gcc dot gnu dot org
- Date: Thu, 3 Jul 2014 16:01:26 -0400
- Subject: Re: [patch 1/4] change specific int128 -> generic intN
- References: <201404142303 dot s3EN3ONP009938 at greed dot delorie dot com> <23409176 dot SSiGLCXs8E at polaris> <201407021457 dot s62EvTOm016332 at greed dot delorie dot com> <1920647 dot vUrbzv2NSg at polaris> <201407031612 dot s63GC2CM030078 at greed dot delorie dot com> <53B584ED dot 6050603 at codesourcery dot com>
> That's what'll need fixing then.
Can I change TYPE_SIZE to TYPE_SIZE_WITH_PADDING then, since it no
longer reflects the type's real size? Why do we have to round up a
type's size at all? Rounding up is a pointless assumption *unless*
you're allocating storage for the object, and in that case you want
TYPE_SIZE_UNIT anyway.
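To put numbers on that distinction (my own sketch in GCC's tree
accessors, not code from this patch; the values assume an msp430-style
20-bit intN type):

    /* t is the 20-bit integer type node, e.g. __int20.  */
    TYPE_PRECISION (t)   /* 20 -- the type's true width in bits      */
    TYPE_SIZE (t)        /* 32 -- bits, rounded up for storage       */
    TYPE_SIZE_UNIT (t)   /*  4 -- bytes, the same rounded-up amount  */

Only the first of these survives the rounding, and it exists for
types but not for decls.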
> I doubt there are too many places that require changing.
I wouldn't bet on that; I've been fighting these assumptions for
years.
> > Heck, most of gcc is oblivious to the idea that types might not be
> > powers-of-two in size. GCC doesn't even bother with a
> > DECL_PRECISION.
>
> Sure - why would you even need one?
Why do we need DECL_SIZE_UNIT (the size of the decl in bytes, rounded
up to a whole number of bytes) and DECL_SIZE (the size of the decl in
bits, likewise rounded up to a whole number of bytes), yet nothing
that says how big the decl *really is*?
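Spelled out (my illustration; DECL_PRECISION here is hypothetical --
no such accessor exists in GCC's tree.h):

    /* What we have today -- both rounded up to whole bytes:        */
    DECL_SIZE (d)        /* 32 for a 20-bit decl -- bits, padded    */
    DECL_SIZE_UNIT (d)   /*  4 for a 20-bit decl -- bytes, padded   */

    /* What's missing -- hypothetical, not in tree.h:               */
    DECL_PRECISION (d)   /* 20 -- how big the decl really is        */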
A pointer on MSP430 is 20 bits. All the general registers are 20
bits. Not 16, and not 24. 20. There's nothing in a decl that says
"I'm 20 bits", so such a decl inevitably ends up in SImode instead of
PSImode.
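The backend already declares this much (a simplified paraphrase of an
msp430-style configuration, from memory -- see msp430.h for the real
definitions):

    /* Pointers are 20 bits in the large memory model...  */
    #define POINTER_SIZE  (TARGET_LARGE ? 20 : 16)
    #define Pmode         (TARGET_LARGE ? PSImode : HImode)

    /* ...but a decl only carries its padded DECL_SIZE of 32 bits,
       so the middle end widens it to SImode anyway.  */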
> > It seems to work just fine in testing, and I'm trying to make it
> > non-fundamental.
>
> I also think this is not a very good idea.
Then please provide a "very good idea" for how to teach gcc about true
20-bit types in a system with 8-bit memory and 16-bit words.
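For reference, here is the user-visible shape of the problem (a
sketch that assumes an msp430 toolchain where the generic intN
machinery provides __int20; compile with -mlarge):

    #include <stdio.h>

    int main (void)
    {
      __int20 x = 0;
      /* The value is 20 bits wide, but every size the compiler can
         report is rounded up to whole bytes: 4 here, i.e. 32 bits.  */
      printf ("sizeof (__int20) = %zu\n", sizeof x);
      return (int) x;
    }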