[PATCH] DEFAULT_SIGNED_BITFIELDS Macro
Wed Jun 30 23:15:00 GMT 1999
Jeffrey A Law <email@example.com> writes:
> In message <19990604094348B.firstname.lastname@example.org> you write:
> The problem is the documentation is wrong about whether or not the signedness
> of a bitfield matters between .o files.
The documentation makes no such claim. Of course it matters.
The claim is that signedness of bitfields is not an ABI issue,
but a C language dialect issue. That is a more subtle and reasonable
point. The argument seems to be that which sign a bitfield has (when
not otherwise specified) is an issue of defining the *programming
language*, not the *architecture*, and hence is not a proper ABI issue.
> Consider the following test
> If foo.c and bar.c are compiled with different signedness of bitfields they
> will pass different values to printit. One will pass 0xfffffe00 and other
> 0x200. Clearly different and can cause a variety of problems if you're not
Well of course. Nobody has claimed otherwise.
> That's because the author of that passage is clueless in regards to the
> effect of bitfield signedness.
I very much doubt that.
Leaving aside the philosophical point of whether bitfield signedness is
an "ABI" issue or not, the real question is which default is the more
useful: gcc behaving consistently across all platforms, or all compilers
behaving consistently on a given platform.
I think one could make a good case that consistency across all platforms is
the more useful result - as long as fixincludes fixes header files containing
bitfields that native compilers would compile as unsigned.