Weird bit-field problem in egcs-1.1.1 and egcs-1.1.2pre2...?


I've got some code that does weird things if I change the base type of
a bit-field definition from "int" to "UWORD" (a typedef for unsigned
short).  The code is listed below; it was tested with egcs-1.1.1 under
Solaris 2.6 (sparc) and also with egcs-1.1.2-pre2:


If you change the "int"s in the struct below to "UWORD", you get a
different-sized structure.  I'm not sure whether this is a feature or
a bug in the compiler.  The same test was done with a beta copy of the
Green Hills C/C++ compiler, and it worked OK using the UWORD
definitions.  I believe either version should produce "4" when the
sizeof operator is applied to the structure, since the fields total
14 + 3 + 9 + 6 = 32 bits.  On our machine, sizeof returns "6" with
UWORD and "4" with "int".

I use this same kind of bit-field layout throughout some of my code
and it works OK there.  I know that field_1 and field_2 together take
17 bits, so field_2 technically straddles the first 16-bit word
boundary.  I guess my real question is whether this is a feature or a
bug?  Any comments are appreciated!
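
For illustration, here is a minimal sketch of the UWORD variant, under
the assumption (consistent with the observed "6") that the compiler
packs bit-fields into allocation units the size of the declared type
and never lets a field straddle a unit boundary.  The name
Header_uType is mine, not from the original code:

#include <stdio.h>

typedef unsigned short UWORD;           // Assumed 16-bit allocation unit

// Assumed layout in 16-bit units with no straddling:
//   unit 1: field_1 (14 bits); field_2 (3) won't fit (14+3 = 17 > 16)
//   unit 2: field_2 (3) + field_3 (9) = 12 bits; field_4 (6) won't fit
//   unit 3: field_4 (6 bits)
// Three 16-bit units -> sizeof == 6.  With "int", all 32 bits of
// fields fit in a single 32-bit unit -> sizeof == 4.
typedef struct
{
   UWORD                         field_1 : 14;
   UWORD                         field_2 : 3;
   UWORD                         field_3 : 9;
   UWORD                         field_4 : 6;
} Header_uType;

int main(void)
{
   (void)printf("Sizeof(Header_uType) = %lu\n",
                (unsigned long)sizeof(Header_uType));
   return 0;
}

For what it's worth, ISO C only guarantees bit-fields of type int,
signed int, and unsigned int; an unsigned short bit-field is an
extension, so the allocation unit it implies is up to the compiler.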

-- Rick

===============================================================================
#include <stdio.h>

typedef unsigned short UWORD;                   // Define an unsigned word

typedef struct
{
   int                           field_1 : 14;
   int                           field_2 : 3;
   int                           field_3 : 9;
   int                           field_4 : 6;
} Header_rType;

int main(void)
{
   Header_rType      l_hdr;

   // sizeof yields size_t, so cast it to match the printf format
   (void)printf("Sizeof(l_hdr) = %lu\n", (unsigned long)sizeof(l_hdr));
   return 0;
}

