This is the mail archive of the mailing list for the GCC project.

Index Nav: [Date Index] [Subject Index] [Author Index] [Thread Index]
Message Nav: [Date Prev] [Date Next] [Thread Prev] [Thread Next]
Other format: [Raw text]

RE: typeof and bitfields

> -----Original Message-----
> From: Andreas Schwab 
> Sent: 14 January 2005 16:36

> Note that it is implementation-defined whether bit-fields of
> type int are signed or unsigned.

  Wow.  I didn't know that; I thought it was only ever plain char whose
signedness could vary.  And had I known, I would have put an explicit 'signed'
qualifier on it anyway!

> > What is the range of values in a 1-bit signed int?  Is that 1 bit the
> > sign bit or the value field?
> It's one sign bit and zero value bits.
> > Can bar hold the values 0 and 1, or 0 and -1, or some other set?
> Depends on the representation: with two's complement it's -1 and 0, with
> sign/magnitude or one's complement it's 0 and -0.

  Aha!  I knew there was more to life than two's complement!

> > In a one-bit field, the two's-complement negation degenerates into the
> > identity - how can the concept of signed arithmetic retain any coherency
> > in this case?
> It's no different from -INT_MIN: you get an overflow.

  Fair point.

  Thanks to you and everyone else who replied.  It's nice to get a really
definitive answer to something you've always wondered about but never been
sure of...  :)

Can't think of a witty .sigline today....
