This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.



RE: signed is undefined and has been since 1992 (in GCC)


----Original Message----
>From: Olivier Galibert
>Sent: 28 June 2005 19:02

> On Tue, Jun 28, 2005 at 06:36:26PM +0100, Dave Korn wrote:
>>   It certainly wasn't meant to be.  It was meant to be a dispassionate
>> description of the state of facts.  Software that violates the C standard
>> just *is* "buggy" or "incorrect", and your personal pride has absolutely
>> nothing to do with it.
> 
> Then your definition of "incorrect" is uninteresting.  Per your
> definition, "use of implementation-defined behaviour is incorrect",

  Please don't put words in my mouth.  That isn't remotely what I said, and
if you are trying to paraphrase it, you have changed the meaning.  Undefined
behaviour is not the same thing as implementation-defined behaviour.

>>   If you re-read what *you* originally said, you made it look like you
>> were talking in abstract terms about software-in-general,
> 
> I said "A very large number of C and C++ programs".  That includes
> kernels, gnome, kde, lots of things.  Or if you want programs I
> work(ed) on, xemacs and mame.

  Well, then it counts stuff I've worked on as well.  And?

>> and that's certainly
>> what I was referring to when I replied; it's unreasonable of you to
>> point at that very generalised sentence and suddenly say "I was talking
>> about my own code, even though I hid the fact, and so you've insulted me
>> by disparaging it".
> 
> You disparaged probably around 99% of a typical linux distribution.

  I didn't disparage anything.  I described non-compliance with the standard
as representing anything on a scale from "blatantly buggy" (BTW, 'blatant'
means 'openly and visibly', it is not any kind of a pejorative term) to
"subtly incorrect", which seems to me a fair description of the sorts of
problems that can arise from disregarding the language spec.  It is only you
who is reading an emotional cast into this.

> Find one non-trivial program that doesn't assume that int is 32 bits.
> Find one of *your* programs that doesn't.

  Last one I wrote on my Amiga (all of them, in fact).  And?

>>   No number of correct assumptions about the sizes of various types or
>> the representation of NULL pointers will validate the incorrect
>> assumption that signed integer arithmetic could be made to wrap without
>> obliging the compiler to emit lousy code and miss an awful lot of
>> loop-optimisation opportunities.
> 
> Sure, and you'll notice I always special-cased the loop induction
> variables.  

  Yep, but at that point, your definition starts to look uninteresting,
because now it's starting to look like "We should be able to rely on signed
ints wrapping in all circumstances, except those in which they don't."  A
lot of this discussion has focussed on loop optimisations, but can you
guarantee those are the *only* optimisations which really benefit from
assuming signed ints don't wrap?  As far as I can see, it is reasonable for
the C standard to say that all signed integer overflow is undefined because
enumerating the circumstances in which it is or is not defined would be an
unbounded and poorly-defined task.  A language feature that sometimes works
and sometimes does not, with no easy way to know which you will get in any
given circumstance, is *not* a useful feature; it's a dangerous ambiguity -
worse than useless in a production environment.

>Maybe you should reread what I was replying to:
> 
> On Tue, Jun 28, 2005 at 08:57:20AM -0400, Robert Dewar wrote:
>> But the whole idea of hardware semantics is bogus, since you are
>> assuming some connection between C and the hardware which does not
>> exist. C is not an assembly language.
> 
> That is what I utterly disagree with.

  Well, I don't utterly _anything_ about either his position or yours.  C is
not just a high-level assembler; it has complex and abstract semantics
imposed on top of that.  It may have been reasonable to treat it that way
back in the very early K&R days, but the language has changed massively
since then.  I also
agree that reasoning in the utter abstract about languages is not
necessarily very useful in practice, but it is a perfectly reasonable way to
define a baseline against which it becomes possible to analyze the
similarities and differences of any real-world implementation.


    cheers,
      DaveK
-- 
Can't think of a witty .sigline today....
