This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.
Re: Ada front-end depends on signed overflow
- From: Paul Schlie <schlie at comcast dot net>
- To: Florian Weimer <fw at deneb dot enyo dot de>
- Cc: Robert Dewar <dewar at adacore dot com>, Andrew Pinski <pinskia at physics dot uc dot edu>, GCC List <gcc at gcc dot gnu dot org>, <bosch at gnat dot com>
- Date: Tue, 07 Jun 2005 11:19:31 -0400
- Subject: Re: Ada front-end depends on signed overflow
> From: Florian Weimer <fw@deneb.enyo.de>
> * Paul Schlie:
>
>> - I'm not attempting to design a language, but just defend the statement
>> that I made earlier; which was in effect that I contest the assertion
>> that undefined evaluation semantics enable compilers to generate more
>> efficient useful code by enabling them to arbitrarily destructively alter
>> evaluation order of interdependent sub-expressions, and/or base the
>> optimizations on behaviors which are not representative of their target
>> machines.
>
> But the assertion is trivially true. If you impose fewer constraints
> on an implementation by leaving some cases undefined, it always has
> got more choices when generating code, and some choices might yield
> better code. So code generation never gets worse.
- Yes, it certainly enables an implementation to generate more efficient
code which has no required behavior; so in effect it produces more
efficient programs which don't reliably do anything in particular, which
doesn't seem particularly useful?
>> (With an exception being FP optimization, as FP is itself based
>> only on the approximate not absolute representation of values.)
>
> Actually, this is a very interesting example. You don't care about
> proper floating point arithmetic and are willing to sacrifice obvious
> behavior for a speed or code size gain. Others feel the same about
> signed integer arithmetic.
- Essentially yes; as FP is an approximate, not absolute, representation
of a value, it seems reasonable to accept optimizations which
may result in some least-significant bits of ambiguity.
Whereas integer operations are relied upon for state representations,
which in general must remain precisely and deterministically
calculated, as otherwise catastrophic semantic divergences may result.
(i.e. a single lsb divergence in an address calculation is not acceptable,
although a similar divergence in an FP value is likely harmless.)
>> The compiler should be able to statically determine if an
>> expression's operands are interdependent, by determining if any of
>> its operand's sub-expressions are themselves dependant on a variable
>> value potentially modifiable by any of the other operand's sub-
>> expressions.
>
> Phrased this way, you make a lot of code illegal. I doubt this is
> feasible.
- No, exactly the opposite: the definition of an order of evaluation
eliminates ambiguities; it does not prohibit anything other than the
compiler applying optimizations which would otherwise alter the meaning
of the specified expression.