This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.



Re: Ada front-end depends on signed overflow


> From: Robert Dewar <dewar@adacore.com>
> Paul Schlie wrote:
> 
>> - So technically as such semantics are undefined, attempting to track
>>   and identify such ambiguities is helpful; however the compiler should
>>   always optimize based on the true semantics of the target, which is
>>   what the undefined semantics truly enable (as pretending a target's
>>   semantics are different than the optimization assumptions, or forcing
>>   post-fact run-time trapping semantics, are both useless and potentially
>>   worse, inefficient and/or erroneous otherwise).
> 
> The first part of this is contentious, but arguable certainly (what is
> useful behavior). There is certainly no requirement that the semantics
> should match that of the target, especially since that's ill-defined
> anyway (for targets that have many different kinds of arithmetic
> instructions).

- I don't mean to contest the standard, which specifies that the
  behavior is undefined (however useless I perceive that to be), but
  merely to observe that most targets do in fact implement 2's
  complement modulo 2^N integer arithmetic; given that overflow
  behavior is undefined, it makes no sense to presume otherwise, as
  that behavior is both fully compliant and factually typical of
  most, if not nearly all, targets.

> The second part is wrong, it is clear that there are cases where
> the quality of code can be improved by really taking advantage of the
> undefinedness of integer overflow.

- As above; but to whom is it useful to compute an undefined result
  more efficiently, especially if the premise of the optimization is
  not factually consistent with the target's behavior? Such an
  optimization will surely produce an incorrectly predicted, and
  therefore likely computationally ambiguous and useless, result.

  Similar arguments have been given in support of an undefined order
  of evaluation, which is absurd: specifying a semantic order of
  evaluation only constrains the evaluation of expressions that
  would otherwise be ambiguous. Expressions that are insensitive to
  their order of evaluation may always be evaluated in any order,
  regardless of any specified semantic order, and yield the same
  result. In effect, defining an order of evaluation only
  disambiguates expression evaluation; it does not constrain the
  optimization of otherwise unambiguous expressions.
 


