This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.

Re: Ada front-end depends on signed overflow


Paul Schlie wrote:

  Similar arguments have been given in support of an undefined order of
  evaluation, which is absurd, as the specification of a semantic order
  of evaluation only constrains the evaluation of expressions which would
  otherwise be ambiguous. Expressions which are insensitive to their
  order of evaluation may always be evaluated in any order regardless of
  a specified semantic order of evaluation and yield the same result; so
  in effect, defining an order of evaluation only disambiguates expression
  evaluation, and does not constrain the optimization of otherwise
  unambiguous expressions.

I think perhaps you have not really looked at the issues that arise in optimizing compilers. Let me address this misconception first; similar considerations apply to the overflow case.

The point is that at compile time you cannot tell what expressions are
"ambiguous". I use quotes here since of course there is no ambiguity
involved:

suppose you write (a * f(b)) * (c * g(d))

where f(b) modifies c and g(d) modifies a. You would call this ambiguous,
but in fact it is not ambiguous at all: it has quite unambiguous
semantics, namely undefined. The notion of non-deterministic behavior is
of course quite familiar, and what undefined means here is that the
behavior is non-deterministic from among the set of all possible
behaviors.
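
To make this concrete, here is a small self-contained C sketch (the
names a, b, c, d, f, g follow the expression above; the particular
values are made up for illustration). Depending on which half of the
product the compiler evaluates first, you get a different answer:

   #include <stdio.h>

   int a = 2, c = 3;

   int f(int x) { c = 10; return x; }   /* side effect: modifies c */
   int g(int x) { a = 10; return x; }   /* side effect: modifies a */

   int main(void)
   {
     int b = 1, d = 1;
     /* If (a * f(b)) is evaluated first, g has not yet run, so the
        result is (2*1) * (10*1) = 20.  If (c * g(d)) is evaluated
        first, f has not yet run, so the result is (10*1) * (3*1) = 30.
        The language does not fix the evaluation order here, so
        either answer may be produced. */
     printf("%d\n", (a * f(b)) * (c * g(d)));
     return 0;
   }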

Now you seem to suggest that the optimizer should simply avoid
"optimizing" in such cases (where it matters). But the whole point
is that of course at compile time in general you can't tell whether
f(b) and g(d) have (horrible) side effects of modifying global
variables a and c. If we don't allow undefined behavior, the optimizer
would have to assume the worst, and refrain from optimizing the above
expression just in case some idiot had written side effects. The decision
of the standards committee is to make the side effect case undefined
PRECISELY so that the optimizer can go ahead and choose the optimal
order of evaluation without worrying about the side effect case.

Going back to the overflow case, consider:

   a = (some expression the compiler can tell is positive)
   b = (some expression the compiler can tell is positive)
   c = a + b;
   d = c / 8;

Here the compiler can do a right shift for the last statement. This
would of course be wrong in the general case (assuming c and d are int),
since the right shift does not behave like division if c is negative. But
the compiler is allowed to assume that a+b does not overflow, hence that
the result is positive, and hence that the shift is fine.
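
For concreteness, here is a small C program (made up for this message)
showing why the substitution is only safe for non-negative values: C
division truncates toward zero, while an arithmetic right shift rounds
toward minus infinity, so the two disagree on negative operands:

   #include <stdio.h>

   int main(void)
   {
     int pos = 20, neg = -20;
     /* Non-negative operand: division and shift agree (both 2). */
     printf("%d %d\n", pos / 8, pos >> 3);   /* prints: 2 2 */
     /* Negative operand: -20/8 truncates toward zero (-2), while
        -20 >> 3 (an arithmetic shift on typical machines; right
        shift of a negative int is implementation-defined in C)
        rounds toward minus infinity (-3). */
     printf("%d %d\n", neg / 8, neg >> 3);   /* prints: -2 -3 */
     return 0;
   }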

So you see that guaranteeing two's-complement wrap can hurt the quality
of the code in a case like this.

Now it is legitimate to argue about how much quality is hurt, and
whether the resulting non-determinism is worth the efficiency hit.
But to conduct this argument, you have to start off with an understanding
that there really is a trade-off and it is not clear what the decision
should be (C makes one decision, Java another, and Ada a third). The
three languages also differ in their handling of the evaluation of a+b:

C says that this is undefined if there are side effects that cause what you
call ambiguity.

Java specifies left to right evaluation (evaluate a before b always).

Ada specifies that the result is non-deterministic: you either evaluate
a before b or b before a (not some mixture of the two), so there may
be (at most) two possible results.
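
A made-up C fragment where the three rules visibly diverge (x, a, and b
are illustrative names here, with a and b as functions rather than the
variables used above):

   #include <stdio.h>

   int x = 0;

   int a(void) { x = x + 1;  return x; }   /* both calls touch x */
   int b(void) { x = x * 10; return x; }

   int main(void)
   {
     /* a() first: x becomes 1, then 10, so the sum is 1 + 10 = 11.
        b() first: x becomes 0, then 1,  so the sum is 0 + 1  = 1.
        Java requires 11 (left to right).  Ada allows 11 or 1, but
        nothing else.  In C the order of the two calls is not fixed
        (and with unsequenced side effects, e.g. (x=1)+(x=2), the
        behavior is undefined outright). */
     printf("%d\n", a() + b());
     return 0;
   }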

