This is the mail archive of the
gcc-patches@gcc.gnu.org
mailing list for the GCC project.
Re: [PATCH] Document arithmetic overflow semantics
- From: dewar at gnat dot com (Robert Dewar)
- To: dewar at gnat dot com, kenner at vlsi1 dot ultra dot nyu dot edu
- Cc: gcc-patches at gcc dot gnu dot org, gcc at gcc dot gnu dot org
- Date: Fri, 14 Feb 2003 09:43:38 -0500 (EST)
- Subject: Re: [PATCH] Document arithmetic overflow semantics
> I guess the question is what exactly "back propagate" means in practice. I
> think most people agree that doing so *explicitly* is a bad idea and doesn't
> produce any optimizations of correct programs in practice.
It is in fact almost impossible to precisely define, but in the realm of
optimizers, you have code that makes logical deductions from a set of
assumptions. For example, if you see
if a > 3 then
you create an implicit assertion that a is greater than 3 at the point of the
then statements. Subsequent passes assume this and can, e.g., remove a
division-by-zero check.
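A toy model of the deduction described above: a guard like "if a > 3" lets the optimizer record a range fact about a, and a later pass can use that fact to discharge a zero-divide check. The names (RangeFact, may_be_zero) are invented for illustration; this is a sketch of the reasoning, not how GCC's range propagation is actually structured.

```python
class RangeFact:
    """Known integer bounds for a variable at some program point."""
    def __init__(self, low=float("-inf"), high=float("inf")):
        self.low, self.high = low, high

    def assume_greater_than(self, k):
        """Refine the fact after passing a guard `var > k` (integers)."""
        return RangeFact(max(self.low, k + 1), self.high)

    def may_be_zero(self):
        return self.low <= 0 <= self.high

# Before the guard, nothing is known about `a`.
a = RangeFact()
assert a.may_be_zero()           # a zero-divide check is still needed

# Inside "if a > 3 then ...", the implicit assertion a > 3 holds.
a_then = a.assume_greater_than(3)
assert not a_then.may_be_zero()  # the division-by-zero check can go
```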
There is really no distinction between "forward" and "backward" propagation
at this level of logical operation. Many important optimizations, for example
removing dead code and dead assignments, result from what you might informally
regard as "back propagation":
x := 3;
... (x not mentioned)
x := 4;
The information from the second assignment back propagates to the x := 3
point and causes it to be deleted.
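This dead-store elimination can be sketched over a straight-line list of statements: a store to x is dead if x is stored again later with no intervening use, so information from the second assignment flows backward to delete the first. The statement encoding here is invented for illustration and is much simpler than any real compiler pass.

```python
def eliminate_dead_stores(stmts):
    """stmts: list of ('store', var) or ('use', var). Returns kept stmts."""
    kept = []
    pending = {}  # var -> index in `kept` of its most recent unused store
    for stmt in stmts:
        op, var = stmt
        if op == 'store':
            if var in pending:
                kept[pending[var]] = None  # earlier store is dead: delete it
            pending[var] = len(kept)
            kept.append(stmt)
        else:  # 'use'
            pending.pop(var, None)         # the last store to var is live
            kept.append(stmt)
    return [s for s in kept if s is not None]

prog = [('store', 'x'),   # x := 3  (dead: x is overwritten before any use)
        ('use', 'y'),     # ...     (x not mentioned)
        ('store', 'x')]   # x := 4
assert eliminate_dead_stores(prog) == [('use', 'y'), ('store', 'x')]
```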
If you allow the optimizer to take into account, as an assumption, that
certain behavior is undefined, then what we informally regard as "improper
back propagation" is hard to avoid. The meaning of two statements:
s1;
s2;
is represented semantically as the composition of two functions on states,
and if s2(s) yields undefined for all s, then most certainly s2(s1(s)) yields
undefined, and there goes the back propagation.
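The composition argument can be made concrete with a toy denotational model: statements are functions from states to states, with a distinguished UNDEFINED result that is absorbing under sequencing. If s2(s) is undefined for every state s, the composed program is undefined no matter what s1 computes, which is exactly the back propagation in question. The names here are invented for illustration.

```python
UNDEFINED = object()  # distinguished "undefined behavior" result

def compose(s1, s2):
    """Sequential composition `s1; s2` as a function on states."""
    def seq(state):
        out = s1(state)
        if out is UNDEFINED:
            return UNDEFINED
        return s2(out)
    return seq

s1 = lambda state: {**state, 'x': 3}   # a perfectly well-defined assignment
s2 = lambda state: UNDEFINED           # s2(s) yields undefined for all s

prog = compose(s1, s2)
# The sequence is undefined even though s1 alone was fine, so an optimizer
# reasoning this way could delete s1's effects entirely.
assert prog({'x': 0}) is UNDEFINED
```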
So it is indeed important to keep this under control. I actually think the
idea of using the Ada bounded error semantics for uninitialized variables
is exactly the right one. The Ada definition was carefully crafted with two
ideas in mind:
1. Don't cause any significant degradation of the code
2. Avoid nasty "back propagation" of undefined
And of course, at the programmer's semantic level, we get a definition of
uninitialized behavior that corresponds intuitively to what a programmer who
does not know about weird optimizers would expect.