Re: basic VRP min/max range overflow question


> From: Robert Dewar <dewar@adacore.com>
> Paul Schlie wrote:
>> The root of the concern being expressed is with respect to the compiler's
>> use of statically identified undefined behaviors as opportunities to invoke
>> alternative semantics which are easily identified as being inconsistent
>> with the target's native semantics, thus altering the logical behavior of
>> the program from what would otherwise have resulted (without any halting
>> solutions required).
> 
> You are still not understanding: these are NOT cases of "statically
> identified undefined behavior". Those are the trivial cases that are
> uninteresting.

OK, assuming I may misunderstand, the following is my understanding:

- presently, if GCC STATICALLY identifies a pointer dereference which
  precedes a null-pointer comparison, it presumes that the comparison may
  be optimized away, on the rationale that the preceding dereference would
  have trapped; and failing that, it rationalizes that the undefined
  behavior grants it permission to do whatever it wants, regardless of the
  fact that the deletion is only safe (logically equivalent) on targets
  which are known to trap null dereferences.
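
  For example, a minimal sketch of the pattern in question (the function
  name is hypothetical; the deletion described is roughly what GCC's
  -fdelete-null-pointer-checks, enabled by default on most targets,
  licenses):

    int f(int *p)
    {
        int v = *p;     /* dereference precedes the null test */
        if (p == 0)     /* may be deleted: if p were null, *p is presumed */
            return -1;  /* to have already trapped */
        return v;
    }

  On a target which does not trap loads from address 0, deleting the test
  silently changes the program's behavior.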

- given the following code fragment:

  int x, y;
  volatile int z;

  x = z ? 1 : 2;    // x == [1, 2]
  y = z + x;        // y == [INT_MIN+1, INT_MAX+2]

  if (x < 0)        // which may be safely optimized away, as
    something();    // [1, 2] is STATICALLY known to be never < 0.

  if (y == INT_MIN) // which may not be safely optimized away unless
    something();    // it is known that the target does not wrap overflow.

My position is simply that an optimization should never remove a specified
operation unless it is known to yield logically equivalent behavior;
producing a program which does not behave as specified is, to be polite,
not a good idea unless specifically requested. If the programmer hadn't
intended the code to be evaluated, it wouldn't have been specified; and
that holds regardless of the compiler's presumed license to do whatever it
pleases when encountering a STATICALLY identified potential undefined
behavior.


>> And candidly, regardless of this being technically allowed, it should be
>> obvious that any optimization which is likely to alter the behavior of a
>> program should never be invoked without explicit request, and ideally not
>> without diagnosis of the resulting alternative, possibly undesired and/or
>> fatal, behavior.
> 
> If the behavior is undefined, then it is undefined, and you cannot
> talk about a change in behavior. This is what non-deterministic semantics
> is about.

I guess I simply believe that optimizations should never alter the logical
behavior of a specified program relative to its un-optimized form unless
explicitly granted permission to do so; therefore such optimizations should
never be enabled at any level of optimization by default.

I personally believe that the unbounded liberty to arbitrarily alter the
logical evaluation of program statements/expressions, as distinct from those
which are considered undefined, without explicit permission is misguided.

It is likely more useful/important that optimizations preserve the behavior
of even non-portable programs than it is to produce a faster program which
may behave arbitrarily differently.

>> To be more clear, specifically as examples:
>> 
>> - As VRP relies on the static analysis of value ranges, primarily based
>> on embedded implicit and/or explicit constant values which enables the
> 
> It most certainly is NOT possible to statically identify situations that
> cause overflow. I cannot believe this is not clear to you.

Actually it is (or we may be talking about different things, see below).

But regardless, what's important is identifying those with predictable value
ranges, as they are the only ones which may be safely used for optimization,
i.e.:

Known no overflow:
 [1, 4] = [0, 1] + [1, 3];   // safe to use as a basis of VRP optimization.

Possible overflow:
 [?, ?] = [0, 1] + [0, INT_MAX]; // unsafe without OverFlow knowledge.

 ~[INT_MIN+1, -1] = [0, 1] + [0, INT_MAX]; // safe with 2's comp OvF.

Known overflow:
 [?, ?] = [INT_MAX-1, INT_MAX] + INT_MAX; // unsafe without OvF knowledge.

 [-3, -2] = [INT_MAX-1, INT_MAX] + INT_MAX; // known safe with 2's comp OvF
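
(To make the 2's comp results above concrete, here's a minimal
self-contained sketch; wrap_add is a hypothetical helper, and the cast back
to int is implementation-defined in C, though it yields the 2's complement
result on the targets being discussed:)

  #include <stdio.h>
  #include <limits.h>

  /* Wrapping signed addition, emulated via unsigned arithmetic
     (unsigned overflow is well-defined as modulo 2^N in C). */
  static int wrap_add(int a, int b)
  {
      return (int)((unsigned)a + (unsigned)b);
  }

  int main(void)
  {
      printf("%d\n", wrap_add(INT_MAX - 1, INT_MAX)); /* prints -3 */
      printf("%d\n", wrap_add(INT_MAX,     INT_MAX)); /* prints -2 */
      printf("%d\n", wrap_add(1,           INT_MAX)); /* prints INT_MIN */
      return 0;
  }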

(What undefined overflow semantics should really be utilized to provide is
target-specific overflow optimization based on a target's native overflow
semantics, which would be a whole lot more useful than pretending otherwise
and risking altering the program's otherwise un-optimized behavior.)
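
(GCC does already expose a knob along these lines: -fwrapv tells the
compiler to assume signed overflow wraps in 2's complement, which in turn
obliges VRP to keep tests like the y == INT_MIN one above. A sketch,
assuming the earlier fragment is rewritten as a function; whether a default
build actually folds the comparison is compiler-version dependent:)

  #include <limits.h>

  /* By default signed overflow is undefined, so GCC may assume
     y is in [INT_MIN+1, INT_MAX] and fold the comparison to 0.
     With -O2 -fwrapv, y may wrap to INT_MIN, so the comparison
     must be evaluated. */
  int reaches_int_min(int z)
  {
      int x = z ? 1 : 2;    /* x == [1, 2] */
      int y = z + x;        /* may overflow when z is near INT_MAX */
      return y == INT_MIN;
  }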

Again, just my opinion (as after all, undefined semantics enable the
compiler to do anything, so it might as well do something more useful than
not :)


