This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.



Re: basic VRP min/max range overflow question


> From: Robert Dewar <dewar@adacore.com>
>> Paul Schlie wrote:
>>> From: "Joseph S. Myers" <joseph@codesourcery.com>
>>> "no requirements" means that *any* translation conforms in the case of
>>> undefined behavior.  Only those executions not involving undefined
>>> behavior have any requirements.
>> 
>> What delineates the bounds between undefined and non-undefined behaviors?
>> 
>> (As in the extreme if an undefined behavior may arbitrarily corrupt the
>> entire specified program state, and/or modify the languages otherwise
>> required semantics governing the translation/execution of a program, it
>> would seem that rather than attempting to utilize undefined behaviors as
>> a basis of optimizations, the compiler should more properly simply abort
>> compilation upon their detection, as the resulting program would be
>> otherwise be arguably useless for any likely purpose if the effect of an
>> undefined behavior within a program is not bounded?)
> 
> But of COURSE you can't detect these situations at compile time. Even
> if you had all the input in advance, this would be trivially equivalent
> to solving the halting problem. Programming language definitions reserve
> this use of undefined PRECISELY for those cases where it cannot be
> determined statically whether some rule in the dynamic semantic
> definition is or is not met.
> 
> When a compiler can determine that a given construct is sure to result
> in undefined behavior, e.g. it can prove at compile time that overflow
> will always occur, then indeed the best approach is to abort, or raise
> some kind of exception (depending on the language), and to generate a
> warning at compile time that this is going on. It CAN NOT "abort
> compilation", since this is not an error condition, it would be improper
> to refuse to compile the program. Besides which it would in practice
> be wrong, since the compiler may very well be able to tell that a given
> statement IF EXECUTED will cause trouble, but be unable to tell if in
> fact it will be executed (my password program is like this, a friendly
> compiler would warn that the reference to npasswords_entered (or whatever
> I called it) results in undefined behavior, and an attentive programmer
> who does not ignore warnings will deal with this warning before the
> program causes chaotic results.)

The root of the concern being expressed is with the compiler's use of
statically identified undefined behaviors as opportunities to invoke
alternative semantics which are easily identified as being inconsistent with
the target's native semantics, thereby altering the logical behavior of the
program from what would otherwise have resulted. (No solution to the halting
problem required.)

And candidly, regardless of whether this is technically allowed, it should
be obvious that any optimization likely to alter the behavior of a program
should never be invoked without explicit request, and ideally not without
diagnosis of the resulting alternative, possibly undesired and/or fatal,
behavior.

To be more clear, two specific examples:

- As VRP relies on static analysis of value ranges, primarily based on
embedded implicit and/or explicit constant values, it can detect both
bounded ranges and potential value-range overflows. This information can be
used to identify optimizations known to be safe as well as those known to be
potentially unsafe, given the easily identifiable true overflow behavior of
the target. It therefore seems clear that the most desirable default for
VRP-based optimizations would be to enable only those known to be safe
(i.e. those that neither alter the behavior of the resulting program nor
require solving the halting problem), and to enable, through an explicit
command-line switch, the presumption that no integer overflows will occur,
optionally with warning diagnostics at all locations where potential
overflow was identified and may result in differing program behavior.
(Again, no halting problem need be solved.)

- Correspondingly, as null-pointer-comparison optimizations are known to be
unsafe for targets that do not trap null-pointer dereferences, this
optimization should be enabled by default only for targets known to trap
(regardless of whether it is perceived to be "technically allowed"; it
should be obvious that any optimization which will identifiably alter
program behavior should never be invoked by default at any optimization
level, but only on explicit request, optionally with corresponding
diagnostics).

This is just my opinion, though possibly also the opinion of many who have
ever cursed the result of an optimization altering a program's logical
behavior (which I can't imagine ever being desirable in any circumstance:
optimizations should ideally eliminate logical redundancies, not alter
logical semantics, except on explicit request).


