This is the mail archive of the
gcc@gcc.gnu.org
mailing list for the GCC project.
Re: Might a -native-semantics switch, forcing native target optimization semantics, be reasonable?
- From: Paul Schlie <schlie at comcast dot net>
- To: Mike Stump <mrs at apple dot com>
- Cc: GCC Development <gcc at gcc dot gnu dot org>
- Date: Mon, 02 Jan 2006 19:16:14 -0500
- Subject: Re: Might a -native-semantics switch, forcing native target optimization semantics, be reasonable?
> From: Mike Stump <mrs@apple.com>
>> On Dec 31, 2005, at 9:26 PM, Paul Schlie wrote:
>> be able define NULL as being some value other than 0.
>
> Do you have a specific chip in mind you want to do this for? Why
> would you want to do this? How many users would benefit from having
> done this?
- the AVR maps its register file starting at data address 0, as did the
old TI 99xx family CPUs (now defunct), and possibly others?
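To make the conflict concrete, here is a minimal host-side sketch (a hypothetical illustration, not actual avr-gcc target code) of why an all-zero null pointer is awkward when data address 0 names a real register:

```c
#include <stdint.h>

/* On the AVR, the general-purpose registers are memory-mapped starting
   at data address 0x00, so a pointer to r0 has the same all-zero bit
   pattern that C uses for the null pointer constant. */
volatile uint8_t *const r0 = (volatile uint8_t *)0x00;  /* r0 on AVR */

int looks_null(volatile uint8_t *p)
{
    /* This test cannot distinguish "no object" from "register r0". */
    return p == (void *)0;
}
```

A pointer that validly designates r0 would therefore compare equal to NULL, which is exactly why one might want the target to choose a different null representation.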
>> - enable the specification of arithmetic pointer and integer overflow
>> semantics, not limited to invoking an undefined or modulo results, as
>> being able to support saturating integer arithmetic optimization seems
>> increasingly attractive as signal processing becomes more pervasive.
>
> Yes, but you didn't answer my other two question. Anyway, what
> hardware does only saturating arithmetic? If it does both, would you
> want + to be saturating? If so, why? How would you then want to get
> non-saturating arithmetic?
>
> Saturating arithmetic is a good example of where the code should use
> a specialized form to denote the operation, and that form then makes
> the code completely portable, so, I cannot fathom why you'd want it
> in this class.
- as signed arithmetic overflow is undefined, it seems just as reasonable
to define an implementation which supports signed saturating arithmetic
as any other; arguably, signed integer saturating arithmetic may be
considered more arithmetically consistent than signed 2's-complement
modular arithmetic, and more similar to floating-point overflow
semantics than the alternatives. (not good or bad, just fact)
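As a sketch of what such target-defined overflow semantics could mean, here is a plain-C model of signed saturating addition (an illustration only; GCC provides nothing like this for ordinary int):

```c
#include <limits.h>

/* Signed saturating addition: instead of the wraparound of
   two's-complement hardware (or the undefined behavior the C standard
   assigns to signed overflow), results clamp to INT_MAX / INT_MIN,
   much as floating-point overflow pins to +/-infinity. */
int sat_add(int a, int b)
{
    if (a > 0 && b > INT_MAX - a)
        return INT_MAX;          /* positive overflow saturates */
    if (a < 0 && b < INT_MIN - a)
        return INT_MIN;          /* negative overflow saturates */
    return a + b;                /* no overflow: ordinary sum */
}
```

The overflow tests are themselves written without overflowing, so the function is well-defined for all inputs.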
> What compilers do this today? What code bases do this today?
>
> If none and none, why would we want to?
- the question shouldn't be who does this today, but ideally: if more
efficient, consistent, and behavior-preserving target-specific
optimization is perceived as being useful, then why shouldn't such
abilities be considered?
> We don't yet have a clue why you want this, could you give us the
> real reason. Theoretic beauty? You wanna sell a chip that does this
> and have a compiler for it? You want to define a new language
> because you think it'd be cool? You want gcc to match the needs of
> DSP programmers better?
- at the most basic level, I feel I've too often needlessly wasted time
debugging a program at one level of optimization, only to see different
behavior needlessly expressed at another level of optimization. (I
understand this means something isn't portable, but it isn't the correct
way to inform one of non-portable code; it is, however, one hell of a
way to unknowingly inject bugs into a program which didn't exist at a
different level of optimization.) However, if a compiler supported the
means by which a target could define the semantics left undefined by a
language, an optimizing compiler could then both satisfy the formal
constraints of the language and simultaneously enable target-specific
semantics to be supported, and preserved through optimization.
(which seems like a win-win to me)
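The kind of surprise described above can be reproduced with a classic fragment (a hypothetical example) whose observable behavior may legitimately differ between -O0 and -O2, because the wraparound it relies on is undefined for signed int:

```c
#include <limits.h>

/* This "overflow check" relies on two's-complement wraparound, which
   the C standard leaves undefined for signed int.  An unoptimized
   build typically wraps and returns 1 for x == INT_MAX; an optimizing
   build may fold x + 1 < x to "always false" and return 0 -- same
   source, different behavior at different optimization levels. */
int wrapped(int x)
{
    return x + 1 < x;   /* undefined when x == INT_MAX */
}
```

For any x below INT_MAX the expression is well-defined and false, so only the overflow case exposes the divergence.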
>> - enable the specification of the result/behavior of a shift greater
>> than the width of a operand
>
> This one I actually understand.
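For reference, a sketch of how code can give the out-of-range shift a defined result today: C leaves a shift by the operand's width or more undefined, and targets disagree (x86 masks the count, so 1u << 32 acts like 1u << 0, while other targets produce 0), so the portable fix is to spell out the wanted case:

```c
#include <limits.h>

/* Defined for any count n: the out-of-range case is given an explicit
   result rather than whatever the target's shifter happens to do. */
unsigned shift_by(unsigned v, unsigned n)
{
    return n >= CHAR_BIT * sizeof v ? 0u : v << n;
}
```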
>> - x[y] = 0;
>> if (x[y]) y = y+1;
>
> And how does this differ from the portable code in which x points to
> volatile data? If none, what are the advantages in being able to
> write non-portable code that leaves the volatile out over standard
> conforming code with the volatile in?
- I'm not trying to defend such code, but I believe that if it were
determinable that the dereference of x[y] were undefined for a particular
value of y, the only behavior I perceive as being reasonable to presume
is that which has been defined by the target when none is specifically
designated by the language; otherwise no optimization is valid unless
its effect is known not to be logically observable, in which case it may
be eliminated regardless of the specified semantics (where "undefined"
means the absence of any specification, including that explicitly
specified as being undefined).
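For comparison, the standard-conforming form Mike refers to: with a volatile-qualified element type, the compiler is already required to perform the re-read, so the test survives optimization portably (a minimal sketch with a hypothetical buffer):

```c
/* Hypothetical device-backed buffer; volatile forces every access to
   actually occur, so the if below cannot be folded away even though
   the element was just written with 0. */
volatile unsigned char x[16];

unsigned step(unsigned y)
{
    x[y] = 0;
    if (x[y])        /* the load must be performed and tested */
        y = y + 1;
    return y;
}
```

With volatile in place, no target-specific extension is needed to keep the dead-looking test alive.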