Linux and aliasing?

Linus Torvalds torvalds@transmeta.com
Fri Jun 4 01:50:00 GMT 1999


On Fri, 4 Jun 1999, Joe Buck wrote:
> 
> Oh, don't worry, we expect you to complain, and in a rude and insulting
> matter at that.  We're used to it.  It seems that you were a nicer guy
> before you had so many worshippers.

Ehh, I wasn't exactly known for being polite even before. Why do you think
people still quote the flame wars I had about microkernels?

But point taken.

> Mark put in strict-aliasing because it is a big performance win.  Of
> course he cares about performance.  The ISO rules were written in the
> way they were written precisely to enable this performance improvement.

The ISO rules were not written to "enable" the performance improvement. 
They were written explicitly to _DISable_ it in a number of cases where
the optimization was known to break old code. And not everybody was all
that excited about the rules even when they were written. Understandably,
because they really aren't made to make sense - they are only made to give
at least _some_ way around aliasing issues. 

> A compiler will do better the more aliasing possibilities it can
> eliminate.  Mark used the ISO rules to determine what the set of
> eliminatable aliases is.  You want to change this set to a smaller set, so
> your programs will continue to work.

NO!

I want the _user_ to be able to give input. 

I have one _suggested_ option that to me has the huge advantage of not
really polluting the language, while being simple and obvious.

I would be happy with a #pragma, or with an attribute. People have been
talking about much more specialized attributes ("naked" etc) that are not
really useful to _any_ normal programs. The alias control feature would be
useful to real users - not just the kernel. At least judging by the
snippets of code I've seen. Code that breaks with the ANSI rules. 

I happen to think that the "explicit cast invalidates the alias
information" rule is the simplest and best one, and gives the user the
best control without adding things like new attributes or other language
extensions.

But the details of _how_ that control is achieved are much less important
than the fact that the programmer _should_ be in control.

>					  I understand that, I even
> sympathize.  But you seem blind to the fact that this will inevitably make
> some (possibly many) ISO-valid programs slower.

Did you read my post? I'm arguing against making it something we have no
control over.

I was even arguing for allowing _stricter_ aliasing than ANSI allows - the
"char *" thing in ANSI is actually really hard to code around (as far as I
know, the only way to do a one-byte access that still allows alias logic
to work in ANSI C is to do something really ridiculous like

	typedef struct {
		char c;
	} *one_byte_t;

in order to avoid the rule that any char access automatically means that
the compiler can't use the regular alias type rules).

> > I can see technical arguments. An argument of "it's really too painful to
> > do" I can understand (preferably with an explanation, but hey, I don't
> > mind getting told that it's too hard to explain). I use that argument
> > every day myself. 
> 
> See above, or find someone to submit working code (you would ask this
> in an equivalent situation on the kernel list).

Yes. I would ask the same.

But I do NOT use arguments like "that is undefined by POSIX" unless I have
a damn good reason to. I consider POSIX to be a guide to me, but I do not
consider it to be automatically correct (POSIX has made some major
blunders in its time: outright idiocies that simply could not be
implemented correctly on 64-bit architectures, for very simple technical
reasons, for example).

And I would not say "POSIX does not allow you to do that, so why should
you do it?"

> The standard is not arbitrary: it is the way it is for technical reasons,
> specifically to make C a suitable language for numerical computation.
> Without such rules serious number-crunchers have to switch to Fortran.

Look at the actual rules. Tell me that the "char *" rule makes sense.

The standard _is_ arbitrary. They tried to select a number of special
rules to make it UNLIKELY that old programs break. But the rules _were_
arbitrary. 

Note that "arbitrary" does not imply "random". There are reasons for the
rules. "char *" has historical issues associated with it. But there are
reasons for the extension I suggested too - and they aren't really any
different from the standard reasons.

"Arbitrary" means that you don't have any strong reason to choose one over
the other. So maybe you should allow the user some choice in the matter?

> > just want to get good code without having to do magic contortions, guys.
> 
> We could flip the default for the flag, so that people have to write
> -fstrict-aliasing to get the optimization.  Had we done that, you
> never would have noticed.

I would certainly have complained less, yes. Backwards compatibility is a
strong argument, and the way it is set up now just rubs everybody's nose in
the fact that the compiler behaviour changed - behaviour you could rely on
according to other (and equally valid) standards of the language; the
alias thing was not even a proposal when I started doing Linux. 

But I would have noticed - I don't think you realize quite how important
generated code quality is to me, and that I actually _am_ aware of the
standard even when I disagree with some of the details in it. I _like_
alias analysis. I just want to have better control over it, because I
happen to think that I can take _advantage_ of it. 

I dislike fascist compilers who think they know better than I do.

And I dislike people who think fascist compilers are a good idea.

			Linus



More information about the Gcc mailing list