This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.


Re: [PATCH] Cleanup fold_rtx, 1/n


On Wednesday 13 April 2005 21:19, Roger Sayle wrote:
> A slightly better approach would be to include the dejagnu
> testsuite in the coverage analysis, as the preferred policy of
> adding a new testcase to trigger each transformation/optimization
> might help here.

We can do that kind of testing (I have merged coverage data for cse.c
from bootstraps on five different targets, and redoing that with the
test suite is just a matter of burning cycles), although I think you
are putting the bar quite high.  But that's only fair.
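
(If anyone wants to reproduce this: merging the per-target numbers is
trivial.  Below is a minimal Python sketch -- my own throwaway script,
and the default file names are made up -- that sums the per-line counts
from the cse.c.gcov files of several targets and prints the executable
lines that never ran on any of them.)

#!/usr/bin/env python3
# Minimal sketch: merge gcov output for one source file across targets and
# report the executable lines that were never hit anywhere.  The default
# file names below are invented -- point it at your own per-target
# cse.c.gcov files instead.
import sys
from collections import defaultdict

def read_gcov(path):
    """Return {line_number: execution_count} for executable lines only."""
    counts = {}
    with open(path) as f:
        for line in f:
            try:
                count, lineno, _src = line.split(":", 2)
                lineno = int(lineno)
            except ValueError:
                continue            # branch/function summary lines etc.
            count = count.strip()
            if lineno == 0 or count == "-":
                continue            # preamble or non-executable line
            if count in ("#####", "====="):
                counts[lineno] = 0  # executable, but never executed
            else:
                try:
                    counts[lineno] = int(count.rstrip("*"))
                except ValueError:
                    continue
    return counts

totals = defaultdict(int)
paths = sys.argv[1:] or ["i686/cse.c.gcov", "ppc/cse.c.gcov", "s390/cse.c.gcov"]
for path in paths:
    for lineno, count in read_gcov(path).items():
        totals[lineno] += count

dead = sorted(l for l, c in totals.items() if c == 0)
print("executable lines never hit on any target:", dead)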

> Of course, this policy isn't rigorously enforced, 

That is not a problem that someone willing to do the dirty cleanup work
can do anything about.  IMHO it is just unfair to make that a problem
now for people who have good arguments and sufficient evidence to show
that a piece of code is almost certainly dead.

If there is reason to believe some code is dead, and it does not trigger,
or almost never triggers, in the test suite (on, say, three targets, to
raise the bar even further but still keep it doable?), then we should be
able to say that the code is really dead.

Otherwise we might as well stop right now and just not remove anything
in the RTL path.  Falsifying the assertion "this code is not dead" is
impossible ;-)

> I'd still be uncomfortable removing code based simply on the
> fact that we couldn't get it to trigger.

So you're saying there is no such thing as "enough evidence" for you
to feel comfortable about removing subsumed optimizations?

We're doomed :-)

Why is removing $random_hack so much harder than adding $random_hack?
Because it is there and has been there for such a long time?  That is
not a valid reason.  The code has to be there for a reason.  If nobody
knows or can show why the code is there, and it is safe to clean it up,
then by all means, let's clean it up!

> My personal preference remains that these RTL optimizations should
> be moved from fold_rtx to simplify_rtx when possible,

Of course.  I think Paolo did a pretty good job at that already.
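
(Just to make the distinction concrete: a simplify_rtx-style
transformation only needs the expression in front of it, while the folds
that really belong in cse.c also need the table of equivalences the pass
has built up.  A toy sketch -- Python, invented names and representation,
nothing to do with the real code:)

def simplify(expr):
    """Context-free: (plus, 2, 3) -> 5, no matter where the expression sits."""
    op, a, b = expr
    if op == "plus" and isinstance(a, int) and isinstance(b, int):
        return a + b
    return expr

def fold_with_equivalences(expr, equiv):
    """CSE-style: substitute known-equal operands first, then simplify."""
    op, a, b = expr
    return simplify((op, equiv.get(a, a), equiv.get(b, b)))

equiv = {"r10": 2}   # the pass saw earlier that pseudo r10 currently holds 2
print(simplify(("plus", "r10", 3)))                        # stays symbolic
print(fold_with_equivalences(("plus", "r10", 3), equiv))   # -> 5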

> and those that 
> depend upon the CSE'ing different functional forms are left.  For
> example, one of the "???" comments in your patch recommends that that
> transformation be moved to combine.c, which is probably as true today
> as it was when it was first written.

That bit was added in revision 1.280 by you, when you yourself committed
a patch to catch this case by.... Paolo Bonzini!
So you are opposing the removal of code that he contributed himself ;-)
See http://gcc.gnu.org/cgi-bin/cvsweb.cgi/gcc/gcc/cse.c?annotate=1.280
The case Paolo wanted to catch back then is now optimized in the tree
optimizers, so there should be no need to move it to combine.  

> Thoughts?  Opinions?

I think you're being too conservative.  Conservatism is a virtue for
a compiler writer, but you have to be realistic too: there is a reason
why we have a new optimization framework.  Parts of the RTL path _are_
just dead code now.  That was the whole point IIUC.

> I'm not trying to be a road block on the path of progress, but I'd
> just like to get a feel for where we intend to draw the line with
> these protocols.  If taken to extremes, these policies can clearly
> be dangerous (if none of these cc1 files contains K&R prototypes,
> perhaps we could drop parser support for pre-ANSI C, etc...).

That is something completely different.  It is not even in the same
category: you're comparing breaking standards conformance with removing
dead code.

Gr.
Steven

