[Bug tree-optimization/83253] -ftree-slsr causes performance regression

wschmidt at gcc dot gnu.org <gcc-bugzilla@gcc.gnu.org>
Wed Dec 13 22:31:00 GMT 2017


https://gcc.gnu.org/bugzilla/show_bug.cgi?id=83253

--- Comment #9 from Bill Schmidt <wschmidt at gcc dot gnu.org> ---
I was able to build an i386 cross compiler, and this wasn't sufficient to solve
the problem.  I see:

Processing dependency tree rooted at 1.
Inserting initializer: slsr_10 = scale_7(D) * 3;

Increment vector:

  0  increment:   1
     count:       0
     cost:        1000
     initializer: 

  1  increment:   3
     count:       1
     cost:        -4
     initializer: slsr_10

Replacing: _4 = ptr_6(D) + _3;
With: _4 = _1 + slsr_10;

The cost model thinks the replacement is still profitable: the chosen increment
(3) has a negative cost (-4), i.e., a net savings even after inserting the
initializer.  I'll look into what the i386 backend is reporting for these costs
tomorrow.
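For context, the SLSR (straight-line strength reduction) pass rewrites related
address computations that differ by a known multiple of a common stride, so a
multiply can be replaced by reusing an inserted initializer such as
slsr_10 = scale_7(D) * 3 above.  A minimal C sketch of the kind of code shape
that produces this dump (a hypothetical example, not the PR's actual testcase;
the function and variable names are invented):

```c
/* Hypothetical sketch: two loads whose offsets are scale * 1 and
 * scale * 3 from the same base.  SLSR may insert an initializer
 * (e.g. slsr = scale * 3) and rewrite the second address as
 * base + slsr instead of recomputing the multiply.  Whether that
 * is a win depends on the target's cost model, which is what this
 * bug is about.  */
int sum_two(int *ptr, int scale)
{
    int a = ptr[scale];      /* address: ptr + scale * 1 */
    int b = ptr[scale * 3];  /* address: ptr + scale * 3 */
    return a + b;
}
```

Compiling a fragment like this with -O2 -fdump-tree-slsr-details is one way to
inspect the increment vector and cost decisions the pass prints.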
