[Bug tree-optimization/56982] [4.8/4.9 Regression] Bad optimization with setjmp()

rguenther at suse dot de gcc-bugzilla@gcc.gnu.org
Wed Apr 17 09:07:00 GMT 2013


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=56982

--- Comment #7 from rguenther at suse dot de <rguenther at suse dot de> 2013-04-17 09:07:10 UTC ---
On Wed, 17 Apr 2013, jakub at gcc dot gnu.org wrote:

> 
> http://gcc.gnu.org/bugzilla/show_bug.cgi?id=56982
> 
> --- Comment #6 from Jakub Jelinek <jakub at gcc dot gnu.org> 2013-04-17 08:56:00 UTC ---
> I don't see how we could declare the testcase invalid, why would n need to be
> volatile?  It isn't live across the setjmp call, it is even declared after the
> setjmp call, and it is always initialized after the setjmp call.

Then there is no other way but to model the abnormal control flow
properly.  Even simple CSE can break things otherwise.  Consider

int tmp = a + 1;
setjmp ()
int tmp2 = a + 1;

even on RTL, CSE would break that, no?  setjmp doesn't even
forcefully start a new basic block.
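
To make that concrete, here is a minimal self-contained sketch of the
hazard (this is not the testcase from this PR; the jmp_buf name and the
constants are made up): if CSE reuses the first a + 1 for the second
one, the value a gets on the abnormal path is lost.

#include <setjmp.h>
#include <stdio.h>

static jmp_buf env;   /* made-up name for this sketch */
static int a;

int
main (void)
{
  a = 1;
  int tmp = a + 1;              /* 2 */

  if (setjmp (env) == 0)
    {
      a = 10;                   /* clobber a on the abnormal path */
      longjmp (env, 1);
    }

  /* Second return from setjmp: a is 10 now, so tmp2 must be 11.
     Reusing the value computed for tmp before the call would
     wrongly yield 2 here.  */
  int tmp2 = a + 1;
  printf ("%d %d\n", tmp, tmp2);
  return 0;
}

The expected output is "2 11"; a transform that carries a + 1 across
the setjmp call prints "2 2" instead.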

Hmm, maybe doing just that, i.e. starting a new BB for all returns-twice
calls and adding an abnormal edge from the function entry, is enough to
avoid all possibly dangerous transforms.
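
As a side note, setjmp is not the only such call: vfork and anything the
user marks with GCC's returns_twice attribute fall in the same class, so
the new BB plus abnormal entry edge would have to be created for all of
them.  A hedged sketch of such a declaration (my_setjmp is a made-up
name):

/* Sketch only: a function declared like this is treated as possibly
   returning twice, the same way setjmp is.  Under the idea above,
   every call to it would start a fresh basic block with an abnormal
   edge from the function entry, so no expression may be carried
   across the call.  */
extern int my_setjmp (void *ctx) __attribute__ ((returns_twice));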

Richard.


