PR rtl-optimization/47925: deleting trivially live instructions

Richard Sandiford <richard.sandiford@linaro.org>
Mon Feb 28 16:48:00 GMT 2011


If we have a sequence like:

    (set (reg A) ...)
    (set (reg A) (mem/v (reg A))) [REG_DEAD A]

then a long-standing bug in delete_trivially_dead_insns will cause it to
delete the first instruction.  count_reg_usage counts out how many times
each register is used in the entire function, but it tries to ignore
uses in self-modifications of the form (set (reg A) (... (reg A) ...)).
The problem is that it is ignoring such uses even if the insn contains
an access to volatile memory.  insn_live_p rightly returns true for
such insns, regardless of register counts, so we end up keeping the
self-modification but deleting the instruction that sets its input.
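For concreteness, RTL of that shape typically comes from a pointer chase
through volatile memory, much like the loop in the new testcase below.
A minimal C sketch (the function name is mine, and the correspondence to
the RTL above is only approximate; the exact codegen depends on target
and options):

    struct s { volatile struct s *next; };

    volatile struct s *
    chase (volatile struct s *ptr)
    {
      /* The load of ptr->next is a volatile MEM whose address uses
	 ptr's register, and the result overwrites that same register,
	 so the set has the self-modifying shape shown above.  */
      ptr = ptr->next;
      return ptr;
    }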

count_reg_usage is set up to predict when insn_live_p would always
return true regardless of register usage.  It does this correctly
for instructions that might throw an exception, and for volatile asms,
but it fails to do so for other side effects that insn_live_p
detects, such as volatile MEMs and pre-/post-modifications.
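The fix is therefore to make count_reg_usage's notion of "cannot be
deleted" match insn_live_p's.  A simplified sketch of the intended check,
using the same predicates as the patch below (variable names simplified,
and the surrounding context omitted):

    /* insn_live_p keeps any insn that can trap or that has side effects
       (volatile MEMs, volatile asms, auto-increments and the like), so
       for such insns every register use must be counted, including uses
       that feed a self-modifying SET_DEST.  */
    if (insn_could_throw_p (insn) || side_effects_p (PATTERN (insn)))
      dest = pc_rtx;

Since side_effects_p already covers volatile asms, the existing
ASM_OPERANDS special case becomes redundant and is removed.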

Bootstrapped & regression-tested on x86_64-linux-gnu.  OK to install?

The bug showed up as a wrong-code regression in a modified version of Qt,
but I haven't yet found a testcase that fails with 4.6 but passes with
an older GCC.  Is the patch OK now regardless, or should it wait for 4.7?

Richard


gcc/
	PR rtl-optimization/47925
	* cse.c (count_reg_usage): Don't ignore the SET_DEST of instructions
	with side effects.  Remove the more-specific check for volatile asms.

gcc/testsuite/
	PR rtl-optimization/47925
	* gcc.c-torture/execute/pr47925.c: New test.

Index: gcc/cse.c
===================================================================
--- gcc/cse.c	2010-10-18 10:53:30.000000000 +0100
+++ gcc/cse.c	2011-02-28 15:22:51.000000000 +0000
@@ -6629,9 +6629,10 @@ count_reg_usage (rtx x, int *counts, rtx
     case CALL_INSN:
     case INSN:
     case JUMP_INSN:
-      /* We expect dest to be NULL_RTX here.  If the insn may trap, mark
-         this fact by setting DEST to pc_rtx.  */
-      if (insn_could_throw_p (x))
+      /* We expect dest to be NULL_RTX here.  If the insn may trap,
+	 or if it cannot be deleted due to side-effects, mark this fact
+	 by setting DEST to pc_rtx.  */
+      if (insn_could_throw_p (x) || side_effects_p (PATTERN (x)))
 	dest = pc_rtx;
       if (code == CALL_INSN)
 	count_reg_usage (CALL_INSN_FUNCTION_USAGE (x), counts, dest, incr);
@@ -6671,10 +6672,6 @@ count_reg_usage (rtx x, int *counts, rtx
       return;
 
     case ASM_OPERANDS:
-      /* If the asm is volatile, then this insn cannot be deleted,
-	 and so the inputs *must* be live.  */
-      if (MEM_VOLATILE_P (x))
-	dest = NULL_RTX;
       /* Iterate over just the inputs, not the constraints as well.  */
       for (i = ASM_OPERANDS_INPUT_LENGTH (x) - 1; i >= 0; i--)
 	count_reg_usage (ASM_OPERANDS_INPUT (x, i), counts, dest, incr);
Index: gcc/testsuite/gcc.c-torture/execute/pr47925.c
===================================================================
--- /dev/null	2011-02-21 12:47:04.267827113 +0000
+++ gcc/testsuite/gcc.c-torture/execute/pr47925.c	2011-02-28 16:05:17.000000000 +0000
@@ -0,0 +1,24 @@
+struct s { volatile struct s *next; };
+
+void __attribute__((noinline))
+bar (int ignored, int n)
+{
+  asm volatile ("");
+}
+
+int __attribute__((noinline))
+foo (volatile struct s *ptr, int n)
+{
+  int i;
+
+  bar (0, n);
+  for (i = 0; i < n; i++)
+    ptr = ptr->next;
+}
+
+int main (void)
+{
+  volatile struct s rec = { &rec };
+  foo (&rec, 10);
+  return 0;
+}
