This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.



[Bug tree-optimization/33344] if(!z) {a.b = x->b + 1; z = &a; } else if (!x) { a.b = z->b+1; x = &a; } use z->b and x->b; is not optimized



------- Comment #2 from rguenth at gcc dot gnu dot org  2007-09-08 17:49 -------
So the following brute-force approach makes PRE do insertion for _all_
loads.  It then produces

g (b, c)
{
  int prephitmp.15;
  int prephitmp.10;
  struct f g;

<bb 2>:
  if (b == 0B)
    goto <bb 3>;
  else
    goto <bb 4>;

<bb 3>:
  prephitmp.10 = c->a + 1;
  g.a = prephitmp.10;
  prephitmp.15 = c->a;
  goto <bb 7>;

<bb 4>:
  if (c == 0B)
    goto <bb 6>;
  else
    goto <bb 5>;

<bb 5>:
  prephitmp.15 = c->a;
  prephitmp.10 = b->a;
  goto <bb 7>;

<bb 6>:
  prephitmp.15 = b->a + 1;
  g.a = prephitmp.15;
  prephitmp.10 = b->a;

<bb 7>:
  return prephitmp.10 + prephitmp.15;

}
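For reference, a C testcase along the following lines should reproduce the dump above.  This is a reconstruction from the bug summary and the GIMPLE (field and parameter names follow the dump; the original testcase may differ):

```c
struct f { int a; };

/* If one pointer is NULL, redirect it at a local whose field is the
   other pointer's field plus one; then sum both fields.  PRE should
   see that b->a and c->a are only partially redundant here.  */
int
g (struct f *b, struct f *c)
{
  struct f a;
  if (!b)
    {
      a.a = c->a + 1;
      b = &a;
    }
  else if (!c)
    {
      a.a = b->a + 1;
      c = &a;
    }
  return b->a + c->a;
}
```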


Index: tree-ssa-pre.c
===================================================================
--- tree-ssa-pre.c      (revision 128275)
+++ tree-ssa-pre.c      (working copy)
@@ -2598,6 +2598,7 @@ do_regular_insertion (basic_block block,
          bool by_some = false;
          bool cant_insert = false;
          bool all_same = true;
+         bool is_load = false;
          tree first_s = NULL;
          edge pred;
          basic_block bprime;
@@ -2614,6 +2615,10 @@ do_regular_insertion (basic_block block,
              continue;
            }

+         if (TREE_CODE (expr) == INDIRECT_REF
+             || handled_component_p (expr))
+           is_load = true;
+
          avail = XCNEWVEC (tree, last_basic_block);
          FOR_EACH_EDGE (pred, ei, block->preds)
            {
@@ -2672,7 +2677,7 @@ do_regular_insertion (basic_block block,
             already existing along every predecessor, and
             it's defined by some predecessor, it is
             partially redundant.  */
-         if (!cant_insert && !all_same && by_some)
+         if (!cant_insert && !all_same && (by_some || is_load))
            {
              if (insert_into_preds_of_block (block, get_expression_id (expr),
                                              avail))


The is_load check should really walk the leading expressions looking for
an inner INDIRECT_REF.  And the edges should be walked to look for at
least one ADDR_EXPR among the affected PHI arguments.
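The first half of that refinement might look like the following sketch against GCC's internal tree API (hypothetical helper name; not part of the patch above):

```c
/* Sketch: strip handled components (COMPONENT_REF, ARRAY_REF, ...)
   and test whether the base object is an INDIRECT_REF, instead of
   testing only the outermost tree code as the patch above does.  */
static bool
expr_is_load_p (tree expr)
{
  while (handled_component_p (expr))
    expr = TREE_OPERAND (expr, 0);
  return TREE_CODE (expr) == INDIRECT_REF;
}
```

The second half (checking the PHI arguments on each incoming edge for an ADDR_EXPR) would then gate the insertion per edge rather than unconditionally.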


-- 


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=33344

