This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.



[PATCH][RFC][4.4] Add may-alias oracle, make SCCVN use it


This patch introduces a may-alias oracle and changes SCCVN to use it to
find more memory reference redundancies.  (Sorry for the other stuff in
the patch - it mainly moves functions around, though I am not entirely
happy with the chosen target files in the case of tree-ssa-alias.c and
tree-ssa-structalias.h either)

Basically we get

 - refs_may_alias_p, a predicate that tells whether two memory
   references may alias

 - refs_must_alias_p, a predicate that tells whether two memory
   references must alias

 - ref_killed_by_def_p, a predicate that tells whether a memory
   location is killed by a specified definition

where the latter two are only prototyped, with an obvious implementation;
ref_killed_by_def_p is introduced because DSE seems to request exactly
this kind of query.

refs_may_alias_p implements simple TBAA and base + offset memory
reference disambiguation and could be extended to use points-to
information if available (though this would add an extra cost to
the already costly function).
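
For illustration, the two disambiguation steps can be sketched in plain,
self-contained C (ranges_overlap_p below mirrors the helper the patch adds
to tree-flow-inline.h; the struct ref and the alias-set check are made-up
stand-ins for GCC's trees and TBAA, not the real interface):

```c
#include <stdbool.h>

/* Open-ended ranges are encoded with size == (unsigned long)-1,
   as in the patch.  */
static bool
ranges_overlap_p (unsigned long pos1, unsigned long size1,
                  unsigned long pos2, unsigned long size2)
{
  if (pos1 >= pos2
      && (size2 == (unsigned long) -1 || pos1 < pos2 + size2))
    return true;
  if (pos2 >= pos1
      && (size1 == (unsigned long) -1 || pos2 < pos1 + size1))
    return true;
  return false;
}

/* A stripped-down memory reference: base object id, alias set,
   and an accessed bit range.  The real oracle works on trees.  */
struct ref
{
  int base;            /* id of the base object */
  int alias_set;       /* TBAA alias set; 0 conflicts with everything */
  unsigned long offset, size;
};

/* Sketch of refs_may_alias_p: TBAA first, then base + offset.  */
static bool
refs_may_alias_p_sketch (const struct ref *a, const struct ref *b)
{
  /* Different non-zero alias sets cannot conflict (simplified TBAA;
     the real check uses alias_sets_conflict_p).  */
  if (a->alias_set != 0 && b->alias_set != 0
      && a->alias_set != b->alias_set)
    return false;
  /* Same base: disambiguate via the accessed ranges.  */
  if (a->base == b->base)
    return ranges_overlap_p (a->offset, a->size, b->offset, b->size);
  /* Distinct bases may still alias, e.g. through pointers;
     be conservative.  */
  return true;
}
```

This is where points-to information would slot in: the final conservative
`return true` could be refined when the bases are known not to point to
the same object.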

The SCCVN use is straightforward - at reference lookup we can walk the
virtual use-def chains and continue the lookup with the def's vuses in
case the def cannot alias our reference.  This results in up to
5% runtime improvement for tramp3d and 4% for sixtrack.  Compile-time
and memory-usage impact is in the noise _except_ for -fprofile-use
and -fprofile-generate runs on tramp3d (though the effects there are
not what one would expect: we see runtime regressions for
profile-generate and compile-time and text-size regressions for
profile-use).  This remains to be investigated.  Other usual suspects
such as DLV, mico or boost-based testcases show no effect.
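
The chain walk can be illustrated with a toy lookup over a linear list of
stores (purely illustrative; the names and data structures are invented
and do not match GCC's internals, where vn_reference_lookup recurses with
the def stmt's vuses instead of following an explicit prev pointer):

```c
#include <stddef.h>

/* A toy def chain: each store records where it wrote and what,
   plus a link to the previous store in program order.  */
struct store
{
  int base;                 /* base object written */
  unsigned long offset, size;
  int value;                /* value stored */
  struct store *prev_def;   /* earlier def on the virtual chain */
};

static int
overlaps (unsigned long pos1, unsigned long size1,
          unsigned long pos2, unsigned long size2)
{
  return (pos1 >= pos2 && pos1 < pos2 + size2)
         || (pos2 >= pos1 && pos2 < pos1 + size1);
}

/* Look up the value of a load at (base, offset, size): if the
   nearest def cannot alias the load, keep walking to its def,
   just as the patched vn_reference_lookup continues the lookup
   with the def stmt's vuses.  Returns -1 if nothing matches.  */
static int
lookup_ref (struct store *def, int base,
            unsigned long offset, unsigned long size)
{
  for (; def; def = def->prev_def)
    {
      int may_alias = (def->base == base
                       && overlaps (def->offset, def->size,
                                    offset, size));
      if (!may_alias)
        continue;          /* skip the non-aliasing def */
      if (def->base == base && def->offset == offset
          && def->size == size)
        return def->value; /* exact match: redundancy found */
      return -1;           /* partial overlap: give up */
    }
  return -1;
}
```

With the ssa-fre-7.c testcase in mind: the store to S[5].y sits between
the store to S[5].x and the load of S[5].x, but since it cannot alias the
load, the walk steps over it and finds the redundant value 4.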

If SCCVN is fixed to allow value-numbering of global initializers we
can finally get rid of store-ccp.  Zdenek has a similar patch for
updating LIM and SM to use an alias oracle (I have updated it to
current trunk).  I also think that some major pass re-shuffling
is needed at some point (it looks like it is never the right time
to do this, though ;))

Oh, and this fixes PR34172, which asked for store-copyprop to be
extended (but store-copyprop was removed).  It would also fix a
similar request for store-ccp.

Thanks,
Richard.

2007-12-05  Richard Guenther  <rguenther@suse.de>

	* tree-ssa-structalias.c (get_constraint_for_component_ref):
	Use ranges_overlap_p.
	(offset_overlaps_with_access): Rename
	to ranges_overlap_p and move ...
	* tree-flow-inline.h (ranges_overlap_p): ... here.

	* expr.c (get_inner_reference, handled_component_p): Move ...
	* tree.c (get_inner_reference, handled_component_p): ... here.
	* tree-dfa.c (get_ref_base_and_extent): Move ...
	* tree.c (get_ref_base_and_extent): ... here.
	* tree.h (get_inner_reference, handled_component_p): Update
	comments.
	(get_ref_base_and_extent): Declare.

	* tree.h (record_component_aliases, get_alias_set,
	alias_sets_conflict_p, alias_sets_must_conflict_p,
	objects_must_conflict_p): Move declarations ...
	* alias.h (record_component_aliases, get_alias_set,
	alias_sets_conflict_p, alias_sets_must_conflict_p,
	objects_must_conflict_p): ... here.
	Include coretypes.h.
	* Makefile.in (ALIAS_H): Add coretypes.h dependency.

	PR tree-optimization/34172
	* tree-ssa-alias.c (refs_may_alias_p): New function.
	(refs_must_alias_p): Likewise.
	(ref_killed_by_def_p): Likewise.
	* tree-ssa-structalias.h (refs_may_alias_p): Declare.
	(refs_must_alias_p): Likewise.
	(ref_killed_by_def_p): Likewise.
	* tree-dfa.c (get_single_def_stmt): New function.
	* tree-flow.h (get_single_def_stmt): Declare.
	* tree-ssa-sccvn.c: Include tree-ssa-structalias.h.
	(vn_reference_lookup): Walk the virtual use-def chain to
	continue searching for a match if the def does not alias the
	reference we are looking for.
	* tree-ssa-dse.c: Include tree-ssa-structalias.h.
	(get_kill_of_stmt_lhs): Use ref_killed_by_def_p.
	* Makefile.in (tree-ssa-sccvn.o): Add tree-ssa-structalias.h
	dependency.
	(tree-ssa-dse.o): Likewise.

	* gcc.dg/tree-ssa/ssa-fre-7.c: New testcase.
	* gcc.dg/tree-ssa/20031106-4.c: Remove XFAIL.


Index: trunk/gcc/tree-flow-inline.h
===================================================================
*** trunk.orig/gcc/tree-flow-inline.h	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/tree-flow-inline.h	2007-12-06 11:47:12.000000000 +0100
*************** var_can_have_subvars (const_tree v)
*** 1720,1726 ****
    return false;
  }
  
!   
  /* Return true if OFFSET and SIZE define a range that overlaps with some
     portion of the range of SV, a subvar.  If there was an exact overlap,
     *EXACT will be set to true upon return. */
--- 1720,1749 ----
    return false;
  }
  
! 
! /* Return true, if the two ranges [POS1, SIZE1] and [POS2, SIZE2]
!    overlap.  SIZE1 and/or SIZE2 can be (unsigned)-1 in which case the
!    range is open-ended.  Otherwise return false.  */
! 
! static inline bool
! ranges_overlap_p (unsigned HOST_WIDE_INT pos1,
! 		  unsigned HOST_WIDE_INT size1,
! 		  unsigned HOST_WIDE_INT pos2,
! 		  unsigned HOST_WIDE_INT size2)
! {
!   if (pos1 >= pos2
!       && (size2 == (unsigned HOST_WIDE_INT)-1
! 	  || pos1 < (pos2 + size2)))
!     return true;
!   if (pos2 >= pos1
!       && (size1 == (unsigned HOST_WIDE_INT)-1
! 	  || pos2 < (pos1 + size1)))
!     return true;
! 
!   return false;
! }
! 
! 
  /* Return true if OFFSET and SIZE define a range that overlaps with some
     portion of the range of SV, a subvar.  If there was an exact overlap,
     *EXACT will be set to true upon return. */
Index: trunk/gcc/tree-ssa-sccvn.c
===================================================================
*** trunk.orig/gcc/tree-ssa-sccvn.c	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/tree-ssa-sccvn.c	2007-12-06 14:29:05.000000000 +0100
*************** along with GCC; see the file COPYING3.  
*** 44,49 ****
--- 44,50 ----
  #include "cfgloop.h"
  #include "tree-ssa-propagate.h"
  #include "tree-ssa-sccvn.h"
+ #include "tree-ssa-structalias.h"
  
  /* This algorithm is based on the SCC algorithm presented by Keith
     Cooper and L. Taylor Simpson in "SCC-Based Value numbering"
*************** vn_reference_lookup (tree op, VEC (tree,
*** 649,654 ****
--- 650,656 ----
  {
    void **slot;
    struct vn_reference_s vr1;
+   tree def_stmt;
  
    vr1.vuses = valueize_vuses (vuses);
    vr1.operands = valueize_refs (shared_reference_ops_from_ref (op));
*************** vn_reference_lookup (tree op, VEC (tree,
*** 658,667 ****
    if (!slot && current_info == optimistic_info)
      slot = htab_find_slot_with_hash (valid_info->references, &vr1, vr1.hashcode,
  				     NO_INSERT);
!   if (!slot)
!     return NULL_TREE;
  
!   return ((vn_reference_t)*slot)->result;
  }
  
  /* Insert OP into the current hash table with a value number of
--- 660,683 ----
    if (!slot && current_info == optimistic_info)
      slot = htab_find_slot_with_hash (valid_info->references, &vr1, vr1.hashcode,
  				     NO_INSERT);
!   if (slot)
!     return ((vn_reference_t)*slot)->result;
  
!   /* If there is a single defining statement for all virtual uses, we can
!      use that, following virtual use-def chains.  */
!   if (vr1.vuses
!       && VEC_length (tree, vr1.vuses) >= 1
!       && (def_stmt = get_single_def_stmt (vr1.vuses))
!       && TREE_CODE (def_stmt) == GIMPLE_MODIFY_STMT
!       /* If there is a call involved, op must be assumed to
! 	 be clobbered.  */
!       && !get_call_expr_in (def_stmt)
!       /* If the stored location does not alias the op we are looking
! 	 for, do the lookup with the def stmts vuses.  */
!       && !refs_may_alias_p (op, GIMPLE_STMT_OPERAND (def_stmt, 0)))
!     return vn_reference_lookup (op, shared_vuses_from_stmt (def_stmt));
! 
!   return NULL_TREE;
  }
  
  /* Insert OP into the current hash table with a value number of
Index: trunk/gcc/tree-ssa-structalias.c
===================================================================
*** trunk.orig/gcc/tree-ssa-structalias.c	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/tree-ssa-structalias.c	2007-12-06 11:47:12.000000000 +0100
*************** bitpos_of_field (const tree fdecl)
*** 2630,2654 ****
  }
  
  
- /* Return true if an access to [ACCESSPOS, ACCESSSIZE]
-    overlaps with a field at [FIELDPOS, FIELDSIZE] */
- 
- static bool
- offset_overlaps_with_access (const unsigned HOST_WIDE_INT fieldpos,
- 			     const unsigned HOST_WIDE_INT fieldsize,
- 			     const unsigned HOST_WIDE_INT accesspos,
- 			     const unsigned HOST_WIDE_INT accesssize)
- {
-   if (fieldpos == accesspos && fieldsize == accesssize)
-     return true;
-   if (accesspos >= fieldpos && accesspos < (fieldpos + fieldsize))
-     return true;
-   if (accesspos < fieldpos && (accesspos + accesssize > fieldpos))
-     return true;
- 
-   return false;
- }
- 
  /* Given a COMPONENT_REF T, return the constraint_expr for it.  */
  
  static void
--- 2630,2635 ----
*************** get_constraint_for_component_ref (tree t
*** 2713,2720 ****
  	  varinfo_t curr;
  	  for (curr = get_varinfo (result->var); curr; curr = curr->next)
  	    {
! 	      if (offset_overlaps_with_access (curr->offset, curr->size,
! 					       result->offset, bitmaxsize))
  		{
  		  result->var = curr->id;
  		  break;
--- 2694,2701 ----
  	  varinfo_t curr;
  	  for (curr = get_varinfo (result->var); curr; curr = curr->next)
  	    {
! 	      if (ranges_overlap_p (curr->offset, curr->size,
! 				    result->offset, bitmaxsize))
  		{
  		  result->var = curr->id;
  		  break;
Index: trunk/gcc/testsuite/gcc.dg/tree-ssa/ssa-fre-7.c
===================================================================
*** /dev/null	1970-01-01 00:00:00.000000000 +0000
--- trunk/gcc/testsuite/gcc.dg/tree-ssa/ssa-fre-7.c	2007-12-06 11:47:12.000000000 +0100
***************
*** 0 ****
--- 1,26 ----
+ /* { dg-do compile } */
+ /* { dg-options "-O -fdump-tree-fre-details" } */
+ 
+ struct
+ {
+   int x;
+   int y;
+ } S[100];
+ 
+ int z[100];
+ 
+ int
+ foo (int y)
+ {
+   int x;
+ 
+   S[5].x = 4;
+   S[5].y = 0;
+ 
+   x = S[5].x;
+ 
+   return (x);
+ }
+ 
+ /* { dg-final { scan-tree-dump "Replaced S\\\[5\\\].x with 4" "fre" } } */
+ /* { dg-final { cleanup-tree-dump "fre" } } */
Index: trunk/gcc/testsuite/gcc.dg/tree-ssa/20031106-4.c
===================================================================
*** trunk.orig/gcc/testsuite/gcc.dg/tree-ssa/20031106-4.c	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/testsuite/gcc.dg/tree-ssa/20031106-4.c	2007-12-06 11:47:12.000000000 +0100
*************** void foo (struct s*  r)
*** 26,30 ****
  }
  
  /* There should be no link_error calls.  */
! /* { dg-final { scan-tree-dump-times "link_error" 0 "optimized" { xfail *-*-* } } } */
  /* { dg-final { cleanup-tree-dump "optimized" } } */
--- 26,30 ----
  }
  
  /* There should be no link_error calls.  */
! /* { dg-final { scan-tree-dump-times "link_error" 0 "optimized" } } */
  /* { dg-final { cleanup-tree-dump "optimized" } } */
Index: trunk/gcc/expr.c
===================================================================
*** trunk.orig/gcc/expr.c	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/expr.c	2007-12-06 11:47:12.000000000 +0100
*************** store_field (rtx target, HOST_WIDE_INT b
*** 5827,6028 ****
      }
  }
  
- /* Given an expression EXP that may be a COMPONENT_REF, a BIT_FIELD_REF,
-    an ARRAY_REF, or an ARRAY_RANGE_REF, look for nested operations of these
-    codes and find the ultimate containing object, which we return.
- 
-    We set *PBITSIZE to the size in bits that we want, *PBITPOS to the
-    bit position, and *PUNSIGNEDP to the signedness of the field.
-    If the position of the field is variable, we store a tree
-    giving the variable offset (in units) in *POFFSET.
-    This offset is in addition to the bit position.
-    If the position is not variable, we store 0 in *POFFSET.
- 
-    If any of the extraction expressions is volatile,
-    we store 1 in *PVOLATILEP.  Otherwise we don't change that.
- 
-    If the field is a bit-field, *PMODE is set to VOIDmode.  Otherwise, it
-    is a mode that can be used to access the field.  In that case, *PBITSIZE
-    is redundant.
- 
-    If the field describes a variable-sized object, *PMODE is set to
-    VOIDmode and *PBITSIZE is set to -1.  An access cannot be made in
-    this case, but the address of the object can be found.
- 
-    If KEEP_ALIGNING is true and the target is STRICT_ALIGNMENT, we don't
-    look through nodes that serve as markers of a greater alignment than
-    the one that can be deduced from the expression.  These nodes make it
-    possible for front-ends to prevent temporaries from being created by
-    the middle-end on alignment considerations.  For that purpose, the
-    normal operating mode at high-level is to always pass FALSE so that
-    the ultimate containing object is really returned; moreover, the
-    associated predicate handled_component_p will always return TRUE
-    on these nodes, thus indicating that they are essentially handled
-    by get_inner_reference.  TRUE should only be passed when the caller
-    is scanning the expression in order to build another representation
-    and specifically knows how to handle these nodes; as such, this is
-    the normal operating mode in the RTL expanders.  */
- 
- tree
- get_inner_reference (tree exp, HOST_WIDE_INT *pbitsize,
- 		     HOST_WIDE_INT *pbitpos, tree *poffset,
- 		     enum machine_mode *pmode, int *punsignedp,
- 		     int *pvolatilep, bool keep_aligning)
- {
-   tree size_tree = 0;
-   enum machine_mode mode = VOIDmode;
-   tree offset = size_zero_node;
-   tree bit_offset = bitsize_zero_node;
- 
-   /* First get the mode, signedness, and size.  We do this from just the
-      outermost expression.  */
-   if (TREE_CODE (exp) == COMPONENT_REF)
-     {
-       size_tree = DECL_SIZE (TREE_OPERAND (exp, 1));
-       if (! DECL_BIT_FIELD (TREE_OPERAND (exp, 1)))
- 	mode = DECL_MODE (TREE_OPERAND (exp, 1));
- 
-       *punsignedp = DECL_UNSIGNED (TREE_OPERAND (exp, 1));
-     }
-   else if (TREE_CODE (exp) == BIT_FIELD_REF)
-     {
-       size_tree = TREE_OPERAND (exp, 1);
-       *punsignedp = BIT_FIELD_REF_UNSIGNED (exp);
- 
-       /* For vector types, with the correct size of access, use the mode of
- 	 inner type.  */
-       if (TREE_CODE (TREE_TYPE (TREE_OPERAND (exp, 0))) == VECTOR_TYPE
- 	  && TREE_TYPE (exp) == TREE_TYPE (TREE_TYPE (TREE_OPERAND (exp, 0)))
- 	  && tree_int_cst_equal (size_tree, TYPE_SIZE (TREE_TYPE (exp))))
-         mode = TYPE_MODE (TREE_TYPE (exp));
-     }
-   else
-     {
-       mode = TYPE_MODE (TREE_TYPE (exp));
-       *punsignedp = TYPE_UNSIGNED (TREE_TYPE (exp));
- 
-       if (mode == BLKmode)
- 	size_tree = TYPE_SIZE (TREE_TYPE (exp));
-       else
- 	*pbitsize = GET_MODE_BITSIZE (mode);
-     }
- 
-   if (size_tree != 0)
-     {
-       if (! host_integerp (size_tree, 1))
- 	mode = BLKmode, *pbitsize = -1;
-       else
- 	*pbitsize = tree_low_cst (size_tree, 1);
-     }
- 
-   *pmode = mode;
- 
-   /* Compute cumulative bit-offset for nested component-refs and array-refs,
-      and find the ultimate containing object.  */
-   while (1)
-     {
-       switch (TREE_CODE (exp))
- 	{
- 	case BIT_FIELD_REF:
- 	  bit_offset = size_binop (PLUS_EXPR, bit_offset,
- 				   TREE_OPERAND (exp, 2));
- 	  break;
- 
- 	case COMPONENT_REF:
- 	  {
- 	    tree field = TREE_OPERAND (exp, 1);
- 	    tree this_offset = component_ref_field_offset (exp);
- 
- 	    /* If this field hasn't been filled in yet, don't go past it.
- 	       This should only happen when folding expressions made during
- 	       type construction.  */
- 	    if (this_offset == 0)
- 	      break;
- 
- 	    offset = size_binop (PLUS_EXPR, offset, this_offset);
- 	    bit_offset = size_binop (PLUS_EXPR, bit_offset,
- 				     DECL_FIELD_BIT_OFFSET (field));
- 
- 	    /* ??? Right now we don't do anything with DECL_OFFSET_ALIGN.  */
- 	  }
- 	  break;
- 
- 	case ARRAY_REF:
- 	case ARRAY_RANGE_REF:
- 	  {
- 	    tree index = TREE_OPERAND (exp, 1);
- 	    tree low_bound = array_ref_low_bound (exp);
- 	    tree unit_size = array_ref_element_size (exp);
- 
- 	    /* We assume all arrays have sizes that are a multiple of a byte.
- 	       First subtract the lower bound, if any, in the type of the
- 	       index, then convert to sizetype and multiply by the size of
- 	       the array element.  */
- 	    if (! integer_zerop (low_bound))
- 	      index = fold_build2 (MINUS_EXPR, TREE_TYPE (index),
- 				   index, low_bound);
- 
- 	    offset = size_binop (PLUS_EXPR, offset,
- 			         size_binop (MULT_EXPR,
- 					     fold_convert (sizetype, index),
- 					     unit_size));
- 	  }
- 	  break;
- 
- 	case REALPART_EXPR:
- 	  break;
- 
- 	case IMAGPART_EXPR:
- 	  bit_offset = size_binop (PLUS_EXPR, bit_offset,
- 				   bitsize_int (*pbitsize));
- 	  break;
- 
- 	case VIEW_CONVERT_EXPR:
- 	  if (keep_aligning && STRICT_ALIGNMENT
- 	      && (TYPE_ALIGN (TREE_TYPE (exp))
- 	       > TYPE_ALIGN (TREE_TYPE (TREE_OPERAND (exp, 0))))
- 	      && (TYPE_ALIGN (TREE_TYPE (TREE_OPERAND (exp, 0)))
- 		  < BIGGEST_ALIGNMENT)
- 	      && (TYPE_ALIGN_OK (TREE_TYPE (exp))
- 		  || TYPE_ALIGN_OK (TREE_TYPE (TREE_OPERAND (exp, 0)))))
- 	    goto done;
- 	  break;
- 
- 	default:
- 	  goto done;
- 	}
- 
-       /* If any reference in the chain is volatile, the effect is volatile.  */
-       if (TREE_THIS_VOLATILE (exp))
- 	*pvolatilep = 1;
- 
-       exp = TREE_OPERAND (exp, 0);
-     }
-  done:
- 
-   /* If OFFSET is constant, see if we can return the whole thing as a
-      constant bit position.  Make sure to handle overflow during
-      this conversion.  */
-   if (host_integerp (offset, 0))
-     {
-       double_int tem = double_int_mul (tree_to_double_int (offset),
- 				       uhwi_to_double_int (BITS_PER_UNIT));
-       tem = double_int_add (tem, tree_to_double_int (bit_offset));
-       if (double_int_fits_in_shwi_p (tem))
- 	{
- 	  *pbitpos = double_int_to_shwi (tem);
- 	  *poffset = NULL_TREE;
- 	  return exp;
- 	}
-     }
- 
-   /* Otherwise, split it up.  */
-   *pbitpos = tree_low_cst (bit_offset, 0);
-   *poffset = offset;
- 
-   return exp;
- }
- 
  /* Given an expression EXP that may be a COMPONENT_REF or an ARRAY_REF,
     look for whether EXP or any nested component-refs within EXP is marked
     as PACKED.  */
--- 5827,5832 ----
*************** component_ref_field_offset (tree exp)
*** 6156,6182 ****
    else
      return SUBSTITUTE_PLACEHOLDER_IN_EXPR (DECL_FIELD_OFFSET (field), exp);
  }
- 
- /* Return 1 if T is an expression that get_inner_reference handles.  */
- 
- int
- handled_component_p (const_tree t)
- {
-   switch (TREE_CODE (t))
-     {
-     case BIT_FIELD_REF:
-     case COMPONENT_REF:
-     case ARRAY_REF:
-     case ARRAY_RANGE_REF:
-     case VIEW_CONVERT_EXPR:
-     case REALPART_EXPR:
-     case IMAGPART_EXPR:
-       return 1;
- 
-     default:
-       return 0;
-     }
- }
  
  /* Given an rtx VALUE that may contain additions and multiplications, return
     an equivalent value that just refers to a register, memory, or constant.
--- 5960,5965 ----
Index: trunk/gcc/tree-dfa.c
===================================================================
*** trunk.orig/gcc/tree-dfa.c	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/tree-dfa.c	2007-12-06 11:47:12.000000000 +0100
*************** find_new_referenced_vars (tree *stmt_p)
*** 860,1030 ****
    walk_tree (stmt_p, find_new_referenced_vars_1, NULL, NULL);
  }
  
! 
! /* If EXP is a handled component reference for a structure, return the
!    base variable.  The access range is delimited by bit positions *POFFSET and
!    *POFFSET + *PMAX_SIZE.  The access size is *PSIZE bits.  If either
!    *PSIZE or *PMAX_SIZE is -1, they could not be determined.  If *PSIZE
!    and *PMAX_SIZE are equal, the access is non-variable.  */
  
  tree
! get_ref_base_and_extent (tree exp, HOST_WIDE_INT *poffset,
! 			 HOST_WIDE_INT *psize,
! 			 HOST_WIDE_INT *pmax_size)
  {
!   HOST_WIDE_INT bitsize = -1;
!   HOST_WIDE_INT maxsize = -1;
!   tree size_tree = NULL_TREE;
!   HOST_WIDE_INT bit_offset = 0;
!   bool seen_variable_array_ref = false;
! 
!   gcc_assert (!SSA_VAR_P (exp));
! 
!   /* First get the final access size from just the outermost expression.  */
!   if (TREE_CODE (exp) == COMPONENT_REF)
!     size_tree = DECL_SIZE (TREE_OPERAND (exp, 1));
!   else if (TREE_CODE (exp) == BIT_FIELD_REF)
!     size_tree = TREE_OPERAND (exp, 1);
!   else
!     {
!       enum machine_mode mode = TYPE_MODE (TREE_TYPE (exp));
!       if (mode == BLKmode)
! 	size_tree = TYPE_SIZE (TREE_TYPE (exp));
!       else
! 	bitsize = GET_MODE_BITSIZE (mode);
!     }
!   if (size_tree != NULL_TREE)
!     {
!       if (! host_integerp (size_tree, 1))
! 	bitsize = -1;
!       else
! 	bitsize = TREE_INT_CST_LOW (size_tree);
!     }
  
!   /* Initially, maxsize is the same as the accessed element size.
!      In the following it will only grow (or become -1).  */
!   maxsize = bitsize;
! 
!   /* Compute cumulative bit-offset for nested component-refs and array-refs,
!      and find the ultimate containing object.  */
!   while (1)
!     {
!       switch (TREE_CODE (exp))
! 	{
! 	case BIT_FIELD_REF:
! 	  bit_offset += tree_low_cst (TREE_OPERAND (exp, 2), 0);
! 	  break;
! 
! 	case COMPONENT_REF:
! 	  {
! 	    tree field = TREE_OPERAND (exp, 1);
! 	    tree this_offset = component_ref_field_offset (exp);
! 
! 	    if (this_offset && TREE_CODE (this_offset) == INTEGER_CST)
! 	      {
! 		HOST_WIDE_INT hthis_offset = tree_low_cst (this_offset, 0);
! 
! 		hthis_offset *= BITS_PER_UNIT;
! 		bit_offset += hthis_offset;
! 		bit_offset += tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 0);
! 	      }
! 	    else
! 	      {
! 		tree csize = TYPE_SIZE (TREE_TYPE (TREE_OPERAND (exp, 0)));
! 		/* We need to adjust maxsize to the whole structure bitsize.
! 		   But we can subtract any constant offset seen sofar,
! 		   because that would get us out of the structure otherwise.  */
! 		if (maxsize != -1 && csize && host_integerp (csize, 1))
! 		  maxsize = TREE_INT_CST_LOW (csize) - bit_offset;
! 		else
! 		  maxsize = -1;
! 	      }
! 	  }
! 	  break;
! 
! 	case ARRAY_REF:
! 	case ARRAY_RANGE_REF:
! 	  {
! 	    tree index = TREE_OPERAND (exp, 1);
! 	    tree low_bound = array_ref_low_bound (exp);
! 	    tree unit_size = array_ref_element_size (exp);
! 
! 	    /* If the resulting bit-offset is constant, track it.  */
! 	    if (host_integerp (index, 0)
! 		&& host_integerp (low_bound, 0)
! 		&& host_integerp (unit_size, 1))
! 	      {
! 		HOST_WIDE_INT hindex = tree_low_cst (index, 0);
! 
! 		hindex -= tree_low_cst (low_bound, 0);
! 		hindex *= tree_low_cst (unit_size, 1);
! 		hindex *= BITS_PER_UNIT;
! 		bit_offset += hindex;
! 
! 		/* An array ref with a constant index up in the structure
! 		   hierarchy will constrain the size of any variable array ref
! 		   lower in the access hierarchy.  */
! 		seen_variable_array_ref = false;
! 	      }
! 	    else
! 	      {
! 		tree asize = TYPE_SIZE (TREE_TYPE (TREE_OPERAND (exp, 0)));
! 		/* We need to adjust maxsize to the whole array bitsize.
! 		   But we can subtract any constant offset seen sofar,
! 		   because that would get us outside of the array otherwise.  */
! 		if (maxsize != -1 && asize && host_integerp (asize, 1))
! 		  maxsize = TREE_INT_CST_LOW (asize) - bit_offset;
! 		else
! 		  maxsize = -1;
! 
! 		/* Remember that we have seen an array ref with a variable
! 		   index.  */
! 		seen_variable_array_ref = true;
! 	      }
! 	  }
! 	  break;
! 
! 	case REALPART_EXPR:
! 	  break;
! 
! 	case IMAGPART_EXPR:
! 	  bit_offset += bitsize;
! 	  break;
! 
! 	case VIEW_CONVERT_EXPR:
! 	  /* ???  We probably should give up here and bail out.  */
! 	  break;
! 
! 	default:
! 	  goto done;
! 	}
  
!       exp = TREE_OPERAND (exp, 0);
      }
-  done:
- 
-   /* We need to deal with variable arrays ending structures such as
-        struct { int length; int a[1]; } x;           x.a[d]
-        struct { struct { int a; int b; } a[1]; } x;  x.a[d].a
-        struct { struct { int a[1]; } a[1]; } x;      x.a[0][d], x.a[d][0]
-      where we do not know maxsize for variable index accesses to
-      the array.  The simplest way to conservatively deal with this
-      is to punt in the case that offset + maxsize reaches the
-      base type boundary.  */
-   if (seen_variable_array_ref
-       && maxsize != -1
-       && host_integerp (TYPE_SIZE (TREE_TYPE (exp)), 1)
-       && bit_offset + maxsize
- 	   == (signed)TREE_INT_CST_LOW (TYPE_SIZE (TREE_TYPE (exp))))
-     maxsize = -1;
- 
-   /* ???  Due to negative offsets in ARRAY_REF we can end up with
-      negative bit_offset here.  We might want to store a zero offset
-      in this case.  */
-   *poffset = bit_offset;
-   *psize = bitsize;
-   *pmax_size = maxsize;
  
!   return exp;
  }
  
--- 860,886 ----
    walk_tree (stmt_p, find_new_referenced_vars_1, NULL, NULL);
  }
  
! /* Return the single statement defining all virtual uses in VUSES or
!    NULL_TREE, if there are multiple defining statements.
!    ???  This should take a parameter that matches how virtual operands
! 	are kept at a statement level.  */
  
  tree
! get_single_def_stmt (VEC (tree, gc) *vuses)
  {
!   tree def_stmt, vuse;
!   unsigned int i;
  
!   gcc_assert (VEC_length (tree, vuses) >= 1);
  
!   def_stmt = SSA_NAME_DEF_STMT (VEC_index (tree, vuses, 0));
!   for (i = 1; VEC_iterate (tree, vuses, i, vuse); ++i)
!     {
!       tree tmp = SSA_NAME_DEF_STMT (vuse);
!       if (tmp != def_stmt)
! 	return NULL_TREE;
      }
  
!   return def_stmt;
  }
  
Index: trunk/gcc/tree-flow.h
===================================================================
*** trunk.orig/gcc/tree-flow.h	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/tree-flow.h	2007-12-06 11:47:12.000000000 +0100
*************** extern void find_new_referenced_vars (tr
*** 818,823 ****
--- 818,824 ----
  extern tree make_rename_temp (tree, const char *);
  extern void set_default_def (tree, tree);
  extern tree gimple_default_def (struct function *, tree);
+ extern tree get_single_def_stmt (VEC (tree, gc) *);
  
  /* In tree-phinodes.c  */
  extern void reserve_phi_args_for_new_edge (basic_block);
*************** static inline subvar_t get_subvars_for_v
*** 851,858 ****
  static inline tree get_subvar_at (tree, unsigned HOST_WIDE_INT);
  static inline bool ref_contains_array_ref (const_tree);
  static inline bool array_ref_contains_indirect_ref (const_tree);
- extern tree get_ref_base_and_extent (tree, HOST_WIDE_INT *,
- 				     HOST_WIDE_INT *, HOST_WIDE_INT *);
  static inline bool var_can_have_subvars (const_tree);
  static inline bool overlap_subvar (unsigned HOST_WIDE_INT,
  				   unsigned HOST_WIDE_INT,
--- 852,857 ----
Index: trunk/gcc/tree.c
===================================================================
*** trunk.orig/gcc/tree.c	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/tree.c	2007-12-06 11:47:12.000000000 +0100
*************** block_nonartificial_location (tree block
*** 8802,8805 ****
--- 8802,9191 ----
    return ret;
  }
  
+ 
+ /* Return true if T is an expression that get_inner_reference or
+    get_ref_base_and_extent handle.  */
+ 
+ bool
+ handled_component_p (const_tree t)
+ {
+   switch (TREE_CODE (t))
+     {
+     case BIT_FIELD_REF:
+     case COMPONENT_REF:
+     case ARRAY_REF:
+     case ARRAY_RANGE_REF:
+     case VIEW_CONVERT_EXPR:
+     case REALPART_EXPR:
+     case IMAGPART_EXPR:
+       return true;
+ 
+     default:
+       return false;
+     }
+ }
+ 
+ /* Given an expression EXP that may be a COMPONENT_REF, a BIT_FIELD_REF,
+    an ARRAY_REF, or an ARRAY_RANGE_REF, look for nested operations of these
+    codes and find the ultimate containing object, which we return.
+ 
+    We set *PBITSIZE to the size in bits that we want, *PBITPOS to the
+    bit position, and *PUNSIGNEDP to the signedness of the field.
+    If the position of the field is variable, we store a tree
+    giving the variable offset (in units) in *POFFSET.
+    This offset is in addition to the bit position.
+    If the position is not variable, we store 0 in *POFFSET.
+ 
+    If any of the extraction expressions is volatile,
+    we store 1 in *PVOLATILEP.  Otherwise we don't change that.
+ 
+    If the field is a bit-field, *PMODE is set to VOIDmode.  Otherwise, it
+    is a mode that can be used to access the field.  In that case, *PBITSIZE
+    is redundant.
+ 
+    If the field describes a variable-sized object, *PMODE is set to
+    VOIDmode and *PBITSIZE is set to -1.  An access cannot be made in
+    this case, but the address of the object can be found.
+ 
+    If KEEP_ALIGNING is true and the target is STRICT_ALIGNMENT, we don't
+    look through nodes that serve as markers of a greater alignment than
+    the one that can be deduced from the expression.  These nodes make it
+    possible for front-ends to prevent temporaries from being created by
+    the middle-end on alignment considerations.  For that purpose, the
+    normal operating mode at high-level is to always pass FALSE so that
+    the ultimate containing object is really returned; moreover, the
+    associated predicate handled_component_p will always return TRUE
+    on these nodes, thus indicating that they are essentially handled
+    by get_inner_reference.  TRUE should only be passed when the caller
+    is scanning the expression in order to build another representation
+    and specifically knows how to handle these nodes; as such, this is
+    the normal operating mode in the RTL expanders.  */
+ 
+ tree
+ get_inner_reference (tree exp, HOST_WIDE_INT *pbitsize,
+ 		     HOST_WIDE_INT *pbitpos, tree *poffset,
+ 		     enum machine_mode *pmode, int *punsignedp,
+ 		     int *pvolatilep, bool keep_aligning)
+ {
+   tree size_tree = 0;
+   enum machine_mode mode = VOIDmode;
+   tree offset = size_zero_node;
+   tree bit_offset = bitsize_zero_node;
+ 
+   /* First get the mode, signedness, and size.  We do this from just the
+      outermost expression.  */
+   if (TREE_CODE (exp) == COMPONENT_REF)
+     {
+       size_tree = DECL_SIZE (TREE_OPERAND (exp, 1));
+       if (! DECL_BIT_FIELD (TREE_OPERAND (exp, 1)))
+ 	mode = DECL_MODE (TREE_OPERAND (exp, 1));
+ 
+       *punsignedp = DECL_UNSIGNED (TREE_OPERAND (exp, 1));
+     }
+   else if (TREE_CODE (exp) == BIT_FIELD_REF)
+     {
+       size_tree = TREE_OPERAND (exp, 1);
+       *punsignedp = BIT_FIELD_REF_UNSIGNED (exp);
+ 
+       /* For vector types, with the correct size of access, use the mode of
+ 	 inner type.  */
+       if (TREE_CODE (TREE_TYPE (TREE_OPERAND (exp, 0))) == VECTOR_TYPE
+ 	  && TREE_TYPE (exp) == TREE_TYPE (TREE_TYPE (TREE_OPERAND (exp, 0)))
+ 	  && tree_int_cst_equal (size_tree, TYPE_SIZE (TREE_TYPE (exp))))
+         mode = TYPE_MODE (TREE_TYPE (exp));
+     }
+   else
+     {
+       mode = TYPE_MODE (TREE_TYPE (exp));
+       *punsignedp = TYPE_UNSIGNED (TREE_TYPE (exp));
+ 
+       if (mode == BLKmode)
+ 	size_tree = TYPE_SIZE (TREE_TYPE (exp));
+       else
+ 	*pbitsize = GET_MODE_BITSIZE (mode);
+     }
+ 
+   if (size_tree != 0)
+     {
+       if (! host_integerp (size_tree, 1))
+ 	mode = BLKmode, *pbitsize = -1;
+       else
+ 	*pbitsize = tree_low_cst (size_tree, 1);
+     }
+ 
+   *pmode = mode;
+ 
+   /* Compute cumulative bit-offset for nested component-refs and array-refs,
+      and find the ultimate containing object.  */
+   while (1)
+     {
+       switch (TREE_CODE (exp))
+ 	{
+ 	case BIT_FIELD_REF:
+ 	  bit_offset = size_binop (PLUS_EXPR, bit_offset,
+ 				   TREE_OPERAND (exp, 2));
+ 	  break;
+ 
+ 	case COMPONENT_REF:
+ 	  {
+ 	    tree field = TREE_OPERAND (exp, 1);
+ 	    tree this_offset = component_ref_field_offset (exp);
+ 
+ 	    /* If this field hasn't been filled in yet, don't go past it.
+ 	       This should only happen when folding expressions made during
+ 	       type construction.  */
+ 	    if (this_offset == 0)
+ 	      break;
+ 
+ 	    offset = size_binop (PLUS_EXPR, offset, this_offset);
+ 	    bit_offset = size_binop (PLUS_EXPR, bit_offset,
+ 				     DECL_FIELD_BIT_OFFSET (field));
+ 
+ 	    /* ??? Right now we don't do anything with DECL_OFFSET_ALIGN.  */
+ 	  }
+ 	  break;
+ 
+ 	case ARRAY_REF:
+ 	case ARRAY_RANGE_REF:
+ 	  {
+ 	    tree index = TREE_OPERAND (exp, 1);
+ 	    tree low_bound = array_ref_low_bound (exp);
+ 	    tree unit_size = array_ref_element_size (exp);
+ 
+ 	    /* We assume all arrays have sizes that are a multiple of a byte.
+ 	       First subtract the lower bound, if any, in the type of the
+ 	       index, then convert to sizetype and multiply by the size of
+ 	       the array element.  */
+ 	    if (! integer_zerop (low_bound))
+ 	      index = fold_build2 (MINUS_EXPR, TREE_TYPE (index),
+ 				   index, low_bound);
+ 
+ 	    offset = size_binop (PLUS_EXPR, offset,
+ 			         size_binop (MULT_EXPR,
+ 					     fold_convert (sizetype, index),
+ 					     unit_size));
+ 	  }
+ 	  break;
+ 
+ 	case REALPART_EXPR:
+ 	  break;
+ 
+ 	case IMAGPART_EXPR:
+ 	  bit_offset = size_binop (PLUS_EXPR, bit_offset,
+ 				   bitsize_int (*pbitsize));
+ 	  break;
+ 
+ 	case VIEW_CONVERT_EXPR:
+ 	  if (keep_aligning && STRICT_ALIGNMENT
+ 	      && (TYPE_ALIGN (TREE_TYPE (exp))
+ 	       > TYPE_ALIGN (TREE_TYPE (TREE_OPERAND (exp, 0))))
+ 	      && (TYPE_ALIGN (TREE_TYPE (TREE_OPERAND (exp, 0)))
+ 		  < BIGGEST_ALIGNMENT)
+ 	      && (TYPE_ALIGN_OK (TREE_TYPE (exp))
+ 		  || TYPE_ALIGN_OK (TREE_TYPE (TREE_OPERAND (exp, 0)))))
+ 	    goto done;
+ 	  break;
+ 
+ 	default:
+ 	  goto done;
+ 	}
+ 
+       /* If any reference in the chain is volatile, the effect is volatile.  */
+       if (TREE_THIS_VOLATILE (exp))
+ 	*pvolatilep = 1;
+ 
+       exp = TREE_OPERAND (exp, 0);
+     }
+  done:
+ 
+   /* If OFFSET is constant, see if we can return the whole thing as a
+      constant bit position.  Make sure to handle overflow during
+      this conversion.  */
+   if (host_integerp (offset, 0))
+     {
+       double_int tem = double_int_mul (tree_to_double_int (offset),
+ 				       uhwi_to_double_int (BITS_PER_UNIT));
+       tem = double_int_add (tem, tree_to_double_int (bit_offset));
+       if (double_int_fits_in_shwi_p (tem))
+ 	{
+ 	  *pbitpos = double_int_to_shwi (tem);
+ 	  *poffset = NULL_TREE;
+ 	  return exp;
+ 	}
+     }
+ 
+   /* Otherwise, split it up.  */
+   *pbitpos = tree_low_cst (bit_offset, 0);
+   *poffset = offset;
+ 
+   return exp;
+ }
+ 
+ /* If EXP is a handled component reference for a structure, return the
+    base variable.  The access range is delimited by bit positions *POFFSET and
+    *POFFSET + *PMAX_SIZE.  The access size is *PSIZE bits.  If either
+    *PSIZE or *PMAX_SIZE is -1, they could not be determined.  If *PSIZE
+    and *PMAX_SIZE are equal, the access is non-variable.  */
+ 
+ tree
+ get_ref_base_and_extent (tree exp, HOST_WIDE_INT *poffset,
+ 			 HOST_WIDE_INT *psize,
+ 			 HOST_WIDE_INT *pmax_size)
+ {
+   HOST_WIDE_INT bitsize = -1;
+   HOST_WIDE_INT maxsize = -1;
+   tree size_tree = NULL_TREE;
+   HOST_WIDE_INT bit_offset = 0;
+   bool seen_variable_array_ref = false;
+ 
+   gcc_assert (!SSA_VAR_P (exp));
+ 
+   /* First get the final access size from just the outermost expression.  */
+   if (TREE_CODE (exp) == COMPONENT_REF)
+     size_tree = DECL_SIZE (TREE_OPERAND (exp, 1));
+   else if (TREE_CODE (exp) == BIT_FIELD_REF)
+     size_tree = TREE_OPERAND (exp, 1);
+   else
+     {
+       enum machine_mode mode = TYPE_MODE (TREE_TYPE (exp));
+       if (mode == BLKmode)
+ 	size_tree = TYPE_SIZE (TREE_TYPE (exp));
+       else
+ 	bitsize = GET_MODE_BITSIZE (mode);
+     }
+   if (size_tree != NULL_TREE)
+     {
+       if (! host_integerp (size_tree, 1))
+ 	bitsize = -1;
+       else
+ 	bitsize = TREE_INT_CST_LOW (size_tree);
+     }
+ 
+   /* Initially, maxsize is the same as the accessed element size.
+      In the following it will only grow (or become -1).  */
+   maxsize = bitsize;
+ 
+   /* Compute cumulative bit-offset for nested component-refs and array-refs,
+      and find the ultimate containing object.  */
+   while (1)
+     {
+       switch (TREE_CODE (exp))
+ 	{
+ 	case BIT_FIELD_REF:
+ 	  bit_offset += tree_low_cst (TREE_OPERAND (exp, 2), 0);
+ 	  break;
+ 
+ 	case COMPONENT_REF:
+ 	  {
+ 	    tree field = TREE_OPERAND (exp, 1);
+ 	    tree this_offset = component_ref_field_offset (exp);
+ 
+ 	    if (this_offset && TREE_CODE (this_offset) == INTEGER_CST)
+ 	      {
+ 		HOST_WIDE_INT hthis_offset = tree_low_cst (this_offset, 0);
+ 
+ 		hthis_offset *= BITS_PER_UNIT;
+ 		bit_offset += hthis_offset;
+ 		bit_offset += tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 0);
+ 	      }
+ 	    else
+ 	      {
+ 		tree csize = TYPE_SIZE (TREE_TYPE (TREE_OPERAND (exp, 0)));
+ 		/* We need to adjust maxsize to the whole structure bitsize.
+ 		   But we can subtract any constant offset seen so far,
+ 		   because that would get us out of the structure otherwise.  */
+ 		if (maxsize != -1 && csize && host_integerp (csize, 1))
+ 		  maxsize = TREE_INT_CST_LOW (csize) - bit_offset;
+ 		else
+ 		  maxsize = -1;
+ 	      }
+ 	  }
+ 	  break;
+ 
+ 	case ARRAY_REF:
+ 	case ARRAY_RANGE_REF:
+ 	  {
+ 	    tree index = TREE_OPERAND (exp, 1);
+ 	    tree low_bound = array_ref_low_bound (exp);
+ 	    tree unit_size = array_ref_element_size (exp);
+ 
+ 	    /* If the resulting bit-offset is constant, track it.  */
+ 	    if (host_integerp (index, 0)
+ 		&& host_integerp (low_bound, 0)
+ 		&& host_integerp (unit_size, 1))
+ 	      {
+ 		HOST_WIDE_INT hindex = tree_low_cst (index, 0);
+ 
+ 		hindex -= tree_low_cst (low_bound, 0);
+ 		hindex *= tree_low_cst (unit_size, 1);
+ 		hindex *= BITS_PER_UNIT;
+ 		bit_offset += hindex;
+ 
+ 		/* An array ref with a constant index up in the structure
+ 		   hierarchy will constrain the size of any variable array ref
+ 		   lower in the access hierarchy.  */
+ 		seen_variable_array_ref = false;
+ 	      }
+ 	    else
+ 	      {
+ 		tree asize = TYPE_SIZE (TREE_TYPE (TREE_OPERAND (exp, 0)));
+ 		/* We need to adjust maxsize to the whole array bitsize.
+ 		   But we can subtract any constant offset seen so far,
+ 		   because that would get us outside of the array otherwise.  */
+ 		if (maxsize != -1 && asize && host_integerp (asize, 1))
+ 		  maxsize = TREE_INT_CST_LOW (asize) - bit_offset;
+ 		else
+ 		  maxsize = -1;
+ 
+ 		/* Remember that we have seen an array ref with a variable
+ 		   index.  */
+ 		seen_variable_array_ref = true;
+ 	      }
+ 	  }
+ 	  break;
+ 
+ 	case REALPART_EXPR:
+ 	  break;
+ 
+ 	case IMAGPART_EXPR:
+ 	  bit_offset += bitsize;
+ 	  break;
+ 
+ 	case VIEW_CONVERT_EXPR:
+ 	  /* ???  We probably should give up here and bail out.  */
+ 	  break;
+ 
+ 	default:
+ 	  goto done;
+ 	}
+ 
+       exp = TREE_OPERAND (exp, 0);
+     }
+  done:
+ 
+   /* We need to deal with variable arrays ending structures such as
+        struct { int length; int a[1]; } x;           x.a[d]
+        struct { struct { int a; int b; } a[1]; } x;  x.a[d].a
+        struct { struct { int a[1]; } a[1]; } x;      x.a[0][d], x.a[d][0]
+      where we do not know maxsize for variable index accesses to
+      the array.  The simplest way to conservatively deal with this
+      is to punt in the case that offset + maxsize reaches the
+      base type boundary.  */
+   if (seen_variable_array_ref
+       && maxsize != -1
+       && host_integerp (TYPE_SIZE (TREE_TYPE (exp)), 1)
+       && bit_offset + maxsize
+ 	   == (signed)TREE_INT_CST_LOW (TYPE_SIZE (TREE_TYPE (exp))))
+     maxsize = -1;
+ 
+   /* ???  Due to negative offsets in ARRAY_REF we can end up with
+      negative bit_offset here.  We might want to store a zero offset
+      in this case.  */
+   *poffset = bit_offset;
+   *psize = bitsize;
+   *pmax_size = maxsize;
+ 
+   return exp;
+ }
+ 
  #include "gt-tree.h"
Index: trunk/gcc/tree.h
===================================================================
*** trunk.orig/gcc/tree.h	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/tree.h	2007-12-06 11:47:12.000000000 +0100
*************** extern tree get_unwidened (tree, tree);
*** 4557,4580 ****
  
  extern tree get_narrower (tree, int *);
  
! /* Given an expression EXP that may be a COMPONENT_REF or an ARRAY_REF,
!    look for nested component-refs or array-refs at constant positions
!    and find the ultimate containing object, which is returned.  */
  
  extern tree get_inner_reference (tree, HOST_WIDE_INT *, HOST_WIDE_INT *,
  				 tree *, enum machine_mode *, int *, int *,
  				 bool);
  
  /* Given an expression EXP that may be a COMPONENT_REF or an ARRAY_REF,
     look for whether EXP or any nested component-refs within EXP is marked
     as PACKED.  */
  
  extern bool contains_packed_reference (const_tree exp);
  
- /* Return 1 if T is an expression that get_inner_reference handles.  */
- 
- extern int handled_component_p (const_tree);
- 
  /* Return a tree of sizetype representing the size, in bytes, of the element
     of EXP, an ARRAY_REF.  */
  
--- 4557,4587 ----
  
  extern tree get_narrower (tree, int *);
  
! /* Return true if T is an expression that get_inner_reference handles.  */
! 
! extern bool handled_component_p (const_tree);
! 
! /* Given an expression EXP that is a handled_component_p,
!    find the ultimate containing object, which is returned, and
!    determine the access position and size.  */
  
  extern tree get_inner_reference (tree, HOST_WIDE_INT *, HOST_WIDE_INT *,
  				 tree *, enum machine_mode *, int *, int *,
  				 bool);
  
+ /* Given an expression EXP that is a handled_component_p,
+    find the ultimate containing object, which is returned, and
+    determine a constant bound for the access position and size.  */
+ 
+ extern tree get_ref_base_and_extent (tree, HOST_WIDE_INT *,
+ 				     HOST_WIDE_INT *, HOST_WIDE_INT *);
+ 
  /* Given an expression EXP that may be a COMPONENT_REF or an ARRAY_REF,
     look for whether EXP or any nested component-refs within EXP is marked
     as PACKED.  */
  
  extern bool contains_packed_reference (const_tree exp);
  
  /* Return a tree of sizetype representing the size, in bytes, of the element
     of EXP, an ARRAY_REF.  */
  
*************** extern int get_pointer_alignment (tree, 
*** 4866,4878 ****
  /* In convert.c */
  extern tree strip_float_extensions (tree);
  
- /* In alias.c */
- extern void record_component_aliases (tree);
- extern alias_set_type get_alias_set (tree);
- extern int alias_sets_conflict_p (alias_set_type, alias_set_type);
- extern int alias_sets_must_conflict_p (alias_set_type, alias_set_type);
- extern int objects_must_conflict_p (tree, tree);
- 
  /* In tree.c */
  extern int really_constant_p (const_tree);
  extern int int_fits_type_p (const_tree, const_tree);
--- 4873,4878 ----
Index: trunk/gcc/alias.h
===================================================================
*** trunk.orig/gcc/alias.h	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/alias.h	2007-12-06 11:47:12.000000000 +0100
*************** along with GCC; see the file COPYING3.  
*** 20,33 ****
--- 20,40 ----
  #ifndef GCC_ALIAS_H
  #define GCC_ALIAS_H
  
+ #include "coretypes.h"
+ 
  /* The type of an alias set.  */
  typedef HOST_WIDE_INT alias_set_type;
  
  extern alias_set_type new_alias_set (void);
+ extern alias_set_type get_alias_set (tree);
  extern alias_set_type get_varargs_alias_set (void);
  extern alias_set_type get_frame_alias_set (void);
  extern bool component_uses_parent_alias_set (const_tree);
  extern bool alias_set_subset_of (alias_set_type, alias_set_type);
+ extern void record_component_aliases (tree);
+ extern int alias_sets_conflict_p (alias_set_type, alias_set_type);
+ extern int alias_sets_must_conflict_p (alias_set_type, alias_set_type);
+ extern int objects_must_conflict_p (tree, tree);
  
  /* This alias set can be used to force a memory to conflict with all
     other memories, creating a barrier across which no memory reference
Index: trunk/gcc/tree-ssa-alias.c
===================================================================
*** trunk.orig/gcc/tree-ssa-alias.c	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/tree-ssa-alias.c	2007-12-06 11:47:12.000000000 +0100
*************** may_alias_p (tree ptr, alias_set_type me
*** 2938,2943 ****
--- 2938,3007 ----
  }
  
  
+ /* Return true if the two memory references REF1 and REF2 may alias.  */
+ 
+ bool
+ refs_may_alias_p (tree ref1, tree ref2)
+ {
+   tree base1, base2;
+   HOST_WIDE_INT offset1, offset2;
+   HOST_WIDE_INT size1, size2;
+   HOST_WIDE_INT max_size1, max_size2;
+ 
+   /* Defer to TBAA if possible.  */
+   if (flag_strict_aliasing
+       && !alias_sets_conflict_p (get_alias_set (ref1), get_alias_set (ref2)))
+     return false;
+ 
+   /* Decompose the references into their base objects and the access.  */
+   base1 = ref1;
+   if (handled_component_p (ref1))
+     base1 = get_ref_base_and_extent (ref1, &offset1, &size1, &max_size1);
+   base2 = ref2;
+   if (handled_component_p (ref2))
+     base2 = get_ref_base_and_extent (ref2, &offset2, &size2, &max_size2);
+ 
+   /* If both references are based on different variables, they cannot alias.
+      If both references are based on the same variable, they cannot alias
+      if the accesses do not overlap.  */
+   if (SSA_VAR_P (base1)
+       && SSA_VAR_P (base2)
+       && (!operand_equal_p (base1, base2, 0)
+ 	  || !ranges_overlap_p (offset1, max_size1, offset2, max_size2)))
+     return false;
+ 
+   /* If both references are through pointers and both pointers are equal
+      then they do not alias if the accesses do not overlap.  */
+   if (TREE_CODE (base1) == INDIRECT_REF
+       && TREE_CODE (base2) == INDIRECT_REF
+       && operand_equal_p (TREE_OPERAND (base1, 0),
+ 			  TREE_OPERAND (base2, 0), 0)
+       && !ranges_overlap_p (offset1, max_size1, offset2, max_size2))
+     return false;
+ 
+   return true;
+ }
+ 
+ /* Return true if the two memory references REF1 and REF2 must alias.  */
+ 
+ bool
+ refs_must_alias_p (tree ref1, tree ref2)
+ {
+   return operand_equal_p (ref1, ref2, 0);
+ }
+ 
+ /* Return true if the memory reference REF is killed by the memory
+    definition DEF.  */
+ 
+ bool
+ ref_killed_by_def_p (tree ref, tree def)
+ {
+   /* While the kill relationship is non-symmetric, the must-alias
+      relationship is a conservative approximation to it.  */
+   return refs_must_alias_p (ref, def);
+ }
+ 
+ 
  /* Add ALIAS to the set of variables that may alias VAR.  */
  
  static void
Index: trunk/gcc/tree-ssa-structalias.h
===================================================================
*** trunk.orig/gcc/tree-ssa-structalias.h	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/tree-ssa-structalias.h	2007-12-06 11:47:12.000000000 +0100
*************** struct alias_info
*** 65,70 ****
--- 65,73 ----
  /* In tree-ssa-alias.c.  */
  enum escape_type is_escape_site (tree);
  void update_mem_sym_stats_from_stmt (tree, tree, long, long);
+ bool refs_may_alias_p (tree, tree);
+ bool refs_must_alias_p (tree, tree);
+ bool ref_killed_by_def_p (tree, tree);
  
  /* In tree-ssa-structalias.c.  */
  extern void compute_points_to_sets (struct alias_info *);
Index: trunk/gcc/tree-ssa-dse.c
===================================================================
*** trunk.orig/gcc/tree-ssa-dse.c	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/tree-ssa-dse.c	2007-12-06 11:47:12.000000000 +0100
*************** along with GCC; see the file COPYING3.  
*** 33,38 ****
--- 33,39 ----
  #include "tree-dump.h"
  #include "domwalk.h"
  #include "flags.h"
+ #include "tree-ssa-structalias.h"
  
  /* This file implements dead store elimination.
  
*************** get_kill_of_stmt_lhs (tree stmt,
*** 259,265 ****
        /* If the use stmts lhs matches the original lhs we have
  	 found the kill, otherwise continue walking.  */
        use_lhs = GIMPLE_STMT_OPERAND (stmt, 0);
!       if (operand_equal_p (use_lhs, lhs, 0))
  	{
  	  *use_stmt = stmt;
  	  return true;
--- 260,266 ----
        /* If the use stmts lhs matches the original lhs we have
  	 found the kill, otherwise continue walking.  */
        use_lhs = GIMPLE_STMT_OPERAND (stmt, 0);
!       if (ref_killed_by_def_p (lhs, use_lhs))
  	{
  	  *use_stmt = stmt;
  	  return true;
Index: trunk/gcc/Makefile.in
===================================================================
*** trunk.orig/gcc/Makefile.in	2007-12-06 11:10:45.000000000 +0100
--- trunk/gcc/Makefile.in	2007-12-06 11:47:12.000000000 +0100
*************** GCOV_IO_H = gcov-io.h gcov-iov.h auto-ho
*** 786,792 ****
  COVERAGE_H = coverage.h $(GCOV_IO_H)
  DEMANGLE_H = $(srcdir)/../include/demangle.h
  RECOG_H = recog.h
! ALIAS_H = alias.h
  EMIT_RTL_H = emit-rtl.h
  FLAGS_H = flags.h options.h
  FUNCTION_H = function.h $(TREE_H) $(HASHTAB_H)
--- 786,792 ----
  COVERAGE_H = coverage.h $(GCOV_IO_H)
  DEMANGLE_H = $(srcdir)/../include/demangle.h
  RECOG_H = recog.h
! ALIAS_H = alias.h coretypes.h
  EMIT_RTL_H = emit-rtl.h
  FLAGS_H = flags.h options.h
  FUNCTION_H = function.h $(TREE_H) $(HASHTAB_H)
*************** tree-outof-ssa.o : tree-outof-ssa.c $(TR
*** 2008,2014 ****
  tree-ssa-dse.o : tree-ssa-dse.c $(CONFIG_H) $(SYSTEM_H) coretypes.h \
     $(TM_H) $(GGC_H) $(TREE_H) $(RTL_H) $(TM_P_H) $(BASIC_BLOCK_H) \
     $(TREE_FLOW_H) tree-pass.h $(TREE_DUMP_H) domwalk.h $(FLAGS_H) \
!    $(DIAGNOSTIC_H) $(TIMEVAR_H)
  tree-ssa-forwprop.o : tree-ssa-forwprop.c $(CONFIG_H) $(SYSTEM_H) coretypes.h \
     $(TM_H) $(GGC_H) $(TREE_H) $(RTL_H) $(TM_P_H) $(BASIC_BLOCK_H) \
     $(TREE_FLOW_H) tree-pass.h $(TREE_DUMP_H) $(DIAGNOSTIC_H) $(TIMEVAR_H) \
--- 2008,2014 ----
  tree-ssa-dse.o : tree-ssa-dse.c $(CONFIG_H) $(SYSTEM_H) coretypes.h \
     $(TM_H) $(GGC_H) $(TREE_H) $(RTL_H) $(TM_P_H) $(BASIC_BLOCK_H) \
     $(TREE_FLOW_H) tree-pass.h $(TREE_DUMP_H) domwalk.h $(FLAGS_H) \
!    $(DIAGNOSTIC_H) $(TIMEVAR_H) tree-ssa-structalias.h
  tree-ssa-forwprop.o : tree-ssa-forwprop.c $(CONFIG_H) $(SYSTEM_H) coretypes.h \
     $(TM_H) $(GGC_H) $(TREE_H) $(RTL_H) $(TM_P_H) $(BASIC_BLOCK_H) \
     $(TREE_FLOW_H) tree-pass.h $(TREE_DUMP_H) $(DIAGNOSTIC_H) $(TIMEVAR_H) \
*************** tree-ssa-sccvn.o : tree-ssa-sccvn.c $(TR
*** 2075,2081 ****
     $(SYSTEM_H) $(TREE_H) $(GGC_H) $(DIAGNOSTIC_H) $(TIMEVAR_H) \
     $(TM_H) coretypes.h $(TREE_DUMP_H) tree-pass.h $(FLAGS_H) $(CFGLOOP_H) \
     alloc-pool.h $(BASIC_BLOCK_H) bitmap.h $(HASHTAB_H) $(TREE_GIMPLE_H) \
!    $(TREE_INLINE_H) tree-iterator.h tree-ssa-propagate.h tree-ssa-sccvn.h
  tree-vn.o : tree-vn.c $(CONFIG_H) $(SYSTEM_H) coretypes.h $(TM_H) $(GGC_H) \
     $(TREE_H) $(TREE_FLOW_H) $(HASHTAB_H) langhooks.h tree-pass.h \
     $(TREE_DUMP_H) $(DIAGNOSTIC_H) tree-ssa-sccvn.h
--- 2075,2082 ----
     $(SYSTEM_H) $(TREE_H) $(GGC_H) $(DIAGNOSTIC_H) $(TIMEVAR_H) \
     $(TM_H) coretypes.h $(TREE_DUMP_H) tree-pass.h $(FLAGS_H) $(CFGLOOP_H) \
     alloc-pool.h $(BASIC_BLOCK_H) bitmap.h $(HASHTAB_H) $(TREE_GIMPLE_H) \
!    $(TREE_INLINE_H) tree-iterator.h tree-ssa-propagate.h tree-ssa-sccvn.h \
!    tree-ssa-structalias.h
  tree-vn.o : tree-vn.c $(CONFIG_H) $(SYSTEM_H) coretypes.h $(TM_H) $(GGC_H) \
     $(TREE_H) $(TREE_FLOW_H) $(HASHTAB_H) langhooks.h tree-pass.h \
     $(TREE_DUMP_H) $(DIAGNOSTIC_H) tree-ssa-sccvn.h

