[PATCH][RFC] Unify constant folding

Richard Guenther rguenther@suse.de
Thu Mar 17 15:30:00 GMT 2011


This tries to move us towards a single constant-folding machinery
on gimple, usable by the various value-numbering passes we have
(CCP, VRP, SCCVN and eventually DOM, not yet in this patch).  At
least VRP and SCCVN should in theory perform the same constant
propagations as CCP does, but at the moment they do not.  This is
especially awkward because there is a VRP pass after the loop
optimizations but no corresponding CCP pass.

Thus, this patch makes VRP and SCCVN dispatch to the constant-folding
implementation from CCP, which is the most complete one.  For this,
valueization needs to be implementation-dependent and is therefore
done through a callback function.

Starting from ccp_fold, which is now gimple_fold_stmt_to_constant
(I should probably move it to gimple-fold.c), all dependent functions
also take the valueization callback.  To avoid the overhead of
indirect function calls I have started to place the actual
implementations in header files as static inlines, on the theory
that each value-numbering implementation uses a single kind of
valueization, so IPA-CP should de-virtualize them all (making it
a C-like template case).

That's the part I'd like to get feedback on - the patch doesn't
yet move fold_const_aggregate_ref_1 or gimple_fold_stmt_to_constant_1
to a header file.  Is this too ugly, or do we just want to pay
the price of some indirect function calls?
get_addr_base_and_unit_offset_1 is used more widely throughout the
compiler, so I think that case is definitely worth the few
extra copies.

I tried re-bootstrapping and testing the variant below, but
bootstrap is currently broken.

Thanks,
Richard.

2011-03-17  Richard Guenther  <rguenther@suse.de>

	PR tree-optimization/46562
	* tree.c (build_invariant_address): New function.
	* tree.h (build_invariant_address): Declare.
	* tree-dfa.c (get_addr_base_and_unit_offset): Wrap around
	a renamed function moved ...
	* tree-flow-inline.h (get_addr_base_and_unit_offset_1): ... here.
	Take valueization callback parameter.
	* tree-flow.h (gimple_fold_stmt_to_constant): Declare.
	* tree-ssa-ccp.c (gimple_fold_stmt_to_constant_1): New function
	split out from ccp_fold.  Take a valueization callback parameter.
	Valueize all operands.
	(gimple_fold_stmt_to_constant): New wrapper function.
	(ccp_fold): Use gimple_fold_stmt_to_constant_1 for shared parts.
	(fold_const_aggregate_ref_1): New function split out from
	fold_const_aggregate_ref.  Take a valueization callback parameter.
	(fold_const_aggregate_ref): Wrap fold_const_aggregate_ref_1.
	* tree-ssa-sccvn.c (simplify_binary_expression): Simplify
	invariant POINTER_PLUS_EXPRs to invariant form.
	(vn_valueize): New function.
	(try_to_simplify): Simplify by using gimple_fold_stmt_to_constant.
	* tree-vrp.c (vrp_valueize): New function.
	(vrp_visit_assignment_or_call): Use gimple_fold_stmt_to_constant
	to fold statements to constants.
	* tree-ssa-pre.c (eliminate): Properly guard propagation of
	function declarations.

	* c-c++-common/pr46562-2.c: New testcase.
	* c-c++-common/pr46562.c: Likewise.

Index: gcc/tree.c
===================================================================
*** gcc/tree.c.orig	2011-03-17 15:56:58.000000000 +0100
--- gcc/tree.c	2011-03-17 16:12:23.000000000 +0100
*************** reference_alias_ptr_type (const_tree t)
*** 4013,4018 ****
--- 4013,4032 ----
      return build_pointer_type (TYPE_MAIN_VARIANT (TREE_TYPE (base)));
  }
  
+ /* Return an invariant ADDR_EXPR of type TYPE taking the address of BASE
+    offsetted by OFFSET units.  */
+ 
+ tree
+ build_invariant_address (tree type, tree base, HOST_WIDE_INT offset)
+ {
+   tree ref = fold_build2 (MEM_REF, TREE_TYPE (type),
+ 			  build_fold_addr_expr (base),
+ 			  build_int_cst (ptr_type_node, offset));
+   tree addr = build1 (ADDR_EXPR, type, ref);
+   recompute_tree_invariant_for_addr_expr (addr);
+   return addr;
+ }
+ 
  /* Similar except don't specify the TREE_TYPE
     and leave the TREE_SIDE_EFFECTS as 0.
     It is permissible for arguments to be null,
Index: gcc/tree.h
===================================================================
*** gcc/tree.h.orig	2011-03-17 15:56:58.000000000 +0100
--- gcc/tree.h	2011-03-17 16:12:23.000000000 +0100
*************** extern tree build_simple_mem_ref_loc (lo
*** 5094,5099 ****
--- 5094,5100 ----
  	build_simple_mem_ref_loc (UNKNOWN_LOCATION, T)
  extern double_int mem_ref_offset (const_tree);
  extern tree reference_alias_ptr_type (const_tree);
+ extern tree build_invariant_address (tree, tree, HOST_WIDE_INT);
  extern tree constant_boolean_node (int, tree);
  extern tree div_if_zero_remainder (enum tree_code, const_tree, const_tree);
  
Index: gcc/tree-flow-inline.h
===================================================================
*** gcc/tree-flow-inline.h.orig	2011-03-17 15:56:58.000000000 +0100
--- gcc/tree-flow-inline.h	2011-03-17 16:12:23.000000000 +0100
*************** make_ssa_name (tree var, gimple stmt)
*** 1227,1230 ****
--- 1227,1365 ----
    return make_ssa_name_fn (cfun, var, stmt);
  }
  
+ /* Returns the base object and a constant BITS_PER_UNIT offset in *POFFSET that
+    denotes the starting address of the memory access EXP.
+    Returns NULL_TREE if the offset is not constant or any component
+    is not BITS_PER_UNIT-aligned.
+    VALUEIZE if non-NULL is used to valueize SSA names.  It should return
+    its argument or a constant if the argument is known to be constant.  */
+ 
+ static inline tree
+ get_addr_base_and_unit_offset_1 (tree exp, HOST_WIDE_INT *poffset,
+ 				 tree (*valueize) (tree))
+ {
+   HOST_WIDE_INT byte_offset = 0;
+ 
+   /* Compute cumulative byte-offset for nested component-refs and array-refs,
+      and find the ultimate containing object.  */
+   while (1)
+     {
+       switch (TREE_CODE (exp))
+ 	{
+ 	case BIT_FIELD_REF:
+ 	  return NULL_TREE;
+ 
+ 	case COMPONENT_REF:
+ 	  {
+ 	    tree field = TREE_OPERAND (exp, 1);
+ 	    tree this_offset = component_ref_field_offset (exp);
+ 	    HOST_WIDE_INT hthis_offset;
+ 
+ 	    if (!this_offset
+ 		|| TREE_CODE (this_offset) != INTEGER_CST
+ 		|| (TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (field))
+ 		    % BITS_PER_UNIT))
+ 	      return NULL_TREE;
+ 
+ 	    hthis_offset = TREE_INT_CST_LOW (this_offset);
+ 	    hthis_offset += (TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (field))
+ 			     / BITS_PER_UNIT);
+ 	    byte_offset += hthis_offset;
+ 	  }
+ 	  break;
+ 
+ 	case ARRAY_REF:
+ 	case ARRAY_RANGE_REF:
+ 	  {
+ 	    tree index = TREE_OPERAND (exp, 1);
+ 	    tree low_bound, unit_size;
+ 
+ 	    if (valueize
+ 		&& TREE_CODE (index) == SSA_NAME)
+ 	      index = (*valueize) (index);
+ 
+ 	    /* If the resulting bit-offset is constant, track it.  */
+ 	    if (TREE_CODE (index) == INTEGER_CST
+ 		&& (low_bound = array_ref_low_bound (exp),
+ 		    TREE_CODE (low_bound) == INTEGER_CST)
+ 		&& (unit_size = array_ref_element_size (exp),
+ 		    TREE_CODE (unit_size) == INTEGER_CST))
+ 	      {
+ 		HOST_WIDE_INT hindex = TREE_INT_CST_LOW (index);
+ 
+ 		hindex -= TREE_INT_CST_LOW (low_bound);
+ 		hindex *= TREE_INT_CST_LOW (unit_size);
+ 		byte_offset += hindex;
+ 	      }
+ 	    else
+ 	      return NULL_TREE;
+ 	  }
+ 	  break;
+ 
+ 	case REALPART_EXPR:
+ 	  break;
+ 
+ 	case IMAGPART_EXPR:
+ 	  byte_offset += TREE_INT_CST_LOW (TYPE_SIZE_UNIT (TREE_TYPE (exp)));
+ 	  break;
+ 
+ 	case VIEW_CONVERT_EXPR:
+ 	  break;
+ 
+ 	case MEM_REF:
+ 	  {
+ 	    tree base = TREE_OPERAND (exp, 0);
+ 	    if (valueize
+ 		&& TREE_CODE (base) == SSA_NAME)
+ 	      base = (*valueize) (base);
+ 
+ 	    /* Hand back the decl for MEM[&decl, off].  */
+ 	    if (TREE_CODE (base) == ADDR_EXPR)
+ 	      {
+ 		if (!integer_zerop (TREE_OPERAND (exp, 1)))
+ 		  {
+ 		    double_int off = mem_ref_offset (exp);
+ 		    gcc_assert (off.high == -1 || off.high == 0);
+ 		    byte_offset += double_int_to_shwi (off);
+ 		  }
+ 		exp = TREE_OPERAND (base, 0);
+ 	      }
+ 	    goto done;
+ 	  }
+ 
+ 	case TARGET_MEM_REF:
+ 	  {
+ 	    tree base = TREE_OPERAND (exp, 0);
+ 	    if (valueize
+ 		&& TREE_CODE (base) == SSA_NAME)
+ 	      base = (*valueize) (base);
+ 
+ 	    /* Hand back the decl for MEM[&decl, off].  */
+ 	    if (TREE_CODE (base) == ADDR_EXPR)
+ 	      {
+ 		if (TMR_INDEX (exp) || TMR_INDEX2 (exp))
+ 		  return NULL_TREE;
+ 		if (!integer_zerop (TMR_OFFSET (exp)))
+ 		  {
+ 		    double_int off = mem_ref_offset (exp);
+ 		    gcc_assert (off.high == -1 || off.high == 0);
+ 		    byte_offset += double_int_to_shwi (off);
+ 		  }
+ 		exp = TREE_OPERAND (base, 0);
+ 	      }
+ 	    goto done;
+ 	  }
+ 
+ 	default:
+ 	  goto done;
+ 	}
+ 
+       exp = TREE_OPERAND (exp, 0);
+     }
+ done:
+ 
+   *poffset = byte_offset;
+   return exp;
+ }
+ 
  #endif /* _TREE_FLOW_INLINE_H  */
Index: gcc/tree-dfa.c
===================================================================
*** gcc/tree-dfa.c.orig	2011-03-17 15:56:58.000000000 +0100
--- gcc/tree-dfa.c	2011-03-17 16:12:23.000000000 +0100
*************** get_ref_base_and_extent (tree exp, HOST_
*** 963,1072 ****
  tree
  get_addr_base_and_unit_offset (tree exp, HOST_WIDE_INT *poffset)
  {
!   HOST_WIDE_INT byte_offset = 0;
! 
!   /* Compute cumulative byte-offset for nested component-refs and array-refs,
!      and find the ultimate containing object.  */
!   while (1)
!     {
!       switch (TREE_CODE (exp))
! 	{
! 	case BIT_FIELD_REF:
! 	  return NULL_TREE;
! 
! 	case COMPONENT_REF:
! 	  {
! 	    tree field = TREE_OPERAND (exp, 1);
! 	    tree this_offset = component_ref_field_offset (exp);
! 	    HOST_WIDE_INT hthis_offset;
! 
! 	    if (!this_offset
! 		|| TREE_CODE (this_offset) != INTEGER_CST
! 		|| (TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (field))
! 		    % BITS_PER_UNIT))
! 	      return NULL_TREE;
! 
! 	    hthis_offset = TREE_INT_CST_LOW (this_offset);
! 	    hthis_offset += (TREE_INT_CST_LOW (DECL_FIELD_BIT_OFFSET (field))
! 			     / BITS_PER_UNIT);
! 	    byte_offset += hthis_offset;
! 	  }
! 	  break;
! 
! 	case ARRAY_REF:
! 	case ARRAY_RANGE_REF:
! 	  {
! 	    tree index = TREE_OPERAND (exp, 1);
! 	    tree low_bound, unit_size;
! 
! 	    /* If the resulting bit-offset is constant, track it.  */
! 	    if (TREE_CODE (index) == INTEGER_CST
! 		&& (low_bound = array_ref_low_bound (exp),
! 		    TREE_CODE (low_bound) == INTEGER_CST)
! 		&& (unit_size = array_ref_element_size (exp),
! 		    TREE_CODE (unit_size) == INTEGER_CST))
! 	      {
! 		HOST_WIDE_INT hindex = TREE_INT_CST_LOW (index);
! 
! 		hindex -= TREE_INT_CST_LOW (low_bound);
! 		hindex *= TREE_INT_CST_LOW (unit_size);
! 		byte_offset += hindex;
! 	      }
! 	    else
! 	      return NULL_TREE;
! 	  }
! 	  break;
! 
! 	case REALPART_EXPR:
! 	  break;
! 
! 	case IMAGPART_EXPR:
! 	  byte_offset += TREE_INT_CST_LOW (TYPE_SIZE_UNIT (TREE_TYPE (exp)));
! 	  break;
! 
! 	case VIEW_CONVERT_EXPR:
! 	  break;
! 
! 	case MEM_REF:
! 	  /* Hand back the decl for MEM[&decl, off].  */
! 	  if (TREE_CODE (TREE_OPERAND (exp, 0)) == ADDR_EXPR)
! 	    {
! 	      if (!integer_zerop (TREE_OPERAND (exp, 1)))
! 		{
! 		  double_int off = mem_ref_offset (exp);
! 		  gcc_assert (off.high == -1 || off.high == 0);
! 		  byte_offset += double_int_to_shwi (off);
! 		}
! 	      exp = TREE_OPERAND (TREE_OPERAND (exp, 0), 0);
! 	    }
! 	  goto done;
! 
! 	case TARGET_MEM_REF:
! 	  /* Hand back the decl for MEM[&decl, off].  */
! 	  if (TREE_CODE (TMR_BASE (exp)) == ADDR_EXPR)
! 	    {
! 	      if (TMR_INDEX (exp) || TMR_INDEX2 (exp))
! 		return NULL_TREE;
! 	      if (!integer_zerop (TMR_OFFSET (exp)))
! 		{
! 		  double_int off = mem_ref_offset (exp);
! 		  gcc_assert (off.high == -1 || off.high == 0);
! 		  byte_offset += double_int_to_shwi (off);
! 		}
! 	      exp = TREE_OPERAND (TMR_BASE (exp), 0);
! 	    }
! 	  goto done;
! 
! 	default:
! 	  goto done;
! 	}
! 
!       exp = TREE_OPERAND (exp, 0);
!     }
! done:
! 
!   *poffset = byte_offset;
!   return exp;
  }
  
  /* Returns true if STMT references an SSA_NAME that has
--- 963,969 ----
  tree
  get_addr_base_and_unit_offset (tree exp, HOST_WIDE_INT *poffset)
  {
!   return get_addr_base_and_unit_offset_1 (exp, poffset, NULL);
  }
  
  /* Returns true if STMT references an SSA_NAME that has
Index: gcc/tree-ssa-ccp.c
===================================================================
*** gcc/tree-ssa-ccp.c.orig	2011-03-17 15:56:58.000000000 +0100
--- gcc/tree-ssa-ccp.c	2011-03-17 16:12:23.000000000 +0100
*************** static bool ccp_fold_stmt (gimple_stmt_i
*** 170,175 ****
--- 170,176 ----
  static tree fold_ctor_reference (tree type, tree ctor,
  				 unsigned HOST_WIDE_INT offset,
  				 unsigned HOST_WIDE_INT size);
+ static tree fold_const_aggregate_ref_1 (tree, tree (*) (tree));
  
  /* Dump constant propagation value VAL to file OUTF prefixed by PREFIX.  */
  
*************** valueize_op (tree op)
*** 1083,1099 ****
    return op;
  }
  
! /* CCP specific front-end to the non-destructive constant folding
!    routines.
  
!    Attempt to simplify the RHS of STMT knowing that one or more
!    operands are constants.
  
!    If simplification is possible, return the simplified RHS,
!    otherwise return the original RHS or NULL_TREE.  */
  
  static tree
! ccp_fold (gimple stmt)
  {
    location_t loc = gimple_location (stmt);
    switch (gimple_code (stmt))
--- 1084,1100 ----
    return op;
  }
  
! /* Fold STMT to a constant using VALUEIZE to valueize SSA names.
  
!    Either NULL_TREE, a simplified but non-constant or a constant
!    is returned.
  
!    ???  This should go into a tree-ssa-ccp.h file to be eventually
!    privatized with the single valueize function used in the various TUs
!    to avoid the indirect function call overhead.  */
  
  static tree
! gimple_fold_stmt_to_constant_1 (gimple stmt, tree (*valueize) (tree))
  {
    location_t loc = gimple_location (stmt);
    switch (gimple_code (stmt))
*************** ccp_fold (gimple stmt)
*** 1113,1150 ****
                  {
                    /* If the RHS is an SSA_NAME, return its known constant value,
                       if any.  */
!                   return get_constant_value (rhs);
                  }
! 	      /* Handle propagating invariant addresses into address operations.
! 		 The folding we do here matches that in tree-ssa-forwprop.c.  */
! 	      else if (TREE_CODE (rhs) == ADDR_EXPR)
  		{
! 		  tree *base;
! 		  base = &TREE_OPERAND (rhs, 0);
! 		  while (handled_component_p (*base))
! 		    base = &TREE_OPERAND (*base, 0);
! 		  if (TREE_CODE (*base) == MEM_REF
! 		      && TREE_CODE (TREE_OPERAND (*base, 0)) == SSA_NAME)
! 		    {
! 		      tree val = get_constant_value (TREE_OPERAND (*base, 0));
! 		      if (val
! 			  && TREE_CODE (val) == ADDR_EXPR)
! 			{
! 			  tree ret, save = *base;
! 			  tree new_base;
! 			  new_base = fold_build2 (MEM_REF, TREE_TYPE (*base),
! 						  unshare_expr (val),
! 						  TREE_OPERAND (*base, 1));
! 			  /* We need to return a new tree, not modify the IL
! 			     or share parts of it.  So play some tricks to
! 			     avoid manually building it.  */
! 			  *base = new_base;
! 			  ret = unshare_expr (rhs);
! 			  recompute_tree_invariant_for_addr_expr (ret);
! 			  *base = save;
! 			  return ret;
! 			}
! 		    }
  		}
  	      else if (TREE_CODE (rhs) == CONSTRUCTOR
  		       && TREE_CODE (TREE_TYPE (rhs)) == VECTOR_TYPE
--- 1114,1136 ----
                  {
                    /* If the RHS is an SSA_NAME, return its known constant value,
                       if any.  */
!                   return (*valueize) (rhs);
                  }
! 	      /* Handle propagating invariant addresses into address
! 		 operations.  */
! 	      else if (TREE_CODE (rhs) == ADDR_EXPR
! 		       && !is_gimple_min_invariant (rhs))
  		{
! 		  HOST_WIDE_INT offset;
! 		  tree base;
! 		  base = get_addr_base_and_unit_offset_1 (TREE_OPERAND (rhs, 0),
! 							  &offset,
! 							  valueize);
! 		  if (base
! 		      && (CONSTANT_CLASS_P (base)
! 			  || decl_address_invariant_p (base)))
! 		    return build_invariant_address (TREE_TYPE (rhs),
! 						    base, offset);
  		}
  	      else if (TREE_CODE (rhs) == CONSTRUCTOR
  		       && TREE_CODE (TREE_TYPE (rhs)) == VECTOR_TYPE
*************** ccp_fold (gimple stmt)
*** 1157,1163 ****
  		  list = NULL_TREE;
  		  FOR_EACH_CONSTRUCTOR_VALUE (CONSTRUCTOR_ELTS (rhs), i, val)
  		    {
! 		      val = valueize_op (val);
  		      if (TREE_CODE (val) == INTEGER_CST
  			  || TREE_CODE (val) == REAL_CST
  			  || TREE_CODE (val) == FIXED_CST)
--- 1143,1149 ----
  		  list = NULL_TREE;
  		  FOR_EACH_CONSTRUCTOR_VALUE (CONSTRUCTOR_ELTS (rhs), i, val)
  		    {
! 		      val = (*valueize) (val);
  		      if (TREE_CODE (val) == INTEGER_CST
  			  || TREE_CODE (val) == REAL_CST
  			  || TREE_CODE (val) == FIXED_CST)
*************** ccp_fold (gimple stmt)
*** 1176,1204 ****
  		       || TREE_CODE (rhs) == IMAGPART_EXPR)
  		      && TREE_CODE (TREE_OPERAND (rhs, 0)) == SSA_NAME)
  		    {
! 		      tree val = get_constant_value (TREE_OPERAND (rhs, 0));
! 		      if (val)
! 			return fold_unary_loc (EXPR_LOCATION (rhs),
! 					       TREE_CODE (rhs),
! 					       TREE_TYPE (rhs), val);
  		    }
  		  else if (TREE_CODE (rhs) == BIT_FIELD_REF
  			   && TREE_CODE (TREE_OPERAND (rhs, 0)) == SSA_NAME)
  		    {
! 		      tree val = get_constant_value (TREE_OPERAND (rhs, 0));
! 		      if (val)
! 			return fold_ternary_loc (EXPR_LOCATION (rhs),
! 						 TREE_CODE (rhs),
! 						 TREE_TYPE (rhs), val,
! 						 TREE_OPERAND (rhs, 1),
! 						 TREE_OPERAND (rhs, 2));
  		    }
  		  else if (TREE_CODE (rhs) == MEM_REF
  			   && TREE_CODE (TREE_OPERAND (rhs, 0)) == SSA_NAME)
  		    {
! 		      tree val = get_constant_value (TREE_OPERAND (rhs, 0));
! 		      if (val
! 			  && TREE_CODE (val) == ADDR_EXPR)
  			{
  			  tree tem = fold_build2 (MEM_REF, TREE_TYPE (rhs),
  						  unshare_expr (val),
--- 1162,1188 ----
  		       || TREE_CODE (rhs) == IMAGPART_EXPR)
  		      && TREE_CODE (TREE_OPERAND (rhs, 0)) == SSA_NAME)
  		    {
! 		      tree val = (*valueize) (TREE_OPERAND (rhs, 0));
! 		      return fold_unary_loc (EXPR_LOCATION (rhs),
! 					     TREE_CODE (rhs),
! 					     TREE_TYPE (rhs), val);
  		    }
  		  else if (TREE_CODE (rhs) == BIT_FIELD_REF
  			   && TREE_CODE (TREE_OPERAND (rhs, 0)) == SSA_NAME)
  		    {
! 		      tree val = (*valueize) (TREE_OPERAND (rhs, 0));
! 		      return fold_ternary_loc (EXPR_LOCATION (rhs),
! 					       TREE_CODE (rhs),
! 					       TREE_TYPE (rhs), val,
! 					       TREE_OPERAND (rhs, 1),
! 					       TREE_OPERAND (rhs, 2));
  		    }
  		  else if (TREE_CODE (rhs) == MEM_REF
  			   && TREE_CODE (TREE_OPERAND (rhs, 0)) == SSA_NAME)
  		    {
! 		      tree val = (*valueize) (TREE_OPERAND (rhs, 0));
! 		      if (TREE_CODE (val) == ADDR_EXPR
! 			  && is_gimple_min_invariant (val))
  			{
  			  tree tem = fold_build2 (MEM_REF, TREE_TYPE (rhs),
  						  unshare_expr (val),
*************** ccp_fold (gimple stmt)
*** 1207,1213 ****
  			    rhs = tem;
  			}
  		    }
! 		  return fold_const_aggregate_ref (rhs);
  		}
                else if (kind == tcc_declaration)
                  return get_symbol_constant_value (rhs);
--- 1191,1197 ----
  			    rhs = tem;
  			}
  		    }
! 		  return fold_const_aggregate_ref_1 (rhs, valueize);
  		}
                else if (kind == tcc_declaration)
                  return get_symbol_constant_value (rhs);
*************** ccp_fold (gimple stmt)
*** 1220,1226 ****
                   Note that we know the single operand must be a constant,
                   so this should almost always return a simplified RHS.  */
                tree lhs = gimple_assign_lhs (stmt);
!               tree op0 = valueize_op (gimple_assign_rhs1 (stmt));
  
  	      /* Conversions are useless for CCP purposes if they are
  		 value-preserving.  Thus the restrictions that
--- 1204,1210 ----
                   Note that we know the single operand must be a constant,
                   so this should almost always return a simplified RHS.  */
                tree lhs = gimple_assign_lhs (stmt);
!               tree op0 = (*valueize) (gimple_assign_rhs1 (stmt));
  
  	      /* Conversions are useless for CCP purposes if they are
  		 value-preserving.  Thus the restrictions that
*************** ccp_fold (gimple stmt)
*** 1251,1258 ****
            case GIMPLE_BINARY_RHS:
              {
                /* Handle binary operators that can appear in GIMPLE form.  */
!               tree op0 = valueize_op (gimple_assign_rhs1 (stmt));
!               tree op1 = valueize_op (gimple_assign_rhs2 (stmt));
  
  	      /* Translate &x + CST into an invariant form suitable for
  	         further propagation.  */
--- 1235,1242 ----
            case GIMPLE_BINARY_RHS:
              {
                /* Handle binary operators that can appear in GIMPLE form.  */
!               tree op0 = (*valueize) (gimple_assign_rhs1 (stmt));
!               tree op1 = (*valueize) (gimple_assign_rhs2 (stmt));
  
  	      /* Translate &x + CST into an invariant form suitable for
  	         further propagation.  */
*************** ccp_fold (gimple stmt)
*** 1274,1282 ****
            case GIMPLE_TERNARY_RHS:
              {
                /* Handle ternary operators that can appear in GIMPLE form.  */
!               tree op0 = valueize_op (gimple_assign_rhs1 (stmt));
!               tree op1 = valueize_op (gimple_assign_rhs2 (stmt));
!               tree op2 = valueize_op (gimple_assign_rhs3 (stmt));
  
                return fold_ternary_loc (loc, subcode,
  				       gimple_expr_type (stmt), op0, op1, op2);
--- 1258,1266 ----
            case GIMPLE_TERNARY_RHS:
              {
                /* Handle ternary operators that can appear in GIMPLE form.  */
!               tree op0 = (*valueize) (gimple_assign_rhs1 (stmt));
!               tree op1 = (*valueize) (gimple_assign_rhs2 (stmt));
!               tree op2 = (*valueize) (gimple_assign_rhs3 (stmt));
  
                return fold_ternary_loc (loc, subcode,
  				       gimple_expr_type (stmt), op0, op1, op2);
*************** ccp_fold (gimple stmt)
*** 1286,1296 ****
              gcc_unreachable ();
            }
        }
-       break;
  
      case GIMPLE_CALL:
        {
! 	tree fn = valueize_op (gimple_call_fn (stmt));
  	if (TREE_CODE (fn) == ADDR_EXPR
  	    && TREE_CODE (TREE_OPERAND (fn, 0)) == FUNCTION_DECL
  	    && DECL_BUILT_IN (TREE_OPERAND (fn, 0)))
--- 1270,1279 ----
              gcc_unreachable ();
            }
        }
  
      case GIMPLE_CALL:
        {
! 	tree fn = (*valueize) (gimple_call_fn (stmt));
  	if (TREE_CODE (fn) == ADDR_EXPR
  	    && TREE_CODE (TREE_OPERAND (fn, 0)) == FUNCTION_DECL
  	    && DECL_BUILT_IN (TREE_OPERAND (fn, 0)))
*************** ccp_fold (gimple stmt)
*** 1299,1305 ****
  	    tree call, retval;
  	    unsigned i;
  	    for (i = 0; i < gimple_call_num_args (stmt); ++i)
! 	      args[i] = valueize_op (gimple_call_arg (stmt, i));
  	    call = build_call_array_loc (loc,
  					 gimple_call_return_type (stmt),
  					 fn, gimple_call_num_args (stmt), args);
--- 1282,1288 ----
  	    tree call, retval;
  	    unsigned i;
  	    for (i = 0; i < gimple_call_num_args (stmt); ++i)
! 	      args[i] = (*valueize) (gimple_call_arg (stmt, i));
  	    call = build_call_array_loc (loc,
  					 gimple_call_return_type (stmt),
  					 fn, gimple_call_num_args (stmt), args);
*************** ccp_fold (gimple stmt)
*** 1312,1317 ****
--- 1295,1333 ----
  	return NULL_TREE;
        }
  
+     default:
+       return NULL_TREE;
+     }
+ }
+ 
+ /* Fold STMT to a constant using VALUEIZE to valueize SSA names.
+    Returns NULL_TREE if folding to a constant is not possible, otherwise
+    returns a constant according to is_gimple_min_invariant.  */
+ 
+ tree
+ gimple_fold_stmt_to_constant (gimple stmt, tree (*valueize) (tree))
+ {
+   tree res = gimple_fold_stmt_to_constant_1 (stmt, valueize);
+   if (res && is_gimple_min_invariant (res))
+     return res;
+   return NULL_TREE;
+ }
+ 
+ /* CCP specific front-end to the non-destructive constant folding
+    routines.
+ 
+    Attempt to simplify the RHS of STMT knowing that one or more
+    operands are constants.
+ 
+    If simplification is possible, return the simplified RHS,
+    otherwise return the original RHS or NULL_TREE.  */
+ 
+ static tree
+ ccp_fold (gimple stmt)
+ {
+   location_t loc = gimple_location (stmt);
+   switch (gimple_code (stmt))
+     {
      case GIMPLE_COND:
        {
          /* Handle comparison operators that can appear in GIMPLE form.  */
*************** ccp_fold (gimple stmt)
*** 1327,1332 ****
--- 1343,1352 ----
          return valueize_op (gimple_switch_index (stmt));
        }
  
+     case GIMPLE_ASSIGN:
+     case GIMPLE_CALL:
+       return gimple_fold_stmt_to_constant_1 (stmt, valueize_op);
+ 
      default:
        gcc_unreachable ();
      }
*************** fold_ctor_reference (tree type, tree cto
*** 1617,1627 ****
  }
  
  /* Return the tree representing the element referenced by T if T is an
!    ARRAY_REF or COMPONENT_REF into constant aggregates.  Return
!    NULL_TREE otherwise.  */
  
! tree
! fold_const_aggregate_ref (tree t)
  {
    tree ctor, idx, base;
    HOST_WIDE_INT offset, size, max_size;
--- 1637,1647 ----
  }
  
  /* Return the tree representing the element referenced by T if T is an
!    ARRAY_REF or COMPONENT_REF into constant aggregates, valueizing SSA
!    names using VALUEIZE.  Return NULL_TREE otherwise.  */
  
! static tree
! fold_const_aggregate_ref_1 (tree t, tree (*valueize) (tree))
  {
    tree ctor, idx, base;
    HOST_WIDE_INT offset, size, max_size;
*************** fold_const_aggregate_ref (tree t)
*** 1644,1650 ****
  	 (they will be handled only by iteration of ccp).  Perhaps we can bring
  	 get_ref_base_and_extent here and make it use get_constant_value.  */
        if (TREE_CODE (TREE_OPERAND (t, 1)) == SSA_NAME
! 	  && (idx = get_constant_value (TREE_OPERAND (t, 1)))
  	  && host_integerp (idx, 0))
  	{
  	  tree low_bound, unit_size;
--- 1664,1671 ----
  	 (they will be handled only by iteration of ccp).  Perhaps we can bring
  	 get_ref_base_and_extent here and make it use get_constant_value.  */
        if (TREE_CODE (TREE_OPERAND (t, 1)) == SSA_NAME
! 	  && valueize
! 	  && (idx = (*valueize) (TREE_OPERAND (t, 1)))
  	  && host_integerp (idx, 0))
  	{
  	  tree low_bound, unit_size;
*************** fold_const_aggregate_ref (tree t)
*** 1704,1710 ****
      case REALPART_EXPR:
      case IMAGPART_EXPR:
        {
! 	tree c = fold_const_aggregate_ref (TREE_OPERAND (t, 0));
  	if (c && TREE_CODE (c) == COMPLEX_CST)
  	  return fold_build1_loc (EXPR_LOCATION (t),
  			      TREE_CODE (t), TREE_TYPE (t), c);
--- 1725,1731 ----
      case REALPART_EXPR:
      case IMAGPART_EXPR:
        {
! 	tree c = fold_const_aggregate_ref_1 (TREE_OPERAND (t, 0), valueize);
  	if (c && TREE_CODE (c) == COMPLEX_CST)
  	  return fold_build1_loc (EXPR_LOCATION (t),
  			      TREE_CODE (t), TREE_TYPE (t), c);
*************** fold_const_aggregate_ref (tree t)
*** 1718,1723 ****
--- 1739,1750 ----
    return NULL_TREE;
  }
  
+ tree
+ fold_const_aggregate_ref (tree t)
+ {
+   return fold_const_aggregate_ref_1 (t, NULL);
+ }
+ 
  /* Apply the operation CODE in type TYPE to the value, mask pair
     RVAL and RMASK representing a value of type RTYPE and set
     the value, mask pair *VAL and *MASK to the result.  */
Index: gcc/tree-ssa-sccvn.c
===================================================================
*** gcc/tree-ssa-sccvn.c.orig	2011-03-17 15:56:58.000000000 +0100
--- gcc/tree-ssa-sccvn.c	2011-03-17 16:13:58.000000000 +0100
*************** simplify_binary_expression (gimple stmt)
*** 2770,2775 ****
--- 2770,2785 ----
  	op1 = SSA_VAL (op1);
      }
  
+   /* Pointer plus constant can be represented as invariant address.
+      Do so to allow further propagation, see also tree-ssa-forwprop.c.  */
+   if (gimple_assign_rhs_code (stmt) == POINTER_PLUS_EXPR
+       && host_integerp (op1, 1)
+       && TREE_CODE (op0) == ADDR_EXPR
+       && is_gimple_min_invariant (op0))
+     return build_invariant_address (TREE_TYPE (op0),
+ 				    TREE_OPERAND (op0, 0),
+ 				    TREE_INT_CST_LOW (op1));
+ 
    /* Avoid folding if nothing changed.  */
    if (op0 == gimple_assign_rhs1 (stmt)
        && op1 == gimple_assign_rhs2 (stmt))
*************** simplify_unary_expression (gimple stmt)
*** 2849,2854 ****
--- 2859,2877 ----
    return NULL_TREE;
  }
  
+ /* Valueize NAME if it is an SSA name, otherwise just return it.  */
+ 
+ static inline tree
+ vn_valueize (tree name)
+ {
+   if (TREE_CODE (name) == SSA_NAME)
+     {
+       tree tem = SSA_VAL (name);
+       return tem == VN_TOP ? name : tem;
+     }
+   return name;
+ }
+ 
  /* Try to simplify RHS using equivalences and constant folding.  */
  
  static tree
*************** try_to_simplify (gimple stmt)
*** 2862,2882 ****
        && TREE_CODE (gimple_assign_rhs1 (stmt)) == SSA_NAME)
      return NULL_TREE;
  
    switch (TREE_CODE_CLASS (gimple_assign_rhs_code (stmt)))
      {
-     case tcc_declaration:
-       tem = get_symbol_constant_value (gimple_assign_rhs1 (stmt));
-       if (tem)
- 	return tem;
-       break;
- 
      case tcc_reference:
-       /* Do not do full-blown reference lookup here, but simplify
- 	 reads from constant aggregates.  */
-       tem = fold_const_aggregate_ref (gimple_assign_rhs1 (stmt));
-       if (tem)
- 	return tem;
- 
        /* Fallthrough for some codes that can operate on registers.  */
        if (!(TREE_CODE (gimple_assign_rhs1 (stmt)) == REALPART_EXPR
  	    || TREE_CODE (gimple_assign_rhs1 (stmt)) == IMAGPART_EXPR
--- 2885,2899 ----
        && TREE_CODE (gimple_assign_rhs1 (stmt)) == SSA_NAME)
      return NULL_TREE;
  
+   /* First try constant folding based on our current lattice.  */
+   tem = gimple_fold_stmt_to_constant (stmt, vn_valueize);
+   if (tem)
+     return tem;
+ 
+   /* If that didn't work try combining multiple statements.  */
    switch (TREE_CODE_CLASS (gimple_assign_rhs_code (stmt)))
      {
      case tcc_reference:
        /* Fallthrough for some codes that can operate on registers.  */
        if (!(TREE_CODE (gimple_assign_rhs1 (stmt)) == REALPART_EXPR
  	    || TREE_CODE (gimple_assign_rhs1 (stmt)) == IMAGPART_EXPR
*************** try_to_simplify (gimple stmt)
*** 2886,2896 ****
  	 into binary ops, but it's debatable whether it is worth it. */
      case tcc_unary:
        return simplify_unary_expression (stmt);
!       break;
      case tcc_comparison:
      case tcc_binary:
        return simplify_binary_expression (stmt);
!       break;
      default:
        break;
      }
--- 2903,2913 ----
  	 into binary ops, but it's debatable whether it is worth it. */
      case tcc_unary:
        return simplify_unary_expression (stmt);
! 
      case tcc_comparison:
      case tcc_binary:
        return simplify_binary_expression (stmt);
! 
      default:
        break;
      }
Index: gcc/tree-flow.h
===================================================================
*** gcc/tree-flow.h.orig	2011-03-17 15:56:58.000000000 +0100
--- gcc/tree-flow.h	2011-03-17 16:12:23.000000000 +0100
*************** extern void ssanames_print_statistics (v
*** 594,599 ****
--- 594,600 ----
  
  /* In tree-ssa-ccp.c  */
  tree fold_const_aggregate_ref (tree);
+ tree gimple_fold_stmt_to_constant (gimple, tree (*)(tree));
  
  /* In tree-ssa-dom.c  */
  extern void dump_dominator_optimization_stats (FILE *);
Index: gcc/testsuite/c-c++-common/pr46562-2.c
===================================================================
*** /dev/null	1970-01-01 00:00:00.000000000 +0000
--- gcc/testsuite/c-c++-common/pr46562-2.c	2011-03-17 16:12:23.000000000 +0100
***************
*** 0 ****
--- 1,13 ----
+ /* { dg-do compile } */
+ /* { dg-options "-O -fno-tree-ccp -fno-tree-forwprop -fdump-tree-fre" } */
+ 
+ static const int a[4] = {};
+ int foo(void)
+ {
+   int i = 1;
+   const int *p = &a[i];
+   return *p;
+ }
+ 
+ /* { dg-final { scan-tree-dump "= 0;" "fre" } } */
+ /* { dg-final { cleanup-tree-dump "fre" } } */
Index: gcc/testsuite/c-c++-common/pr46562.c
===================================================================
*** /dev/null	1970-01-01 00:00:00.000000000 +0000
--- gcc/testsuite/c-c++-common/pr46562.c	2011-03-17 16:12:23.000000000 +0100
***************
*** 0 ****
--- 1,13 ----
+ /* { dg-do compile } */
+ /* { dg-options "-O -fdump-tree-ccp1" } */
+ 
+ static const int a[4] = {};
+ int foo(void)
+ {
+   int i = 1;
+   const int *p = &a[i];
+   return *p;
+ }
+ 
+ /* { dg-final { scan-tree-dump "return 0;" "ccp1" } } */
+ /* { dg-final { cleanup-tree-dump "ccp1" } } */
Index: gcc/tree-vrp.c
===================================================================
*** gcc/tree-vrp.c.orig	2011-03-17 15:56:58.000000000 +0100
--- gcc/tree-vrp.c	2011-03-17 16:12:23.000000000 +0100
*************** vrp_initialize (void)
*** 5614,5619 ****
--- 5614,5634 ----
      }
  }
  
+ /* Return the singleton value-range for NAME, or NAME itself if the
+    range is not a singleton.  */
+ 
+ static inline tree
+ vrp_valueize (tree name)
+ {
+   if (TREE_CODE (name) == SSA_NAME)
+     {
+       value_range_t *vr = get_value_range (name);
+       if (vr->type == VR_RANGE
+ 	  && (vr->min == vr->max
+ 	      || operand_equal_p (vr->min, vr->max, 0)))
+ 	return vr->min;
+     }
+   return name;
+ }
  
  /* Visit assignment STMT.  If it produces an interesting range, record
     the SSA name in *OUTPUT_P.  */
*************** vrp_visit_assignment_or_call (gimple stm
*** 5637,5643 ****
      {
        value_range_t new_vr = { VR_UNDEFINED, NULL_TREE, NULL_TREE, NULL };
  
!       if (code == GIMPLE_CALL)
  	extract_range_basic (&new_vr, stmt);
        else
  	extract_range_from_assignment (&new_vr, stmt);
--- 5652,5663 ----
      {
        value_range_t new_vr = { VR_UNDEFINED, NULL_TREE, NULL_TREE, NULL };
  
!       /* Try folding the statement to a constant first.  */
!       tree tem = gimple_fold_stmt_to_constant (stmt, vrp_valueize);
!       if (tem && !is_overflow_infinity (tem))
! 	set_value_range (&new_vr, VR_RANGE, tem, tem, NULL);
!       /* Then dispatch to value-range extracting functions.  */
!       else if (code == GIMPLE_CALL)
  	extract_range_basic (&new_vr, stmt);
        else
  	extract_range_from_assignment (&new_vr, stmt);
*************** vrp_visit_stmt (gimple stmt, edge *taken
*** 6366,6372 ****
        /* In general, assignments with virtual operands are not useful
  	 for deriving ranges, with the obvious exception of calls to
  	 builtin functions.  */
- 
        if ((is_gimple_call (stmt)
  	   && gimple_call_fndecl (stmt) != NULL_TREE
  	   && DECL_IS_BUILTIN (gimple_call_fndecl (stmt)))
--- 6386,6391 ----
Index: gcc/tree-ssa-pre.c
===================================================================
*** gcc/tree-ssa-pre.c.orig	2011-03-17 15:56:58.000000000 +0100
--- gcc/tree-ssa-pre.c	2011-03-17 16:12:23.000000000 +0100
*************** eliminate (void)
*** 4380,4388 ****
  	  if (is_gimple_call (stmt)
  	      && TREE_CODE (gimple_call_fn (stmt)) == SSA_NAME)
  	    {
! 	      tree fn = VN_INFO (gimple_call_fn (stmt))->valnum;
  	      if (TREE_CODE (fn) == ADDR_EXPR
! 		  && TREE_CODE (TREE_OPERAND (fn, 0)) == FUNCTION_DECL)
  		{
  		  bool can_make_abnormal_goto
  		    = stmt_can_make_abnormal_goto (stmt);
--- 4380,4391 ----
  	  if (is_gimple_call (stmt)
  	      && TREE_CODE (gimple_call_fn (stmt)) == SSA_NAME)
  	    {
! 	      tree orig_fn = gimple_call_fn (stmt);
! 	      tree fn = VN_INFO (orig_fn)->valnum;
  	      if (TREE_CODE (fn) == ADDR_EXPR
! 		  && TREE_CODE (TREE_OPERAND (fn, 0)) == FUNCTION_DECL
! 		  && useless_type_conversion_p (TREE_TYPE (orig_fn),
! 						TREE_TYPE (fn)))
  		{
  		  bool can_make_abnormal_goto
  		    = stmt_can_make_abnormal_goto (stmt);


