[IPA] SSA inliner

Jan Hubicka jh@suse.cz
Fri Aug 19 20:35:00 GMT 2005


Hi,
this patch implements inlining on SSA form, which allows early
optimizations to be done more cheaply and IPA to use SSA for its
analysis.  The patch is still somewhat experimental, as there are a
number of problems:
  1) Updating of the SSA form across EH edges from the inlined body to the
  outer function does not work well in all cases.  I am discussing this
  with Diego.
  2) All IPA passes are disabled right now and will likely need a little
  bit of updating to work on SSA.
  3) There are some ICEs in the libjava testsuite that I will need to
  investigate once I figure out how to compile them outside the testsuite
  environment ;)
  4) No early optimizations are done yet, as we can't build SSA aliasing
  before inlining.
  5) Some of the testcases grepping SSA dumps need renaming, as moving
  passes around forces the dumps to be renamed with the current pass
  manager organization.  Perhaps this can be throttled down somewhat...
  6) There are a couple of rather severe FIXMEs in the patch.

Still, the patch should be useful for the development of IPA passes, and I
would really welcome any help on addressing the remaining issues.  I plan to
give the patch a little more testing and then commit it to the IPA branch,
even though it causes a couple of regressions, as it doesn't seem too useful
to hold it back further.
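For readers unfamiliar with the remapping scheme the patch implements, here is a minimal, self-contained sketch of the idea behind remap_ssa_name: each SSA name reached while copying the inlined body is mapped, exactly once, to a fresh name in the caller, and later uses look up the cached copy instead of creating another one.  The string-based names, the fixed-size table, and the version counter are simplifications invented for this sketch; the real code keys a splay tree (id->decl_map) by the tree node itself.

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

#define MAX_NAMES 64

/* Toy decl map: old SSA name -> its copy in the caller.  */
static const char *map_key[MAX_NAMES];
static char map_val[MAX_NAMES][32];
static int map_len;
static int next_version = 1;

/* Return the cached copy of NAME, creating it on first lookup, the
   way remap_ssa_name consults the decl map before making a new name.  */
static const char *
remap_name (const char *name)
{
  int i;
  for (i = 0; i < map_len; i++)
    if (strcmp (map_key[i], name) == 0)
      return map_val[i];	/* Already remapped: reuse the copy.  */

  assert (map_len < MAX_NAMES);
  map_key[map_len] = name;
  snprintf (map_val[map_len], sizeof map_val[map_len],
	    "%s.v%d", name, next_version++);
  return map_val[map_len++];
}
```

The memoization is what keeps the copied body consistent: the def and every use of a given name all resolve to the same fresh name.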

Bootstrapped i686-pc-gnu-linux and I will commit it soonish.

Honza

2005-08-18  Jan Hubicka  <jh@suse.cz>
	* cgraph.h (rebuild_cgraph_edges): Declare.
	* cgraphunit.c (rebuild_cgraph_edges): Make global.
	* ipa-inline.c (cgraph_apply_inline_plan): Initialize bitmaps; free
	dominance info.
	(cgraph_decide_inlining_incrementally): Likewise.
	* opts.c (decode_options): Temporarily disable salias.
	* passes.c (init_optimization_passes): Temporarily disable all IPA
	passes; move SSA construction into early_optimization_passes.
	* tree-inline.c: Include tree-pass.h.
	(remap_ssa_name): New function.
	(remap_decl): Initialize SSA structs.
	(copy_body_r): Handle SSA names; register global vars.
	(copy_bb): Fold all new statements; update SSA_DEF_STMTs.
	(copy_edges_for_bb): Update PHI nodes of EH edges.
	(copy_phis_for_bb): New function.
	(initialize_cfun): Break out from ...
	(copy_cfg_body): ... here.
	(setup_one_parameter): Handle SSA form.
	(declare_return_variable): Likewise.
	(expand_call_inline): Initialize callee_cfun early; deal with
	SSA when replacing original assignment.
	(optimize_inline_calls): Add some extra checking and rebuilding.
	(tree_function_versioning): Use initialize_cfun; do not construct
	temporary decl; free dominance info.
	* tree-optimize.c (gate_all_optimizations): Force optimizations when in
	SSA form.
	(gate_all_early_optimizations): New.
	(pass_all_early_optimizations): New.
	(tree_lowering_passes): Do not compact blocks; do early
	optimizations when called late; free dominance info.
	* tree-pass.h (pass_all_early_optimizations): Declare.
	* tree-ssa-ccp.c (get_maxval_strlen): Allow SSA form to not be complete yet.
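As a side note, the profile scaling that the new initialize_cfun performs on the entry/exit blocks (see the tree-inline.c hunk below) boils down to the following arithmetic.  REG_BR_PROB_BASE and the gcov_type counts follow GCC; the standalone helper names here are invented for this sketch.

```c
#define REG_BR_PROB_BASE 10000

typedef long long gcov_type;

/* Scale factor taking the callee's entry count to the call site's
   count, mirroring
     count_scale = REG_BR_PROB_BASE * count / entry_block->count;
   with the same fallback to 1 when the callee was never entered.  */
static int
compute_count_scale (gcov_type callee_entry_count, gcov_type call_count)
{
  if (callee_entry_count)
    return REG_BR_PROB_BASE * call_count / callee_entry_count;
  return 1;
}

/* Apply the scale to one block count, mirroring
     new_count = callee_count * count_scale / REG_BR_PROB_BASE;  */
static gcov_type
scale_count (gcov_type block_count, int count_scale)
{
  return block_count * count_scale / REG_BR_PROB_BASE;
}
```

Scaling the callee's own entry count by this factor hands the inlined copy the call site's count, so the copied body carries a profile proportional to how often this particular call executes.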
Index: bitmap.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/bitmap.c,v
retrieving revision 1.71
diff -c -3 -p -r1.71 bitmap.c
*** bitmap.c	8 Jul 2005 17:34:13 -0000	1.71
--- bitmap.c	19 Aug 2005 16:13:19 -0000
*************** bitmap_obstack_release (bitmap_obstack *
*** 224,229 ****
--- 224,232 ----
    bit_obstack->elements = NULL;
    bit_obstack->heads = NULL;
    obstack_free (&bit_obstack->obstack, NULL);
+ #ifdef ENABLE_CHECKING
+   memset (&bit_obstack->obstack, 0xab, sizeof (*&bit_obstack->obstack));
+ #endif
  }
  
  /* Create a new bitmap on an obstack.  If BIT_OBSTACK is NULL, create
Index: cgraph.h
===================================================================
RCS file: /cvs/gcc/gcc/gcc/cgraph.h,v
retrieving revision 1.64.2.2
diff -c -3 -p -r1.64.2.2 cgraph.h
*** cgraph.h	14 Aug 2005 13:52:35 -0000	1.64.2.2
--- cgraph.h	19 Aug 2005 16:13:19 -0000
*************** void init_cgraph (void);
*** 290,295 ****
--- 290,296 ----
  struct cgraph_node *cgraph_function_versioning (struct cgraph_node *,
                                                  varray_type, varray_type);
  struct cgraph_node *save_inline_function_body (struct cgraph_node *);
+ void rebuild_cgraph_edges (void);
  
  /* In ipa.c  */
  bool cgraph_remove_unreachable_nodes (bool, FILE *);
Index: cgraphunit.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/cgraphunit.c,v
retrieving revision 1.126.2.2
diff -c -3 -p -r1.126.2.2 cgraphunit.c
*** cgraphunit.c	14 Aug 2005 13:52:35 -0000	1.126.2.2
--- cgraphunit.c	19 Aug 2005 16:13:19 -0000
*************** initialize_inline_failed (struct cgraph_
*** 594,600 ****
  
  /* Rebuild call edges from current function after a passes not aware
     of cgraph updating.  */
! static void
  rebuild_cgraph_edges (void)
  {
    basic_block bb;
--- 594,600 ----
  
  /* Rebuild call edges from current function after a passes not aware
     of cgraph updating.  */
! void
  rebuild_cgraph_edges (void)
  {
    basic_block bb;
Index: ipa-inline.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/ipa-inline.c,v
retrieving revision 2.15.2.2
diff -c -3 -p -r2.15.2.2 ipa-inline.c
*** ipa-inline.c	14 Aug 2005 13:52:36 -0000	2.15.2.2
--- ipa-inline.c	19 Aug 2005 16:13:19 -0000
*************** cgraph_apply_inline_plan (void)
*** 525,530 ****
--- 525,534 ----
      if (!order[i]->global.inlined_to)
        order[new_order_pos++] = order[i];
  
+   /* Initialize the default bitmap obstack.  */
+   bitmap_obstack_initialize (NULL);
+ 
+ 
    for (i = 0; i < new_order_pos; i++)
      {
        struct cgraph_edge *e;
*************** cgraph_apply_inline_plan (void)
*** 541,546 ****
--- 545,552 ----
  	  tree_register_cfg_hooks ();
            current_function_decl = node->decl;
  	  optimize_inline_calls (node->decl, false);
+ 	  free_dominance_info (CDI_DOMINATORS);
+ 	  free_dominance_info (CDI_POST_DOMINATORS);
  	  node->local.self_insns = node->global.insns;
  	  pop_cfun ();
  	  ggc_collect ();
*************** cgraph_decide_inlining_incrementally (st
*** 1122,1131 ****
--- 1128,1141 ----
  	}
    if (inlined || (warn_inline && !early))
      {
+       /* Initialize the default bitmap obstack.  */
+       bitmap_obstack_initialize (NULL);
        push_cfun (DECL_STRUCT_FUNCTION (node->decl));
        tree_register_cfg_hooks ();
        current_function_decl = node->decl;
        optimize_inline_calls (current_function_decl, early);
+       free_dominance_info (CDI_DOMINATORS);
+       free_dominance_info (CDI_POST_DOMINATORS);
        node->local.self_insns = node->global.insns;
        current_function_decl = NULL;
        pop_cfun ();
*************** cgraph_decide_inlining_incrementally (st
*** 1138,1144 ****
  static bool
  cgraph_gate_inlining (void)
  {
!   return flag_inline_trees;
  }
  
  struct tree_opt_pass pass_ipa_inline = 
--- 1148,1154 ----
  static bool
  cgraph_gate_inlining (void)
  {
!   return flag_inline_trees /*&& 0*/;
  }
  
  struct tree_opt_pass pass_ipa_inline = 
Index: opts.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/opts.c,v
retrieving revision 1.122
diff -c -3 -p -r1.122 opts.c
*** opts.c	28 Jul 2005 21:48:23 -0000	1.122
--- opts.c	19 Aug 2005 16:13:19 -0000
*************** decode_options (unsigned int argc, const
*** 536,542 ****
        flag_tree_fre = 1;
        flag_tree_copy_prop = 1;
        flag_tree_sink = 1;
!       flag_tree_salias = 1;
        flag_unit_at_a_time = 1;
  
        if (!optimize_size)
--- 536,543 ----
        flag_tree_fre = 1;
        flag_tree_copy_prop = 1;
        flag_tree_sink = 1;
!       /* There is a problem with local annotations and aliasing running early right now.  */
!       /*flag_tree_salias = 1;*/
        flag_unit_at_a_time = 1;
  
        if (!optimize_size)
Index: passes.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/passes.c,v
retrieving revision 2.108.2.1
diff -c -3 -p -r2.108.2.1 passes.c
*** passes.c	9 Aug 2005 11:25:33 -0000	2.108.2.1
--- passes.c	19 Aug 2005 16:13:20 -0000
*************** init_optimization_passes (void)
*** 433,444 ****
    p = &all_ipa_passes;
    NEXT_PASS (pass_early_ipa_inline);
    NEXT_PASS (pass_early_local_passes);
!   NEXT_PASS (pass_ipa_cp);
    NEXT_PASS (pass_ipa_inline);
!   NEXT_PASS (pass_ipa_alias);
    NEXT_PASS (pass_ipa_reference);
    NEXT_PASS (pass_ipa_pure_const); 
!   NEXT_PASS (pass_ipa_type_escape);
    *p = NULL;
  
    /* All passes needed to lower the function into shape optimizers can operate
--- 433,444 ----
    p = &all_ipa_passes;
    NEXT_PASS (pass_early_ipa_inline);
    NEXT_PASS (pass_early_local_passes);
!   /*NEXT_PASS (pass_ipa_cp);*/
    NEXT_PASS (pass_ipa_inline);
!   /*NEXT_PASS (pass_ipa_alias);
    NEXT_PASS (pass_ipa_reference);
    NEXT_PASS (pass_ipa_pure_const); 
!   NEXT_PASS (pass_ipa_type_escape);*/
    *p = NULL;
  
    /* All passes needed to lower the function into shape optimizers can operate
*************** init_optimization_passes (void)
*** 459,470 ****
    p = &pass_early_local_passes.sub;
    NEXT_PASS (pass_tree_profile);
    NEXT_PASS (pass_cleanup_cfg);
    NEXT_PASS (pass_rebuild_cgraph_edges);
    *p = NULL;
  
    p = &all_passes;
    NEXT_PASS (pass_fixup_cfg);
!   NEXT_PASS (pass_init_datastructures);
    NEXT_PASS (pass_all_optimizations);
    NEXT_PASS (pass_warn_function_noreturn);
    NEXT_PASS (pass_mudflap_2);
--- 459,482 ----
    p = &pass_early_local_passes.sub;
    NEXT_PASS (pass_tree_profile);
    NEXT_PASS (pass_cleanup_cfg);
+   NEXT_PASS (pass_all_early_optimizations);
    NEXT_PASS (pass_rebuild_cgraph_edges);
    *p = NULL;
  
+   p = &pass_all_early_optimizations.sub;
+   NEXT_PASS (pass_init_datastructures);
+   NEXT_PASS (pass_referenced_vars);
+   NEXT_PASS (pass_create_structure_vars);
+   NEXT_PASS (pass_build_ssa);
+   /*NEXT_PASS (pass_may_alias);
+   NEXT_PASS (pass_return_slot);
+   NEXT_PASS (pass_rename_ssa_copies);*/
+   NEXT_PASS (pass_early_warn_uninitialized);
+   *p = NULL;
+ 
    p = &all_passes;
    NEXT_PASS (pass_fixup_cfg);
!   /*NEXT_PASS (pass_init_datastructures);*/
    NEXT_PASS (pass_all_optimizations);
    NEXT_PASS (pass_warn_function_noreturn);
    NEXT_PASS (pass_mudflap_2);
*************** init_optimization_passes (void)
*** 476,488 ****
    *p = NULL;
  
    p = &pass_all_optimizations.sub;
!   NEXT_PASS (pass_referenced_vars);
    NEXT_PASS (pass_create_structure_vars);
!   NEXT_PASS (pass_build_ssa);
    NEXT_PASS (pass_may_alias);
    NEXT_PASS (pass_return_slot);
!   NEXT_PASS (pass_rename_ssa_copies);
!   NEXT_PASS (pass_early_warn_uninitialized);
    NEXT_PASS (pass_eliminate_useless_stores);
  
    /* Initial scalar cleanups.  */
--- 488,501 ----
    *p = NULL;
  
    p = &pass_all_optimizations.sub;
!   /*NEXT_PASS (pass_referenced_vars);
    NEXT_PASS (pass_create_structure_vars);
!   NEXT_PASS (pass_build_ssa);*/
!   NEXT_PASS (pass_all_early_optimizations);
    NEXT_PASS (pass_may_alias);
    NEXT_PASS (pass_return_slot);
!   /*NEXT_PASS (pass_rename_ssa_copies);
!   NEXT_PASS (pass_early_warn_uninitialized);*/
    NEXT_PASS (pass_eliminate_useless_stores);
  
    /* Initial scalar cleanups.  */
Index: tree-inline.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-inline.c,v
retrieving revision 1.210.2.2
diff -c -3 -p -r1.210.2.2 tree-inline.c
*** tree-inline.c	14 Aug 2005 13:52:36 -0000	1.210.2.2
--- tree-inline.c	19 Aug 2005 16:13:20 -0000
*************** Boston, MA 02110-1301, USA.  */
*** 50,55 ****
--- 50,56 ----
  #include "debug.h"
  #include "pointer-set.h"
  #include "ipa-prop.h"
+ #include "tree-pass.h"
  
  /* I'm not real happy about this, but we need to handle gimple and
     non-gimple trees.  */
*************** insert_decl_map (inline_data *id, tree k
*** 180,185 ****
--- 181,229 ----
  		       (splay_tree_value) value);
  }
  
+ /* Construct a new SSA name for the old one.  */
+ 
+ static tree
+ remap_ssa_name (tree name, inline_data *id)
+ {
+   tree new;
+   splay_tree_node n;
+ 
+   gcc_assert (TREE_CODE (name) == SSA_NAME);
+ 
+   n = splay_tree_lookup (id->decl_map, (splay_tree_key) name);
+   if (n)
+     return (tree) n->value;
+ 
+   /* Do not set DEF_STMT yet as the statement might not get copied.  */
+   new = remap_decl (SSA_NAME_VAR (name), id);
+   /* We might've substituted a constant or another SSA_NAME for the
+      variable.  */
+   if ((TREE_CODE (new) == VAR_DECL || TREE_CODE (new) == PARM_DECL)
+       /* Forcibly coalesce all SSA names for return values so we don't need
+          to construct a PHI node for possibly multiple return statements by
+          hand.  */
+       && (TREE_CODE (SSA_NAME_VAR (name)) != RESULT_DECL || !inlining_p (id)))
+     {
+       new = make_ssa_name (new, NULL);
+       insert_decl_map (id, name, new);
+       if (IS_EMPTY_STMT (SSA_NAME_DEF_STMT (name))
+ 	  /* When inlining, parameters are replaced by initialized vars.  */
+ 	  && (TREE_CODE (new) == PARM_DECL || TREE_CODE (name) != PARM_DECL))
+ 	{
+ 	  SSA_NAME_DEF_STMT (new) = build_empty_stmt ();
+ 	  if (default_def_fn (id->callee_cfun, SSA_NAME_VAR (name)) == name)
+ 	    set_default_def (SSA_NAME_VAR (new), new);
+ 	}
+       SSA_NAME_OCCURS_IN_ABNORMAL_PHI (new)
+ 	= SSA_NAME_OCCURS_IN_ABNORMAL_PHI (name);
+     }
+   else
+     insert_decl_map (id, name, new);
+   TREE_TYPE (new) = remap_type (TREE_TYPE (name), id);
+   return new;
+ }
+ 
+ 
  /* Remap DECL during the copying of the BLOCK tree for the function.  */
  
  static tree
*************** remap_decl (tree decl, inline_data *id)
*** 250,255 ****
--- 294,318 ----
        /* Remember it, so that if we encounter this local entity
  	 again we can reuse this copy.  */
        insert_decl_map (id, decl, t);
+ 
+       if (in_ssa_p
+ 	  && (TREE_CODE (t) == VAR_DECL
+ 	      || TREE_CODE (t) == RESULT_DECL || TREE_CODE (t) == PARM_DECL))
+ 	{
+ 	  tree def;
+ 
+ 	  get_var_ann (t);
+ 	  if (TREE_CODE (decl) != PARM_DECL
+ 	      && (def = default_def_fn (id->callee_cfun, decl)))
+ 	    {
+ 	      tree map = remap_ssa_name (def, id);
+ 	      /* Watch out for RESULT_DECLs whose SSA names map directly to them.  */
+ 	      if (TREE_CODE (map) == SSA_NAME)
+ 	        set_default_def (t, map);
+ 	    }
+ 	  add_referenced_tmp_var (t);
+ 	}
        return t;
      }
  
*************** copy_body_r (tree *tp, int *walk_subtree
*** 529,534 ****
--- 592,603 ----
  	  return (void *)1;
  	}
      }
+   else if (TREE_CODE (*tp) == SSA_NAME)
+     {
+       *tp = remap_ssa_name (*tp, id);
+       *walk_subtrees = 0;
+       return NULL;
+     }
  
    /* Local variables and labels need to be replaced by equivalent
       variables.  We don't want to copy static variables; there's only
*************** copy_body_r (tree *tp, int *walk_subtree
*** 644,649 ****
--- 713,723 ----
        /* Here is the "usual case".  Copy this tree node, and then
  	 tweak some special cases.  */
        copy_tree_r (tp, walk_subtrees, id->versioning_p ? data : NULL);
+ 
+       /* Global variables we haven't seen yet need to go into referenced vars.  */
+       if (in_ssa_p && TREE_CODE (*tp) == VAR_DECL)
+ 	add_referenced_tmp_var (*tp);
         
        /* If EXPR has block defined, map it to newly constructed block.
           When inlining we want EXPRs without block appear in the block
*************** copy_bb (inline_data *id, basic_block bb
*** 723,728 ****
--- 797,805 ----
        if (stmt)
  	{
  	  tree call, decl;
+ 
+ 	  fold_stmt (&stmt);
+ 
            bsi_insert_after (&copy_bsi, stmt, BSI_NEW_STMT);
  	  call = get_call_expr_in (stmt);
  	  /* We're duplicating a CALL_EXPR.  Find any corresponding
*************** copy_bb (inline_data *id, basic_block bb
*** 783,788 ****
--- 860,882 ----
  		  && tree_could_throw_p (stmt))
  		add_stmt_to_eh_region (stmt, id->eh_region);
  	    }
+ 	  if (in_ssa_p)
+ 	    {
+ 	       ssa_op_iter i;
+ 	       tree def;
+ 
+ 	       FOR_EACH_SSA_TREE_OPERAND (def, stmt, i, SSA_OP_DEF)
+ 		if (TREE_CODE (def) == SSA_NAME)
+ 		  {
+ 		    /* Not quite right, orig_stmt might be <retval>=blah.  
+ 		       verify_ssa would die later if we really had multiple
+ 		       definitions...
+ 		    gcc_assert (!SSA_NAME_DEF_STMT (def)
+ 			        || TREE_CODE (orig_stmt) == RETURN_EXPR);
+ 		       */
+ 		    SSA_NAME_DEF_STMT (def) = stmt;
+ 		  }
+ 	    }
  	}
      }
    return copy_basic_block;
*************** copy_bb (inline_data *id, basic_block bb
*** 792,798 ****
     accordingly.  Edges will be taken care of later.  Assume aux
     pointers to point to the copies of each BB.  */
  static void
! copy_edges_for_bb (basic_block bb, int count_scale)
  {
    basic_block new_bb = bb->aux;
    edge_iterator ei;
--- 886,892 ----
     accordingly.  Edges will be taken care of later.  Assume aux
     pointers to point to the copies of each BB.  */
  static void
! copy_edges_for_bb (basic_block bb, int count_scale, basic_block exit_block_map)
  {
    basic_block new_bb = bb->aux;
    edge_iterator ei;
*************** copy_edges_for_bb (basic_block bb, int c
*** 827,832 ****
--- 921,928 ----
  
        copy_stmt = bsi_stmt (bsi);
        update_stmt (copy_stmt);
+       if (in_ssa_p)
+         mark_new_vars_to_rename (copy_stmt);
        /* Do this before the possible split_block.  */
        bsi_next (&bsi);
  
*************** copy_edges_for_bb (basic_block bb, int c
*** 844,859 ****
--- 940,1047 ----
  
        if (tree_can_throw_internal (copy_stmt))
  	{
+ 	  edge_iterator ei;
  	  if (!bsi_end_p (bsi))
  	    /* Note that bb's predecessor edges aren't necessarily
  	       right at this point; split_block doesn't care.  */
  	    {
  	      edge e = split_block (new_bb, copy_stmt);
+ 
  	      new_bb = e->dest;
+ 	      new_bb->aux = e->src->aux;
  	      bsi = bsi_start (new_bb);
  	    }
  
             make_eh_edges (copy_stmt);
+ 
+ 	   /* Update PHIs of EH edges destined for landing pads in the
+ 	      function we inline into.  This can be done by cloning the PHI
+ 	      arguments from the EH edges originally attached to the call
+ 	      statement we are replacing.  */
+ 	   if (in_ssa_p)
+ 	     {
+ 	       edge e;
+ 
+ 	       FOR_EACH_EDGE (e, ei, bb_for_stmt (copy_stmt)->succs)
+ 		 if (!e->dest->aux
+ 		     || ((basic_block)e->dest->aux)->index == ENTRY_BLOCK)
+ 		   {
+ 		     tree phi;
+ 
+ 		     gcc_assert (e->flags & EDGE_EH);
+ 		     for (phi = phi_nodes (e->dest); phi; phi = PHI_CHAIN (phi))
+ 		       {
+ 			 /* Lookup original edge from call expression.  */
+ 			 edge master = find_edge (exit_block_map, e->dest);
+ 			 if (master)
+ 			   {
+ 			     tree new_arg = PHI_ARG_DEF_FROM_EDGE (phi, master);
+ 
+ 			     gcc_assert (master->flags & EDGE_EH);
+ 			     /* FIXME: Because of the hack below we might not
+ 			        find the argument.  
+ 			     gcc_assert (new_arg);*/
+ 			     if (new_arg)
+ 			       add_phi_arg (phi, new_arg, e);
+ 			   }
+ 			 else
+ 			   {
+ 			     /* FIXME: It is possible that the edge doesn't
+ 			        exist because we have more information about
+ 				EH type and the edge routes lower in EH region
+ 				tree.
+ 				It seems to me that it is not safe to rely
+ 				on ssa_update here as the variable might have
+ 				overlapping live ranges of SSA_NAMEs and updating
+ 				would screw up, but Diego seems to think it is
+ 				safe, so let's give it a try now.  
+ 				
+ 				If it really is safe, we might just remove
+ 				the code and mark all PHI arguments for
+ 				renaming..  */
+ 			     mark_sym_for_renaming
+ 			       (SSA_NAME_VAR (PHI_RESULT (phi)));
+ 			     break;
+ 			   }
+ 		       }
+ 		   }
+ 	     }
+ 	}
+     }
+ }
+ 
+ /* Copy the PHIs.  All blocks and edges have been copied, some blocks
+    were possibly split and new outgoing EH edges inserted.
+    BB points to the block of the original function and the AUX pointers
+    link the original and newly copied blocks.  */
+ static void
+ copy_phis_for_bb (basic_block bb, inline_data *id)
+ {
+   basic_block new_bb = bb->aux;
+   edge_iterator ei;
+   tree phi;
+ 
+   for (phi = phi_nodes (bb); phi; phi = PHI_CHAIN (phi))
+     {
+       tree res = PHI_RESULT (phi);
+       tree new_res = res;
+       tree new_phi;
+       edge new_edge;
+ 
+       if (is_gimple_reg (res))
+ 	{
+ 	  walk_tree (&new_res, copy_body_r, id, NULL);
+ 	  SSA_NAME_DEF_STMT (new_res) = new_phi = create_phi_node (new_res, new_bb);
+ 	  FOR_EACH_EDGE (new_edge, ei, new_bb->preds)
+ 	    {
+ 	      edge old_edge = find_edge (new_edge->src->aux, bb);
+ 	      tree arg = PHI_ARG_DEF_FROM_EDGE (phi, old_edge);
+ 	      tree new_arg = arg;
+ 
+ 	      walk_tree (&new_arg, copy_body_r, id, NULL);
+ 	      gcc_assert (new_arg);
+ 	      add_phi_arg (new_phi, new_arg, new_edge);
+ 	    }
  	}
      }
  }
*************** remap_decl_1 (tree decl, void *data)
*** 865,870 ****
--- 1053,1112 ----
    return remap_decl (decl, data);
  }
  
+ static void
+ initialize_cfun (tree new_fndecl, tree callee_fndecl, gcov_type count, int frequency)
+ {
+   struct function *new_cfun = (struct function *) ggc_alloc_cleared (sizeof (struct function));
+   struct function *callee_cfun = DECL_STRUCT_FUNCTION (callee_fndecl);
+   int count_scale, frequency_scale;
+ 
+   if (ENTRY_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->count)
+     count_scale = (REG_BR_PROB_BASE * count
+ 		   / ENTRY_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->count);
+   else
+     count_scale = 1;
+ 
+   if (ENTRY_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->frequency)
+     frequency_scale = (REG_BR_PROB_BASE * frequency
+ 		       /
+ 		       ENTRY_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->frequency);
+   else
+     frequency_scale = count_scale;
+ 
+   /* Register specific tree functions.  */
+   tree_register_cfg_hooks ();
+   *new_cfun = *DECL_STRUCT_FUNCTION (callee_fndecl);
+   new_cfun->cfg = NULL;
+   new_cfun->decl = new_fndecl /*= copy_node (callee_fndecl)*/;
+   new_cfun->ib_boundaries_block = (varray_type) 0;
+   DECL_STRUCT_FUNCTION (new_fndecl) = new_cfun;
+   push_cfun (new_cfun);
+   init_empty_tree_cfg ();
+ 
+   ENTRY_BLOCK_PTR->count =
+     (ENTRY_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->count * count_scale /
+      REG_BR_PROB_BASE);
+   ENTRY_BLOCK_PTR->frequency =
+     (ENTRY_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->frequency *
+      frequency_scale / REG_BR_PROB_BASE);
+   EXIT_BLOCK_PTR->count =
+     (EXIT_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->count * count_scale /
+      REG_BR_PROB_BASE);
+   EXIT_BLOCK_PTR->frequency =
+     (EXIT_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->frequency *
+      frequency_scale / REG_BR_PROB_BASE);
+   if (callee_cfun->eh)
+     init_eh_for_function ();
+ 
+   if (callee_cfun->ssa)
+     {
+       init_tree_ssa ();
+       cfun->ssa->x_in_ssa_p = true;
+       init_ssa_operands ();
+     }
+   pop_cfun ();
+ }
+ 
  /* Make a copy of the body of FN so that it can be inserted inline in
     another function.  Walks FN via CFG, returns new fndecl.  */
  
*************** copy_cfg_body (inline_data * id, gcov_ty
*** 875,889 ****
    tree callee_fndecl = id->callee;
    /* Original cfun for the callee, doesn't change.  */
    struct function *callee_cfun = DECL_STRUCT_FUNCTION (callee_fndecl);
-   /* Copy, built by this function.  */
-   struct function *new_cfun;
    /* Place to copy from; when a copy of the function was saved off earlier,
       use that instead of the main copy.  */
    struct function *cfun_to_copy =
      (struct function *) ggc_alloc_cleared (sizeof (struct function));
    basic_block bb;
    tree new_fndecl = NULL;
-   bool versioning_or_cloning;
    int count_scale, frequency_scale;
  
    if (ENTRY_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->count)
--- 1117,1128 ----
*************** copy_cfg_body (inline_data * id, gcov_ty
*** 910,958 ****
  
    id->callee_cfun = cfun_to_copy;
  
-   /* If saving or cloning a function body, create new basic_block_info
-      and label_to_block_maps.  Otherwise, we're duplicating a function
-      body for inlining; insert our new blocks and labels into the
-      existing varrays.  */
-   versioning_or_cloning = (id->cloning_p || id->versioning_p);
-   if (versioning_or_cloning)
-     {
-       new_cfun =
- 	(struct function *) ggc_alloc_cleared (sizeof (struct function));
-       *new_cfun = *DECL_STRUCT_FUNCTION (callee_fndecl);
-       new_cfun->cfg = NULL;
-       new_cfun->decl = new_fndecl = copy_node (callee_fndecl);
-       new_cfun->ib_boundaries_block = (varray_type) 0;
-       DECL_STRUCT_FUNCTION (new_fndecl) = new_cfun;
-       push_cfun (new_cfun);
-       init_empty_tree_cfg ();
- 
-       ENTRY_BLOCK_PTR->count =
- 	(ENTRY_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->count * count_scale /
- 	 REG_BR_PROB_BASE);
-       ENTRY_BLOCK_PTR->frequency =
- 	(ENTRY_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->frequency *
- 	 frequency_scale / REG_BR_PROB_BASE);
-       EXIT_BLOCK_PTR->count =
- 	(EXIT_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->count * count_scale /
- 	 REG_BR_PROB_BASE);
-       EXIT_BLOCK_PTR->frequency =
- 	(EXIT_BLOCK_PTR_FOR_FUNCTION (callee_cfun)->frequency *
- 	 frequency_scale / REG_BR_PROB_BASE);
- 
-       entry_block_map = ENTRY_BLOCK_PTR;
-       exit_block_map = EXIT_BLOCK_PTR;
-     }
- 
    ENTRY_BLOCK_PTR_FOR_FUNCTION (cfun_to_copy)->aux = entry_block_map;
    EXIT_BLOCK_PTR_FOR_FUNCTION (cfun_to_copy)->aux = exit_block_map;
  
  
    /* Duplicate any exception-handling regions.  */
    if (cfun->eh)
      {
-       if (versioning_or_cloning)
-         init_eh_for_function ();
        id->eh_region_offset = duplicate_eh_regions (cfun_to_copy,
  		     				   remap_decl_1,
  						   id, id->eh_region);
--- 1149,1163 ----
  
    id->callee_cfun = cfun_to_copy;
  
    ENTRY_BLOCK_PTR_FOR_FUNCTION (cfun_to_copy)->aux = entry_block_map;
    EXIT_BLOCK_PTR_FOR_FUNCTION (cfun_to_copy)->aux = exit_block_map;
+   entry_block_map->aux = ENTRY_BLOCK_PTR_FOR_FUNCTION (cfun_to_copy);
+   exit_block_map->aux = EXIT_BLOCK_PTR_FOR_FUNCTION (cfun_to_copy);
  
  
    /* Duplicate any exception-handling regions.  */
    if (cfun->eh)
      {
        id->eh_region_offset = duplicate_eh_regions (cfun_to_copy,
  		     				   remap_decl_1,
  						   id, id->eh_region);
*************** copy_cfg_body (inline_data * id, gcov_ty
*** 960,974 ****
      }
    /* Use aux pointers to map the original blocks to copy.  */
    FOR_EACH_BB_FN (bb, cfun_to_copy)
!     bb->aux = copy_bb (id, bb, frequency_scale, count_scale);
    /* Now that we've duplicated the blocks, duplicate their edges.  */
    FOR_ALL_BB_FN (bb, cfun_to_copy)
!     copy_edges_for_bb (bb, count_scale);
    FOR_ALL_BB_FN (bb, cfun_to_copy)
!     bb->aux = NULL;
! 
!   if (versioning_or_cloning)
!     pop_cfun ();
  
    return new_fndecl;
  }
--- 1165,1185 ----
      }
    /* Use aux pointers to map the original blocks to copy.  */
    FOR_EACH_BB_FN (bb, cfun_to_copy)
!     {
!       basic_block new = copy_bb (id, bb, frequency_scale, count_scale);
!       bb->aux = new;
!       new->aux = bb;
!     }
    /* Now that we've duplicated the blocks, duplicate their edges.  */
    FOR_ALL_BB_FN (bb, cfun_to_copy)
!     copy_edges_for_bb (bb, count_scale, exit_block_map);
!   if (in_ssa_p)
!     FOR_ALL_BB_FN (bb, cfun_to_copy)
!       copy_phis_for_bb (bb, id);
    FOR_ALL_BB_FN (bb, cfun_to_copy)
!     ((basic_block)bb->aux)->aux = bb->aux = NULL;
!   entry_block_map->aux = NULL;
!   exit_block_map->aux = NULL;
  
    return new_fndecl;
  }
*************** setup_one_parameter (inline_data *id, tr
*** 1025,1037 ****
    tree init_stmt;
    tree var;
    tree var_sub;
  
!   /* If the parameter is never assigned to, we may not need to
!      create a new variable here at all.  Instead, we may be able
!      to just use the argument value.  */
    if (TREE_READONLY (p)
        && !TREE_ADDRESSABLE (p)
!       && value && !TREE_SIDE_EFFECTS (value))
      {
        /* We may produce non-gimple trees by adding NOPs or introduce
  	 invalid sharing when operand is not really constant.
--- 1236,1251 ----
    tree init_stmt;
    tree var;
    tree var_sub;
+   tree rhs = value ? fold_convert (TREE_TYPE (p), value) : NULL;
+   tree def = in_ssa_p ? default_def_fn (id->callee_cfun, p) : NULL;
  
!   /* If the parameter is never assigned to and has no SSA_NAMEs created,
!      we may not need to create a new variable here at all.  Instead, we
!      may be able to just use the argument value.  */
    if (TREE_READONLY (p)
        && !TREE_ADDRESSABLE (p)
!       && value && !TREE_SIDE_EFFECTS (value)
!       && !def)
      {
        /* We may produce non-gimple trees by adding NOPs or introduce
  	 invalid sharing when operand is not really constant.
*************** setup_one_parameter (inline_data *id, tr
*** 1055,1060 ****
--- 1269,1279 ----
       here since the type of this decl must be visible to the calling
       function.  */
    var = copy_decl_for_dup (p, fn, id->caller, /*versioning=*/false);
+   if (in_ssa_p && TREE_CODE (var) == VAR_DECL)
+     {
+       get_var_ann (var);
+       add_referenced_tmp_var (var);
+     }
  
    /* See if the frontend wants to pass this by invisible reference.  If
       so, our new VAR_DECL will have REFERENCE_TYPE, and we need to
*************** setup_one_parameter (inline_data *id, tr
*** 1069,1079 ****
    else
      var_sub = var;
  
-   /* Register the VAR_DECL as the equivalent for the PARM_DECL;
-      that way, when the PARM_DECL is encountered, it will be
-      automatically replaced by the VAR_DECL.  */
-   insert_decl_map (id, p, var_sub);
- 
    /* Declare this new variable.  */
    TREE_CHAIN (var) = *vars;
    *vars = var;
--- 1288,1293 ----
*************** setup_one_parameter (inline_data *id, tr
*** 1093,1111 ****
    if (TYPE_NEEDS_CONSTRUCTING (TREE_TYPE (p)))
      TREE_READONLY (var) = 0;
  
    /* Initialize this VAR_DECL from the equivalent argument.  Convert
       the argument to the proper type in case it was promoted.  */
    if (value)
      {
-       tree rhs = fold_convert (TREE_TYPE (var), value);
        block_stmt_iterator bsi = bsi_last (bb);
  
        if (rhs == error_mark_node)
! 	return;
  
        /* We want to use MODIFY_EXPR, not INIT_EXPR here so that we
  	 keep our trees in gimple form.  */
!       init_stmt = build (MODIFY_EXPR, TREE_TYPE (var), var, rhs);
  
        /* If we did not create a gimple value and we did not create a gimple
  	 cast of a gimple value, then we will need to gimplify INIT_STMTS
--- 1307,1358 ----
    if (TYPE_NEEDS_CONSTRUCTING (TREE_TYPE (p)))
      TREE_READONLY (var) = 0;
  
+   /* Register the VAR_DECL as the equivalent for the PARM_DECL;
+      that way, when the PARM_DECL is encountered, it will be
+      automatically replaced by the VAR_DECL.  */
+   insert_decl_map (id, p, var_sub);
+ 
+   /* If there is no setup required and we are in SSA, take the easy route.
+      We need to construct a map for the variable anyway, as it might be used
+      in different SSA names when the parameter is set in the function.
+      FIXME: This usually kills the last connection between the inlined
+      function's parameter and the actual value in debug info.  Can we do
+      better here?  If we just inserted the statement, copy propagation
+      would kill it anyway, as it always did in older versions of GCC... */
+   if (in_ssa_p && rhs && def && is_gimple_reg (p)
+       && (TREE_CODE (rhs) == SSA_NAME
+ 	  /* Replacing &""[0] has interesting side effects.  Exclude ADDR_EXPRs
+ 	     now.  */
+ 	  || (is_gimple_min_invariant (rhs) && TREE_CODE (rhs) != ADDR_EXPR)))
+     {
+       insert_decl_map (id, def, rhs);
+       return;
+     }
+ 
    /* Initialize this VAR_DECL from the equivalent argument.  Convert
       the argument to the proper type in case it was promoted.  */
    if (value)
      {
        block_stmt_iterator bsi = bsi_last (bb);
  
        if (rhs == error_mark_node)
! 	{
!   	  insert_decl_map (id, p, var_sub);
! 	  return;
! 	}
  
        /* We want to use MODIFY_EXPR, not INIT_EXPR here so that we
  	 keep our trees in gimple form.  */
!       if (def && is_gimple_reg (p))
! 	{
! 	  def = remap_ssa_name (def, id);
!           init_stmt = build (MODIFY_EXPR, TREE_TYPE (var), def, rhs);
! 	  SSA_NAME_DEF_STMT (def) = init_stmt;
! 	  /*gcc_assert (IS_EMPTY_STMT (default_def (var)));*/
! 	  set_default_def (var, NULL);
! 	}
!       else
!         init_stmt = build (MODIFY_EXPR, TREE_TYPE (var), var, rhs);
  
        /* If we did not create a gimple value and we did not create a gimple
  	 cast of a gimple value, then we will need to gimplify INIT_STMTS
*************** setup_one_parameter (inline_data *id, tr
*** 1115,1122 ****
        if (!is_gimple_val (rhs)
  	  && (!is_gimple_cast (rhs)
  	      || !is_gimple_val (TREE_OPERAND (rhs, 0))))
! 	gimplify_stmt (&init_stmt);
        bsi_insert_after (&bsi, init_stmt, BSI_NEW_STMT);
      }
  }
  
--- 1362,1384 ----
        if (!is_gimple_val (rhs)
  	  && (!is_gimple_cast (rhs)
  	      || !is_gimple_val (TREE_OPERAND (rhs, 0))))
! 	{
!           tree_stmt_iterator i;
! 
! 	  push_gimplify_context ();
! 	  gimplify_stmt (&init_stmt);
! 	  if (in_ssa_p && TREE_CODE (init_stmt) == STATEMENT_LIST)
! 	    {
! 	      /* The replacement can expose previously unreferenced variables.  */
! 	      for (i = tsi_start (init_stmt); !tsi_end_p (i); tsi_next (&i))
! 		find_new_referenced_vars (tsi_stmt_ptr (i));
! 	     }
! 	  pop_gimplify_context (NULL);
! 	}
        bsi_insert_after (&bsi, init_stmt, BSI_NEW_STMT);
+       if (in_ssa_p)
+ 	for (;!bsi_end_p (bsi); bsi_next (&bsi))
+ 	  mark_new_vars_to_rename (bsi_stmt (bsi));
      }
  }
  
*************** declare_return_variable (inline_data *id
*** 1206,1214 ****
  	 a modify expression.  */
        gcc_assert (!modify_dest);
        if (DECL_BY_REFERENCE (result))
! 	var = return_slot_addr;
        else
! 	var = build_fold_indirect_ref (return_slot_addr);
        use = NULL;
        goto done;
      }
--- 1468,1488 ----
  	 a modify expression.  */
        gcc_assert (!modify_dest);
        if (DECL_BY_REFERENCE (result))
! 	{
! 	  tree base_var = TREE_OPERAND (return_slot_addr, 0);
! 
! 	  /* FIXME: rewriting random variables in SSA form is going
! 	     to cause misoptimizations once we start optimizing.  */
! 	  if (TREE_CODE (base_var) == SSA_NAME)
! 	    base_var = SSA_NAME_VAR (base_var);
! 	  mark_sym_for_renaming (base_var);
! 	  var = return_slot_addr;
! 	}
        else
! 	{
! 	  mark_sym_for_renaming (TREE_OPERAND (return_slot_addr, 0));
! 	  var = build_fold_indirect_ref (return_slot_addr);
! 	}
        use = NULL;
        goto done;
      }
*************** declare_return_variable (inline_data *id
*** 1217,1223 ****
    gcc_assert (!TREE_ADDRESSABLE (callee_type));
  
    /* Attempt to avoid creating a new temporary variable.  */
!   if (modify_dest)
      {
        bool use_it = false;
  
--- 1491,1501 ----
    gcc_assert (!TREE_ADDRESSABLE (callee_type));
  
    /* Attempt to avoid creating a new temporary variable.  */
!   if (modify_dest
!       /* FIXME: the code to handle undefined return values in
!          expand_inline_call would overwrite the computed return
! 	 value.  */
!       && TREE_CODE (modify_dest) != SSA_NAME)
      {
        bool use_it = false;
  
*************** declare_return_variable (inline_data *id
*** 1250,1255 ****
--- 1528,1540 ----
    gcc_assert (TREE_CODE (TYPE_SIZE_UNIT (callee_type)) == INTEGER_CST);
  
    var = copy_decl_for_dup (result, callee, caller, /*versioning=*/false);
+   if (in_ssa_p)
+     {
+       /* TODO: We could probably go directly to an SSA name here without
+          much trouble.  */
+       get_var_ann (var);
+       add_referenced_tmp_var (var);
+     }
  
    DECL_SEEN_IN_BIND_EXPR_P (var) = 1;
    DECL_STRUCT_FUNCTION (caller)->unexpanded_var_list
*************** expand_call_inline (basic_block bb, tree
*** 2029,2038 ****
    /* Initialize the parameters.  */
    args = TREE_OPERAND (t, 1);
  
-   initialize_inlined_parameters (id, args, TREE_OPERAND (t, 2), fn, bb);
- 
    /* Record the function we are about to inline.  */
    id->callee = fn;
  
    if (DECL_INITIAL (fn))
      add_lexical_block (id->block, remap_blocks (DECL_INITIAL (fn), id));
--- 2314,2324 ----
    /* Initialize the parameters.  */
    args = TREE_OPERAND (t, 1);
  
    /* Record the function we are about to inline.  */
    id->callee = fn;
+   id->callee_cfun = DECL_STRUCT_FUNCTION (fn);
+ 
+   initialize_inlined_parameters (id, args, TREE_OPERAND (t, 2), fn, bb);
  
    if (DECL_INITIAL (fn))
      add_lexical_block (id->block, remap_blocks (DECL_INITIAL (fn), id));
*************** expand_call_inline (basic_block bb, tree
*** 2112,2123 ****
    if (use_retvar && (TREE_CODE (bsi_stmt (stmt_bsi)) != CALL_EXPR))
      {
        *tp = use_retvar;
        maybe_clean_or_replace_eh_stmt (stmt, stmt);
      }
    else
      /* We're modifying a TSI owned by gimple_expand_calls_inline();
         tsi_delink() will leave the iterator in a sane state.  */
!     bsi_remove (&stmt_bsi);
  
    bsi_next (&bsi);
    if (bsi_end_p (bsi))
--- 2398,2439 ----
    if (use_retvar && (TREE_CODE (bsi_stmt (stmt_bsi)) != CALL_EXPR))
      {
        *tp = use_retvar;
+       update_stmt (stmt);
+       if (in_ssa_p)
+         mark_new_vars_to_rename (stmt);
        maybe_clean_or_replace_eh_stmt (stmt, stmt);
      }
    else
      /* We're modifying a TSI owned by gimple_expand_calls_inline();
         tsi_delink() will leave the iterator in a sane state.  */
!     {
!       /* Handle the case of inlining a function that is missing a return
!          statement, so the return value becomes undefined.  */
!       if (TREE_CODE (stmt) == MODIFY_EXPR
! 	  && TREE_CODE (TREE_OPERAND (stmt, 0)) == SSA_NAME)
! 	{
! 	  tree name = TREE_OPERAND (stmt, 0);
! 	  tree var = SSA_NAME_VAR (TREE_OPERAND (stmt, 0));
! 	  tree def = default_def (var);
! 
! 	  /* If the variable is already used undefined, make this name
! 	     undefined too via a copy from the default definition.  */
! 	  if (def)
! 	    {
! 	      TREE_OPERAND (stmt, 1) = def;
! 	      update_stmt (stmt);
! 	    }
! 	  /* Otherwise make this variable undefined.  */
! 	  else
! 	    {
! 	      bsi_remove (&stmt_bsi);
! 	      set_default_def (var, name);
! 	      SSA_NAME_DEF_STMT (name) = build_empty_stmt ();
! 	    }
! 	}
!       else
!         bsi_remove (&stmt_bsi);
!     }
  
    bsi_next (&bsi);
    if (bsi_end_p (bsi))
*************** optimize_inline_calls (tree fn, bool ear
*** 2203,2209 ****
        id.caller = current_function_decl;
        prev_fn = current_function_decl;
      }
-   push_gimplify_context ();
  
    /* Reach the trees by walking over the CFG, and note the
       enclosing basic-blocks in the call edges.  */
--- 2519,2524 ----
*************** optimize_inline_calls (tree fn, bool ear
*** 2214,2221 ****
    FOR_EACH_BB (bb)
      gimple_expand_calls_inline (bb, &id);
  
- 
-   pop_gimplify_context (NULL);
    /* Renumber the (code) basic_blocks consecutively.  */
    compact_blocks ();
    /* Renumber the lexical scoping (non-code) blocks consecutively.  */
--- 2529,2534 ----
*************** optimize_inline_calls (tree fn, bool ear
*** 2237,2242 ****
--- 2550,2572 ----
    if (ENTRY_BLOCK_PTR->count)
      counts_to_freqs ();
    fold_cond_expr_cond ();
+   delete_unreachable_blocks ();
+ #ifdef ENABLE_CHECKING
+   verify_stmts ();
+   verify_flow_info ();
+ #endif
+   if (in_ssa_p)
+     {
+       update_ssa (TODO_update_ssa);
+ #ifdef ENABLE_CHECKING
+       verify_ssa (true);
+ #endif
+       fold_cond_expr_cond ();
+       cleanup_tree_cfg ();
+       rebuild_cgraph_edges ();
+     }
+   free_dominance_info (CDI_DOMINATORS);
+   free_dominance_info (CDI_POST_DOMINATORS);
  }
  
  /* FN is a function that has a complete body, and CLONE is a function whose
*************** tree_function_versioning (tree old_decl,
*** 2670,2676 ****
    struct cgraph_node *old_version_node;
    struct cgraph_node *new_version_node;
    inline_data id;
!   tree p, new_fndecl;
    unsigned i;
    struct ipa_replace_map *replace_info;
    basic_block old_entry_block;
--- 3000,3006 ----
    struct cgraph_node *old_version_node;
    struct cgraph_node *new_version_node;
    inline_data id;
!   tree p;
    unsigned i;
    struct ipa_replace_map *replace_info;
    basic_block old_entry_block;
*************** tree_function_versioning (tree old_decl,
*** 2720,2725 ****
--- 3050,3061 ----
    id.callee_cfun = DECL_STRUCT_FUNCTION (old_decl);
    
    current_function_decl = new_decl;
+   old_entry_block = ENTRY_BLOCK_PTR_FOR_FUNCTION
+     (DECL_STRUCT_FUNCTION (old_decl));
+   initialize_cfun (new_decl, old_decl,
+ 		   old_entry_block->count,
+ 		   old_entry_block->frequency);
+   push_cfun (DECL_STRUCT_FUNCTION (new_decl));
    
    /* Copy the function's static chain.  */
    p = DECL_STRUCT_FUNCTION (old_decl)->static_chain_decl;
*************** tree_function_versioning (tree old_decl,
*** 2765,2776 ****
        }
    
    /* Copy the Function's body.  */
!   old_entry_block = ENTRY_BLOCK_PTR_FOR_FUNCTION
!     (DECL_STRUCT_FUNCTION (old_decl));
!   new_fndecl = copy_body (&id,
! 			  old_entry_block->count,
! 			  old_entry_block->frequency, NULL, NULL);
    
    DECL_SAVED_TREE (new_decl) = DECL_SAVED_TREE (new_fndecl);
  
    DECL_STRUCT_FUNCTION (new_decl)->cfg =
--- 3101,3109 ----
        }
    
    /* Copy the Function's body.  */
!   copy_body (&id, old_entry_block->count, old_entry_block->frequency, ENTRY_BLOCK_PTR, EXIT_BLOCK_PTR);
    
+ #if 0
    DECL_SAVED_TREE (new_decl) = DECL_SAVED_TREE (new_fndecl);
  
    DECL_STRUCT_FUNCTION (new_decl)->cfg =
*************** tree_function_versioning (tree old_decl,
*** 2780,2785 ****
--- 3113,3119 ----
      DECL_STRUCT_FUNCTION (new_fndecl)->ib_boundaries_block;
    DECL_STRUCT_FUNCTION (new_decl)->last_label_uid =
      DECL_STRUCT_FUNCTION (new_fndecl)->last_label_uid;
+ #endif
  
    if (DECL_RESULT (old_decl) != NULL_TREE)
      {
*************** tree_function_versioning (tree old_decl,
*** 2788,2800 ****
        lang_hooks.dup_lang_specific_decl (DECL_RESULT (new_decl));
      }
    
-   current_function_decl = NULL;
    /* Renumber the lexical scoping (non-code) blocks consecutively.  */
    number_blocks (new_decl);
  
    /* Clean up.  */
    splay_tree_delete (id.decl_map);
    fold_cond_expr_cond ();
    return;
  }
  
--- 3122,3144 ----
        lang_hooks.dup_lang_specific_decl (DECL_RESULT (new_decl));
      }
    
    /* Renumber the lexical scoping (non-code) blocks consecutively.  */
    number_blocks (new_decl);
  
    /* Clean up.  */
    splay_tree_delete (id.decl_map);
    fold_cond_expr_cond ();
+   if (in_ssa_p)
+     {
+       update_ssa (TODO_update_ssa);
+ #ifdef ENABLE_CHECKING
+       verify_ssa (true);
+ #endif
+     }
+   free_dominance_info (CDI_DOMINATORS);
+   free_dominance_info (CDI_POST_DOMINATORS);
+   pop_cfun ();
+   current_function_decl = NULL;
    return;
  }
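The parameter setup above takes an "easy route" when the actual argument is already an SSA name or a gimple invariant: instead of emitting an initialization statement, it records a parameter-to-value mapping via insert_decl_map and lets body copying substitute uses directly. A minimal standalone sketch of that map-based substitution (the struct and function names here are illustrative, not GCC's actual data structures):

```c
#include <string.h>

struct map_entry { const char *from; int to; };
struct decl_map { struct map_entry e[16]; int n; };

/* Record that FROM should be replaced by TO during body copying.  */
static void
insert_decl_map (struct decl_map *m, const char *from, int to)
{
  m->e[m->n].from = from;
  m->e[m->n].to = to;
  m->n++;
}

/* Look NAME up; fall back to FALLBACK when no mapping was recorded,
   mirroring how unmapped trees are copied unchanged.  */
static int
remap (struct decl_map *m, const char *name, int fallback)
{
  int i;
  for (i = 0; i < m->n; i++)
    if (!strcmp (m->e[i].from, name))
      return m->e[i].to;
  return fallback;
}
```

The payoff, as the FIXME in the patch notes, is that no copy statement is ever emitted, at the cost of the debug-info connection between the parameter and its value.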
  
Index: tree-optimize.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-optimize.c,v
retrieving revision 2.121.2.3
diff -c -3 -p -r2.121.2.3 tree-optimize.c
*** tree-optimize.c	18 Aug 2005 14:43:56 -0000	2.121.2.3
--- tree-optimize.c	19 Aug 2005 16:13:21 -0000
*************** static bool
*** 57,64 ****
  gate_all_optimizations (void)
  {
    return (optimize >= 1
! 	  /* Don't bother doing anything if the program has errors.  */
! 	  && !(errorcount || sorrycount));
  }
  
  struct tree_opt_pass pass_all_optimizations =
--- 57,65 ----
  gate_all_optimizations (void)
  {
    return (optimize >= 1
! 	  /* Don't bother doing anything if the program has errors.
! 	     We still have to run the pass queue if we already went into SSA form.  */
! 	  && (!(errorcount || sorrycount) || in_ssa_p));
  }
  
  struct tree_opt_pass pass_all_optimizations =
*************** struct tree_opt_pass pass_all_optimizati
*** 78,87 ****
    0					/* letter */
  };
  
  struct tree_opt_pass pass_early_local_passes =
  {
    NULL,					/* name */
!   gate_all_optimizations,		/* gate */
    NULL,					/* execute */
    NULL,					/* sub */
    NULL,					/* next */
--- 79,129 ----
    0					/* letter */
  };
  
+ /* Gate: execute, or not, the early local passes.  */
+ 
+ static bool
+ gate_all_early_local_passes (void)
+ {
+   return (optimize >= 1
+ 	  /* Don't bother doing anything if the program has errors.  */
+ 	  && !(errorcount || sorrycount));
+ }
+ 
  struct tree_opt_pass pass_early_local_passes =
  {
    NULL,					/* name */
!   gate_all_early_local_passes,		/* gate */
!   NULL,					/* execute */
!   NULL,					/* sub */
!   NULL,					/* next */
!   0,					/* static_pass_number */
!   0,					/* tv_id */
!   0,					/* properties_required */
!   0,					/* properties_provided */
!   0,					/* properties_destroyed */
!   0,					/* todo_flags_start */
!   0,					/* todo_flags_finish */
!   0					/* letter */
! };
! /* Gate: execute, or not, the early optimizations.  */
! 
! static bool
! gate_all_early_optimizations (void)
! {
!   return (optimize >= 1
!           /* Something of a hack: in non-unit-at-a-time mode we need to run
! 	     the early optimizations anyway.  The pass is run once in the
! 	     IPA queue and once in the late local passes; in unit-at-a-time
! 	     mode the second invocation sees cgraph_global_info_ready.  */
!           && !cgraph_global_info_ready
! 	  /* Don't bother doing anything if the program has errors.  */
! 	  && !(errorcount || sorrycount));
! }
! 
! struct tree_opt_pass pass_all_early_optimizations =
! {
!   NULL,					/* name */
!   gate_all_early_optimizations,		/* gate */
    NULL,					/* execute */
    NULL,					/* sub */
    NULL,					/* next */
*************** tree_lowering_passes (tree fn)
*** 311,318 ****
    tree_register_cfg_hooks ();
    bitmap_obstack_initialize (NULL);
    execute_pass_list (all_lowering_passes);
    free_dominance_info (CDI_POST_DOMINATORS);
!   compact_blocks ();
    current_function_decl = saved_current_function_decl;
    bitmap_obstack_release (NULL);
    pop_cfun ();
--- 353,362 ----
    tree_register_cfg_hooks ();
    bitmap_obstack_initialize (NULL);
    execute_pass_list (all_lowering_passes);
+   if (cgraph_global_info_ready && optimize)
+     execute_pass_list (pass_all_early_optimizations.sub);
    free_dominance_info (CDI_POST_DOMINATORS);
!   free_dominance_info (CDI_DOMINATORS);
    current_function_decl = saved_current_function_decl;
    bitmap_obstack_release (NULL);
    pop_cfun ();
*************** tree_rest_of_compilation (tree fndecl)
*** 361,368 ****
       Kill it so it won't confuse us.  */
    cgraph_node_remove_callees (node);
  
- 
-   /* Initialize the default bitmap obstack.  */
    bitmap_obstack_initialize (NULL);
    bitmap_obstack_initialize (&reg_obstack); /* FIXME, only at RTL generation*/
    
--- 405,410 ----
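The gate split in tree-optimize.c above can be sketched as follows. This is a hypothetical miniature, not the real pass manager: the late queue keeps draining once a function is in SSA form (it has to be taken back out of SSA even on errors), while the early local passes still stop on errors. The globals mirror GCC's names but are standalone here, and sorrycount is omitted for brevity:

```c
#include <stdbool.h>

static int optimize = 1;
static int errorcount = 0;
static bool in_ssa_p = false;

/* Late queue: must keep running once we are in SSA form.  */
static bool
gate_all_optimizations (void)
{
  return optimize >= 1 && (!errorcount || in_ssa_p);
}

/* Early local passes: plain error check, as before the patch.  */
static bool
gate_all_early_local_passes (void)
{
  return optimize >= 1 && !errorcount;
}
```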
Index: tree-pass.h
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-pass.h,v
retrieving revision 2.54.2.1
diff -c -3 -p -r2.54.2.1 tree-pass.h
*** tree-pass.h	9 Aug 2005 11:25:36 -0000	2.54.2.1
--- tree-pass.h	19 Aug 2005 16:13:21 -0000
*************** extern struct tree_opt_pass pass_ipa_typ
*** 294,299 ****
--- 294,300 ----
  extern struct tree_opt_pass pass_early_local_passes;
  
  extern struct tree_opt_pass pass_all_optimizations;
+ extern struct tree_opt_pass pass_all_early_optimizations;
  extern struct tree_opt_pass pass_cleanup_cfg_post_optimizing;
  extern struct tree_opt_pass pass_free_cfg_annotations;
  extern struct tree_opt_pass pass_free_datastructures;
Index: tree-ssa-ccp.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-ssa-ccp.c,v
retrieving revision 2.85
diff -c -3 -p -r2.85 tree-ssa-ccp.c
*** tree-ssa-ccp.c	28 Jul 2005 16:29:54 -0000	2.85
--- tree-ssa-ccp.c	19 Aug 2005 16:13:22 -0000
*************** get_maxval_strlen (tree arg, tree *lengt
*** 2018,2023 ****
--- 2018,2025 ----
  
    var = arg;
    def_stmt = SSA_NAME_DEF_STMT (var);
+   if (!def_stmt)
+     return false;
  
    switch (TREE_CODE (def_stmt))
      {
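The tree-ssa-ccp.c hunk above adds a defensive check: while SSA form is being updated during inlining, an SSA name may temporarily lack a defining statement, so get_maxval_strlen now bails out instead of dispatching on a null pointer. A small sketch of the pattern, with illustrative struct names rather than GCC's:

```c
#include <stddef.h>

struct stmt { int code; };
struct ssa_name { struct stmt *def_stmt; };

/* Return the definition's code, or -1 when the name has no defining
   statement yet -- the case the patch guards against.  */
static int
def_code (const struct ssa_name *name)
{
  const struct stmt *def = name->def_stmt;
  if (!def)
    return -1;
  return def->code;
}
```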


