This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.



[patch 5/6] scalar-storage-order merge: rest


This patch contains the rest of the implementation: it adjusts callers of get_inner_reference and get_ref_base_and_extent for their new reverse-storage-order out-parameter, propagates the REF_REVERSE_STORAGE_ORDER and TYPE_REVERSE_STORAGE_ORDER flags through the IPA passes and streamers, and makes passes that cannot handle a reversed storage order bail out.

	* asan.c (instrument_derefs): Adjust call to get_inner_reference.
	* builtins.c (get_object_alignment_2): Likewise.
	* cfgexpand.c (expand_debug_expr): Adjust call to get_inner_reference
	and get_ref_base_and_extent.
	* dbxout.c (dbxout_expand_expr): Likewise.
	* dwarf2out.c (add_var_loc_to_decl): Likewise.
	(loc_list_for_address_of_addr_expr_of_indirect_ref): Likewise.
	(loc_list_from_tree): Likewise.
	(fortran_common): Likewise.
	* gimple-fold.c (gimple_fold_builtin_memory_op): Adjust calls to
	get_ref_base_and_extent.
	(get_base_constructor): Likewise.
	(fold_const_aggregate_ref_1): Likewise.
	* gimple-ssa-strength-reduction.c (slsr_process_ref): Adjust call to
	get_inner_reference and bail out on reverse storage order.
	* ifcvt.c (noce_emit_move_insn): Adjust calls to store_bit_field.
	* ipa-cp.c (ipa_get_jf_ancestor_result): Adjust call to
	build_ref_for_offset.
	* ipa-polymorphic-call.c (set_by_invariant): Adjust call to
	get_ref_base_and_extent.
	(ipa_polymorphic_call_context): Likewise.
	(extr_type_from_vtbl_ptr_store): Likewise.
	(check_stmt_for_type_change): Likewise.
	(get_dynamic_type): Likewise.
	* ipa-prop.c (ipa_load_from_parm_agg_1): Adjust call to
	get_ref_base_and_extent.
	(compute_complex_assign_jump_func): Likewise.
	(get_ancestor_addr_info): Likewise.
	(compute_known_type_jump_func): Likewise.
	(determine_known_aggregate_parts): Likewise.
	(ipa_get_adjustment_candidate): Likewise.
	(ipa_modify_call_arguments): Set REF_REVERSE_STORAGE_ORDER on MEM_REF.
	* ipa-prop.h (ipa_parm_adjustment): Add REVERSE field.
	(build_ref_for_offset): Adjust prototype.
	* lto-streamer-out.c (hash_tree): Deal with TYPE_REVERSE_STORAGE_ORDER.
	* simplify-rtx.c (delegitimize_mem_from_attrs): Adjust call to
	get_inner_reference.
	* tree-affine.c (tree_to_aff_combination): Adjust call to
	get_inner_reference.
	(get_inner_reference_aff): Likewise.
	* tree-data-ref.c (split_constant_offset_1): Likewise.
	(dr_analyze_innermost): Likewise.  Bail out if reverse storage order.
	* tree-scalar-evolution.c (interpret_rhs_expr): Adjust call to
	get_inner_reference.
	* tree-sra.c (struct access): Add REVERSE and move WRITE around.
	(dump_access): Print new fields.
	(create_access): Adjust call to get_ref_base_and_extent and set the
	REVERSE flag according to the result.
	(completely_scalarize_record): Set the REVERSE flag.
	(build_access_from_expr_1): Preserve storage order barriers.
	(build_accesses_from_assign): Likewise.
	(build_ref_for_offset): Add REVERSE parameter and set the
	REF_REVERSE_STORAGE_ORDER flag accordingly.
	(build_ref_for_model): Adjust call to build_ref_for_offset and clear
	the REF_REVERSE_STORAGE_ORDER flag if there are components.
	(analyze_access_subtree): Likewise.
	(get_access_for_expr): Adjust call to get_ref_base_and_extent.
	(turn_representatives_into_adjustments): Propagate REVERSE flag.
	(ipa_sra_check_caller): Adjust call to get_inner_reference.
	* tree-ssa-alias.c (ao_ref_base): Adjust call to get_ref_base_and_extent.
	(aliasing_component_refs_p): Likewise.
	(stmt_kills_ref_p_1): Likewise.
	* tree-ssa-dce.c (mark_aliased_reaching_defs_necessary_1): Likewise.
	* tree-ssa-loop-ivopts.c (may_be_nonaddressable_p) <MEM_REF>: New case.
	Return true if reverse storage order.
	<BIT_FIELD_REF>: Likewise.
	<COMPONENT_REF>: Likewise.
	<ARRAY_REF>: Likewise.
	<ARRAY_RANGE_REF>: Likewise.
	(split_address_cost): Likewise.  Bail out if reverse storage order.
	* tree-ssa-math-opts.c (find_bswap_or_nop_load): Adjust call to
	get_inner_reference.  Bail out if reverse storage order.
	(bswap_replace): Adjust call to get_inner_reference.
	* tree-ssa-pre.c (create_component_ref_by_pieces_1) <MEM_REF>: Set the
	REF_REVERSE_STORAGE_ORDER flag.
	<BIT_FIELD_REF>: Likewise.
	* tree-ssa-sccvn.c (vn_reference_eq): Return false on storage order
	barriers.
	(copy_reference_ops_from_ref) <MEM_REF>: Set REVERSE field according to
	the REF_REVERSE_STORAGE_ORDER flag.
	<BIT_FIELD_REF>: Likewise.
	<VIEW_CONVERT_EXPR>: Set it for storage order barriers.
	(contains_storage_order_barrier_p): New predicate.
	(vn_reference_lookup_3): Adjust calls to get_ref_base_and_extent.  Punt
	on storage order barriers if necessary.
	* tree-ssa-sccvn.h (struct vn_reference_op_struct): Add REVERSE field.
	* tree-ssa-structalias.c (get_constraint_for_component_ref): Adjust
	call to get_ref_base_and_extent.
	(do_structure_copy): Likewise.
	* tree-streamer-in.c (unpack_ts_base_value_fields): Deal with
	TYPE_REVERSE_STORAGE_ORDER and REF_REVERSE_STORAGE_ORDER.
	* tree-streamer-out.c (pack_ts_base_value_fields): Likewise.
	* tree-vect-data-refs.c (vect_check_gather): Adjust call to
	get_inner_reference.
	(vect_analyze_data_refs): Likewise.  Bail out if reverse storage order.
	* tsan.c (instrument_expr): Adjust call to get_inner_reference.
	* ubsan.c (instrument_bool_enum_load): Likewise.
	(instrument_object_size): Likewise.
	* var-tracking.c (track_expr_p): Adjust call to get_ref_base_and_extent.
	* config/arm/arm.c (arm_assemble_integer): Adjust call to assemble_real.
	* config/arm/arm.md (consttable_4): Likewise.
	(consttable_8): Likewise.
	(consttable_16): Likewise.
	* config/mips/mips.md (consttable_float): Likewise.
	* config/s390/s390.c (s390_output_pool_entry): Likewise.
	* config/sh/sh.md (consttable_sf): Likewise.
	(consttable_df): Likewise.
lto/
	* lto.c (compare_tree_sccs_1): Deal with TYPE_REVERSE_STORAGE_ORDER.

 asan.c                          |    6 +--
 builtins.c                      |    6 +--
 cfgexpand.c                     |   12 ++++---
 dbxout.c                        |    6 +--
 dwarf2out.c                     |   21 ++++++-------
 gimple-fold.c                   |   13 +++++---
 gimple-ssa-strength-reduction.c |    6 ++-
 ifcvt.c                         |    4 +-
 ipa-cp.c                        |    2 -
 ipa-polymorphic-call.c          |   23 +++++++++-----
 ipa-prop.c                      |   25 +++++++++++----
 ipa-prop.h                      |    6 +++
 lto-streamer-out.c              |    3 +
 simplify-rtx.c                  |    7 ++--
 tree-affine.c                   |   10 +++---
 tree-data-ref.c                 |   20 ++++++++----
 tree-scalar-evolution.c         |    7 ++--
 tree-sra.c                      |   64 +++++++++++++++++++++++---------------
 tree-ssa-alias.c                |   18 +++++++----
 tree-ssa-dce.c                  |    4 +-
 tree-ssa-loop-ivopts.c          |   27 ++++++++++++----
 tree-ssa-math-opts.c            |   10 +++---
 tree-ssa-pre.c                  |    8 +++--
 tree-ssa-sccvn.c                |   55 ++++++++++++++++++++++++++++++----
 tree-ssa-sccvn.h                |    1 
 tree-ssa-structalias.c          |   10 ++++--
 tree-streamer-in.c              |    7 +++-
 tree-streamer-out.c             |    7 +++-
 tree-vect-data-refs.c           |   21 +++++++++----
 tsan.c                          |    6 +--
 ubsan.c                         |    8 ++---
 var-tracking.c                  |    3 +
 config/arm/arm.c                |    2 -
 config/arm/arm.md               |    6 +--
 config/mips/mips.md             |    2 -
 config/s390/s390.c              |    2 -
 config/sh/sh.md                 |    4 +-
 lto/lto.c                       |    5 ++-

 38 files changed, 302 insertions(+), 145 deletions(-)
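Most of the mechanical churn above comes from the widened walker signatures. As a hedged sketch (GCC-internal API, not standalone code), the common caller pattern after this patch looks like the following; passes that cannot handle a reversed storage order simply punt, as gimple-ssa-strength-reduction.c and tree-ssa-math-opts.c do in the hunks below:

```c
/* Sketch of the adjusted caller pattern (GCC internals, not standalone).
   get_inner_reference grows a reversep out-parameter, and
   get_ref_base_and_extent grows a matching reverse one.  */
machine_mode mode;
HOST_WIDE_INT bitsize, bitpos;
tree offset;
int unsignedp, reversep, volatilep = 0;

tree base = get_inner_reference (exp, &bitsize, &bitpos, &offset, &mode,
				 &unsignedp, &reversep, &volatilep, false);
if (reversep)
  return;  /* Bail out: this pass does not handle reverse storage order.  */
```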

-- 
Eric Botcazou
Index: gcc/ipa-polymorphic-call.c
===================================================================
--- gcc/ipa-polymorphic-call.c	(.../trunk)	(revision 224461)
+++ gcc/ipa-polymorphic-call.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -774,6 +774,7 @@ ipa_polymorphic_call_context::set_by_inv
 						HOST_WIDE_INT off)
 {
   HOST_WIDE_INT offset2, size, max_size;
+  bool reverse;
   tree base;
 
   invalid = false;
@@ -784,7 +785,7 @@ ipa_polymorphic_call_context::set_by_inv
     return false;
 
   cst = TREE_OPERAND (cst, 0);
-  base = get_ref_base_and_extent (cst, &offset2, &size, &max_size);
+  base = get_ref_base_and_extent (cst, &offset2, &size, &max_size, &reverse);
   if (!DECL_P (base) || max_size == -1 || max_size != size)
     return false;
 
@@ -914,8 +915,10 @@ ipa_polymorphic_call_context::ipa_polymo
 	{
 	  HOST_WIDE_INT size, max_size;
 	  HOST_WIDE_INT offset2;
-	  tree base = get_ref_base_and_extent (TREE_OPERAND (base_pointer, 0),
-					       &offset2, &size, &max_size);
+	  bool reverse;
+	  tree base
+	    = get_ref_base_and_extent (TREE_OPERAND (base_pointer, 0),
+				       &offset2, &size, &max_size, &reverse);
 
 	  if (max_size != -1 && max_size == size)
 	    combine_speculation_with (TYPE_MAIN_VARIANT (TREE_TYPE (base)),
@@ -1183,6 +1186,7 @@ extr_type_from_vtbl_ptr_store (gimple st
 {
   HOST_WIDE_INT offset, size, max_size;
   tree lhs, rhs, base;
+  bool reverse;
 
   if (!gimple_assign_single_p (stmt))
     return NULL_TREE;
@@ -1201,7 +1205,7 @@ extr_type_from_vtbl_ptr_store (gimple st
     ;
   else
     {
-      base = get_ref_base_and_extent (lhs, &offset, &size, &max_size);
+      base = get_ref_base_and_extent (lhs, &offset, &size, &max_size, &reverse);
       if (DECL_P (tci->instance))
 	{
 	  if (base != tci->instance)
@@ -1390,6 +1394,7 @@ check_stmt_for_type_change (ao_ref *ao A
 	tree op = walk_ssa_copies (gimple_call_arg (stmt, 0));
 	tree type = TYPE_METHOD_BASETYPE (TREE_TYPE (fn));
 	HOST_WIDE_INT offset = 0, size, max_size;
+	bool reverse;
 
 	if (dump_file)
 	  {
@@ -1400,8 +1405,8 @@ check_stmt_for_type_change (ao_ref *ao A
 	/* See if THIS parameter seems like instance pointer.  */
 	if (TREE_CODE (op) == ADDR_EXPR)
 	  {
-	    op = get_ref_base_and_extent (TREE_OPERAND (op, 0),
-					  &offset, &size, &max_size);
+	    op = get_ref_base_and_extent (TREE_OPERAND (op, 0), &offset,
+					  &size, &max_size, &reverse);
 	    if (size != max_size || max_size == -1)
 	      {
                 tci->speculative = true;
@@ -1544,6 +1549,7 @@ ipa_polymorphic_call_context::get_dynami
     {
       tree ref = gimple_call_fn (call);
       HOST_WIDE_INT offset2, size, max_size;
+      bool reverse;
 
       if (TREE_CODE (ref) == OBJ_TYPE_REF)
 	{
@@ -1573,8 +1579,9 @@ ipa_polymorphic_call_context::get_dynami
 		  && gimple_assign_load_p (SSA_NAME_DEF_STMT (ref)))
 		{
 		  tree ref_exp = gimple_assign_rhs1 (SSA_NAME_DEF_STMT (ref));
-		  tree base_ref = get_ref_base_and_extent
-				   (ref_exp, &offset2, &size, &max_size);
+		  tree base_ref
+		    = get_ref_base_and_extent (ref_exp, &offset2, &size,
+					       &max_size, &reverse);
 
 		  /* Finally verify that what we found looks like read from OTR_OBJECT
 		     or from INSTANCE with offset OFFSET.  */
Index: gcc/ipa-cp.c
===================================================================
--- gcc/ipa-cp.c	(.../trunk)	(revision 224461)
+++ gcc/ipa-cp.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -948,7 +948,7 @@ ipa_get_jf_ancestor_result (struct ipa_j
     {
       tree t = TREE_OPERAND (input, 0);
       t = build_ref_for_offset (EXPR_LOCATION (t), t,
-				ipa_get_jf_ancestor_offset (jfunc),
+				ipa_get_jf_ancestor_offset (jfunc), false,
 				ptr_type_node, NULL, false);
       return build_fold_addr_expr (t);
     }
Index: gcc/tree-scalar-evolution.c
===================================================================
--- gcc/tree-scalar-evolution.c	(.../trunk)	(revision 224461)
+++ gcc/tree-scalar-evolution.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -1742,15 +1742,16 @@ interpret_rhs_expr (struct loop *loop, g
         {
 	  machine_mode mode;
 	  HOST_WIDE_INT bitsize, bitpos;
-	  int unsignedp;
+	  int unsignedp, reversep;
 	  int volatilep = 0;
 	  tree base, offset;
 	  tree chrec3;
 	  tree unitpos;
 
 	  base = get_inner_reference (TREE_OPERAND (rhs1, 0),
-				      &bitsize, &bitpos, &offset,
-				      &mode, &unsignedp, &volatilep, false);
+				      &bitsize, &bitpos, &offset, &mode,
+				      &unsignedp, &reversep, &volatilep,
+				      false);
 
 	  if (TREE_CODE (base) == MEM_REF)
 	    {
Index: gcc/builtins.c
===================================================================
--- gcc/builtins.c	(.../trunk)	(revision 224461)
+++ gcc/builtins.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -299,14 +299,14 @@ get_object_alignment_2 (tree exp, unsign
   HOST_WIDE_INT bitsize, bitpos;
   tree offset;
   machine_mode mode;
-  int unsignedp, volatilep;
+  int unsignedp, reversep, volatilep;
   unsigned int align = BITS_PER_UNIT;
   bool known_alignment = false;
 
   /* Get the innermost object and the constant (bitpos) and possibly
      variable (offset) offset of the access.  */
-  exp = get_inner_reference (exp, &bitsize, &bitpos, &offset,
-			     &mode, &unsignedp, &volatilep, true);
+  exp = get_inner_reference (exp, &bitsize, &bitpos, &offset, &mode,
+			     &unsignedp, &reversep, &volatilep, true);
 
   /* Extract alignment information from the innermost object and
      possibly adjust bitpos and offset.  */
Index: gcc/tree-ssa-sccvn.c
===================================================================
--- gcc/tree-ssa-sccvn.c	(.../trunk)	(revision 224461)
+++ gcc/tree-ssa-sccvn.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -732,6 +732,9 @@ vn_reference_eq (const_vn_reference_t co
 	{
 	  if (vro1->opcode == MEM_REF)
 	    deref1 = true;
+	  /* Do not look through a storage order barrier.  */
+	  else if (vro1->opcode == VIEW_CONVERT_EXPR && vro1->reverse)
+	    return false;
 	  if (vro1->off == -1)
 	    break;
 	  off1 += vro1->off;
@@ -740,6 +743,9 @@ vn_reference_eq (const_vn_reference_t co
 	{
 	  if (vro2->opcode == MEM_REF)
 	    deref2 = true;
+	  /* Do not look through a storage order barrier.  */
+	  else if (vro2->opcode == VIEW_CONVERT_EXPR && vro2->reverse)
+	    return false;
 	  if (vro2->off == -1)
 	    break;
 	  off2 += vro2->off;
@@ -839,11 +845,13 @@ copy_reference_ops_from_ref (tree ref, v
 	  temp.op0 = TREE_OPERAND (ref, 1);
 	  if (tree_fits_shwi_p (TREE_OPERAND (ref, 1)))
 	    temp.off = tree_to_shwi (TREE_OPERAND (ref, 1));
+	  temp.reverse = REF_REVERSE_STORAGE_ORDER (ref);
 	  break;
 	case BIT_FIELD_REF:
-	  /* Record bits and position.  */
+	  /* Record bits, position and storage order.  */
 	  temp.op0 = TREE_OPERAND (ref, 1);
 	  temp.op1 = TREE_OPERAND (ref, 2);
+	  temp.reverse = REF_REVERSE_STORAGE_ORDER (ref);
 	  break;
 	case COMPONENT_REF:
 	  /* The field decl is enough to unambiguously specify the field,
@@ -940,8 +948,11 @@ copy_reference_ops_from_ref (tree ref, v
 	     operand), so we don't have to put anything
 	     for op* as it will be handled by the iteration  */
 	case REALPART_EXPR:
+	  temp.off = 0;
+	  break;
 	case VIEW_CONVERT_EXPR:
 	  temp.off = 0;
+	  temp.reverse = storage_order_barrier_p (ref);
 	  break;
 	case IMAGPART_EXPR:
 	  /* This is only interesting for its constant offset.  */
@@ -1426,6 +1437,21 @@ fully_constant_vn_reference_p (vn_refere
   return NULL_TREE;
 }
 
+/* Return true if OPS contain a storage order barrier.  */
+
+static bool
+contains_storage_order_barrier_p (vec<vn_reference_op_s> ops)
+{
+  vn_reference_op_t op;
+  unsigned i;
+
+  FOR_EACH_VEC_ELT (ops, i, op)
+    if (op->opcode == VIEW_CONVERT_EXPR && op->reverse)
+      return true;
+
+  return false;
+}
+
 /* Transform any SSA_NAME's in a vector of vn_reference_op_s
    structures into their value numbers.  This is done in-place, and
    the vector passed in is returned.  *VALUEIZED_ANYTHING will specify
@@ -1738,7 +1764,9 @@ vn_reference_lookup_3 (ao_ref *ref, tree
       tree ref2 = TREE_OPERAND (gimple_call_arg (def_stmt, 0), 0);
       tree base2;
       HOST_WIDE_INT offset2, size2, maxsize2;
-      base2 = get_ref_base_and_extent (ref2, &offset2, &size2, &maxsize2);
+      bool reverse;
+      base2 = get_ref_base_and_extent (ref2, &offset2, &size2, &maxsize2,
+				       &reverse);
       size2 = tree_to_uhwi (gimple_call_arg (def_stmt, 2)) * 8;
       if ((unsigned HOST_WIDE_INT)size2 / 8
 	  == tree_to_uhwi (gimple_call_arg (def_stmt, 2))
@@ -1761,8 +1789,9 @@ vn_reference_lookup_3 (ao_ref *ref, tree
     {
       tree base2;
       HOST_WIDE_INT offset2, size2, maxsize2;
+      bool reverse;
       base2 = get_ref_base_and_extent (gimple_assign_lhs (def_stmt),
-				       &offset2, &size2, &maxsize2);
+				       &offset2, &size2, &maxsize2, &reverse);
       if (maxsize2 != -1
 	  && operand_equal_p (base, base2, 0)
 	  && offset2 <= offset
@@ -1782,13 +1811,15 @@ vn_reference_lookup_3 (ao_ref *ref, tree
 	   && maxsize % BITS_PER_UNIT == 0
 	   && offset % BITS_PER_UNIT == 0
 	   && is_gimple_reg_type (vr->type)
+	   && !contains_storage_order_barrier_p (vr->operands)
 	   && gimple_assign_single_p (def_stmt)
 	   && is_gimple_min_invariant (gimple_assign_rhs1 (def_stmt)))
     {
       tree base2;
       HOST_WIDE_INT offset2, size2, maxsize2;
+      bool reverse;
       base2 = get_ref_base_and_extent (gimple_assign_lhs (def_stmt),
-				       &offset2, &size2, &maxsize2);
+				       &offset2, &size2, &maxsize2, &reverse);
       if (maxsize2 != -1
 	  && maxsize2 == size2
 	  && size2 % BITS_PER_UNIT == 0
@@ -1821,6 +1852,7 @@ vn_reference_lookup_3 (ao_ref *ref, tree
      to access pieces from.  */
   else if (ref->size == maxsize
 	   && is_gimple_reg_type (vr->type)
+	   && !contains_storage_order_barrier_p (vr->operands)
 	   && gimple_assign_single_p (def_stmt)
 	   && TREE_CODE (gimple_assign_rhs1 (def_stmt)) == SSA_NAME)
     {
@@ -1833,8 +1865,10 @@ vn_reference_lookup_3 (ao_ref *ref, tree
 	{
 	  tree base2;
 	  HOST_WIDE_INT offset2, size2, maxsize2, off;
+	  bool reverse;
 	  base2 = get_ref_base_and_extent (gimple_assign_lhs (def_stmt),
-					   &offset2, &size2, &maxsize2);
+					   &offset2, &size2, &maxsize2,
+					   &reverse);
 	  off = offset - offset2;
 	  if (maxsize2 != -1
 	      && maxsize2 == size2
@@ -1885,7 +1919,7 @@ vn_reference_lookup_3 (ao_ref *ref, tree
     {
       tree base2;
       HOST_WIDE_INT offset2, size2, maxsize2;
-      int i, j;
+      int i, j, k;
       auto_vec<vn_reference_op_s> rhs;
       vn_reference_op_t vro;
       ao_ref r;
@@ -1948,6 +1982,14 @@ vn_reference_lookup_3 (ao_ref *ref, tree
       if (j != -1)
 	return (void *)-1;
 
+      /* Punt if the additional ops contain a storage order barrier.  */
+      for (k = i; k >= 0; k--)
+	{
+	  vro = &vr->operands[k];
+	  if (vro->opcode == VIEW_CONVERT_EXPR && vro->reverse)
+	    return (void *)-1;
+	}
+
       /* Now re-write REF to be based on the rhs of the assignment.  */
       copy_reference_ops_from_ref (gimple_assign_rhs1 (def_stmt), &rhs);
 
@@ -2022,7 +2064,6 @@ vn_reference_lookup_3 (ao_ref *ref, tree
       vn_reference_op_s op;
       HOST_WIDE_INT at;
 
-
       /* Only handle non-variable, addressable refs.  */
       if (ref->size != maxsize
 	  || offset % BITS_PER_UNIT != 0
Index: gcc/tree-ssa-sccvn.h
===================================================================
--- gcc/tree-ssa-sccvn.h	(.../trunk)	(revision 224461)
+++ gcc/tree-ssa-sccvn.h	(.../branches/scalar-storage-order)	(revision 224467)
@@ -89,6 +89,7 @@ typedef struct vn_reference_op_struct
   tree op0;
   tree op1;
   tree op2;
+  bool reverse;
 } vn_reference_op_s;
 typedef vn_reference_op_s *vn_reference_op_t;
 typedef const vn_reference_op_s *const_vn_reference_op_t;
Index: gcc/dbxout.c
===================================================================
--- gcc/dbxout.c	(.../trunk)	(revision 224461)
+++ gcc/dbxout.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -2495,11 +2495,11 @@ dbxout_expand_expr (tree expr)
 	machine_mode mode;
 	HOST_WIDE_INT bitsize, bitpos;
 	tree offset, tem;
-	int volatilep = 0, unsignedp = 0;
+	int unsignedp, reversep, volatilep = 0;
 	rtx x;
 
-	tem = get_inner_reference (expr, &bitsize, &bitpos, &offset,
-				   &mode, &unsignedp, &volatilep, true);
+	tem = get_inner_reference (expr, &bitsize, &bitpos, &offset, &mode,
+				   &unsignedp, &reversep, &volatilep, true);
 
 	x = dbxout_expand_expr (tem);
 	if (x == NULL || !MEM_P (x))
Index: gcc/tree-ssa-loop-ivopts.c
===================================================================
--- gcc/tree-ssa-loop-ivopts.c	(.../trunk)	(revision 224461)
+++ gcc/tree-ssa-loop-ivopts.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -1858,10 +1858,27 @@ may_be_nonaddressable_p (tree expr)
 	 target, thus they are always addressable.  */
       return false;
 
+    case MEM_REF:
+      /* Likewise for MEM_REFs, modulo the storage order.  */
+      return REF_REVERSE_STORAGE_ORDER (expr);
+
+    case BIT_FIELD_REF:
+      if (REF_REVERSE_STORAGE_ORDER (expr))
+	return true;
+      return may_be_nonaddressable_p (TREE_OPERAND (expr, 0));
+
     case COMPONENT_REF:
+      if (TYPE_REVERSE_STORAGE_ORDER (TREE_TYPE (TREE_OPERAND (expr, 0))))
+	return true;
       return DECL_NONADDRESSABLE_P (TREE_OPERAND (expr, 1))
 	     || may_be_nonaddressable_p (TREE_OPERAND (expr, 0));
 
+    case ARRAY_REF:
+    case ARRAY_RANGE_REF:
+      if (TYPE_REVERSE_STORAGE_ORDER (TREE_TYPE (TREE_OPERAND (expr, 0))))
+	return true;
+      return may_be_nonaddressable_p (TREE_OPERAND (expr, 0));
+
     case VIEW_CONVERT_EXPR:
       /* This kind of view-conversions may wrap non-addressable objects
 	 and make them look addressable.  After some processing the
@@ -1870,11 +1887,6 @@ may_be_nonaddressable_p (tree expr)
       if (is_gimple_reg (TREE_OPERAND (expr, 0))
 	  || !is_gimple_addressable (TREE_OPERAND (expr, 0)))
 	return true;
-
-      /* ... fall through ... */
-
-    case ARRAY_REF:
-    case ARRAY_RANGE_REF:
       return may_be_nonaddressable_p (TREE_OPERAND (expr, 0));
 
     CASE_CONVERT:
@@ -4136,13 +4148,14 @@ split_address_cost (struct ivopts_data *
   HOST_WIDE_INT bitpos;
   tree toffset;
   machine_mode mode;
-  int unsignedp, volatilep;
+  int unsignedp, reversep, volatilep;
 
   core = get_inner_reference (addr, &bitsize, &bitpos, &toffset, &mode,
-			      &unsignedp, &volatilep, false);
+			      &unsignedp, &reversep, &volatilep, false);
 
   if (toffset != 0
       || bitpos % BITS_PER_UNIT != 0
+      || reversep
       || TREE_CODE (core) != VAR_DECL)
     {
       *symbol_present = false;
Index: gcc/lto-streamer-out.c
===================================================================
--- gcc/lto-streamer-out.c	(.../trunk)	(revision 224461)
+++ gcc/lto-streamer-out.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -986,7 +986,8 @@ hash_tree (struct streamer_tree_cache_d
     hstate.add_flag (TREE_PRIVATE (t));
   if (TYPE_P (t))
     {
-      hstate.add_flag (TYPE_SATURATING (t));
+      hstate.add_flag (AGGREGATE_TYPE_P (t)
+		       ? TYPE_REVERSE_STORAGE_ORDER (t) : TYPE_SATURATING (t));
       hstate.add_flag (TYPE_ADDR_SPACE (t));
     }
   else if (code == SSA_NAME)
Index: gcc/tree-ssa-math-opts.c
===================================================================
--- gcc/tree-ssa-math-opts.c	(.../trunk)	(revision 224461)
+++ gcc/tree-ssa-math-opts.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -2058,7 +2058,7 @@ find_bswap_or_nop_load (gimple stmt, tre
      offset from base to compare to other such leaf node.  */
   HOST_WIDE_INT bitsize, bitpos;
   machine_mode mode;
-  int unsignedp, volatilep;
+  int unsignedp, reversep, volatilep;
   tree offset, base_addr;
 
   /* Not prepared to handle PDP endian.  */
@@ -2069,7 +2069,7 @@ find_bswap_or_nop_load (gimple stmt, tre
     return false;
 
   base_addr = get_inner_reference (ref, &bitsize, &bitpos, &offset, &mode,
-				   &unsignedp, &volatilep, false);
+				   &unsignedp, &reversep, &volatilep, false);
 
   if (TREE_CODE (base_addr) == MEM_REF)
     {
@@ -2108,6 +2108,8 @@ find_bswap_or_nop_load (gimple stmt, tre
     return false;
   if (bitsize % BITS_PER_UNIT)
     return false;
+  if (reversep)
+    return false;
 
   if (!init_symbolic_number (n, ref))
     return false;
@@ -2555,11 +2557,11 @@ bswap_replace (gimple cur_stmt, gimple s
 	{
 	  HOST_WIDE_INT bitsize, bitpos;
 	  machine_mode mode;
-	  int unsignedp, volatilep;
+	  int unsignedp, reversep, volatilep;
 	  tree offset;
 
 	  get_inner_reference (src, &bitsize, &bitpos, &offset, &mode,
-			       &unsignedp, &volatilep, false);
+			       &unsignedp, &reversep, &volatilep, false);
 	  if (n->range < (unsigned HOST_WIDE_INT) bitsize)
 	    {
 	      load_offset = (bitsize - n->range) / BITS_PER_UNIT;
Index: gcc/tree-ssa-alias.c
===================================================================
--- gcc/tree-ssa-alias.c	(.../trunk)	(revision 224461)
+++ gcc/tree-ssa-alias.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -571,10 +571,12 @@ ao_ref_init (ao_ref *r, tree ref)
 tree
 ao_ref_base (ao_ref *ref)
 {
+  bool reverse;
+
   if (ref->base)
     return ref->base;
   ref->base = get_ref_base_and_extent (ref->ref, &ref->offset, &ref->size,
-				       &ref->max_size);
+				       &ref->max_size, &reverse);
   return ref->base;
 }
 
@@ -755,9 +757,10 @@ aliasing_component_refs_p (tree ref1,
   else if (same_p == 1)
     {
       HOST_WIDE_INT offadj, sztmp, msztmp;
-      get_ref_base_and_extent (*refp, &offadj, &sztmp, &msztmp);
+      bool reverse;
+      get_ref_base_and_extent (*refp, &offadj, &sztmp, &msztmp, &reverse);
       offset2 -= offadj;
-      get_ref_base_and_extent (base1, &offadj, &sztmp, &msztmp);
+      get_ref_base_and_extent (base1, &offadj, &sztmp, &msztmp, &reverse);
       offset1 -= offadj;
       return ranges_overlap_p (offset1, max_size1, offset2, max_size2);
     }
@@ -773,9 +776,10 @@ aliasing_component_refs_p (tree ref1,
   else if (same_p == 1)
     {
       HOST_WIDE_INT offadj, sztmp, msztmp;
-      get_ref_base_and_extent (*refp, &offadj, &sztmp, &msztmp);
+      bool reverse;
+      get_ref_base_and_extent (*refp, &offadj, &sztmp, &msztmp, &reverse);
       offset1 -= offadj;
-      get_ref_base_and_extent (base2, &offadj, &sztmp, &msztmp);
+      get_ref_base_and_extent (base2, &offadj, &sztmp, &msztmp, &reverse);
       offset2 -= offadj;
       return ranges_overlap_p (offset1, max_size1, offset2, max_size2);
     }
@@ -2320,7 +2324,9 @@ stmt_kills_ref_p (gimple stmt, ao_ref *r
       if (ref->max_size == -1)
 	return false;
       HOST_WIDE_INT size, offset, max_size, ref_offset = ref->offset;
-      tree base = get_ref_base_and_extent (lhs, &offset, &size, &max_size);
+      bool reverse;
+      tree base
+	= get_ref_base_and_extent (lhs, &offset, &size, &max_size, &reverse);
       /* We can get MEM[symbol: sZ, index: D.8862_1] here,
 	 so base == ref->base does not always hold.  */
       if (base != ref->base)
Index: gcc/ifcvt.c
===================================================================
--- gcc/ifcvt.c	(.../trunk)	(revision 224461)
+++ gcc/ifcvt.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -961,7 +961,7 @@ noce_emit_move_insn (rtx x, rtx y)
 		}
 
 	      gcc_assert (start < (MEM_P (op) ? BITS_PER_UNIT : BITS_PER_WORD));
-	      store_bit_field (op, size, start, 0, 0, GET_MODE (x), y);
+	      store_bit_field (op, size, start, 0, 0, GET_MODE (x), y, false);
 	      return;
 	    }
 
@@ -1016,7 +1016,7 @@ noce_emit_move_insn (rtx x, rtx y)
   outmode = GET_MODE (outer);
   bitpos = SUBREG_BYTE (outer) * BITS_PER_UNIT;
   store_bit_field (inner, GET_MODE_BITSIZE (outmode), bitpos,
-		   0, 0, outmode, y);
+		   0, 0, outmode, y, false);
 }
 
 /* Return the CC reg if it is used in COND.  */
Index: gcc/dwarf2out.c
===================================================================
--- gcc/dwarf2out.c	(.../trunk)	(revision 224461)
+++ gcc/dwarf2out.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -5249,9 +5249,10 @@ add_var_loc_to_decl (tree decl, rtx loc_
 	      && TREE_CODE (TREE_OPERAND (realdecl, 0)) == ADDR_EXPR))
 	{
 	  HOST_WIDE_INT maxsize;
-	  tree innerdecl;
-	  innerdecl
-	    = get_ref_base_and_extent (realdecl, &bitpos, &bitsize, &maxsize);
+	  bool reverse;
+	  tree innerdecl
+	    = get_ref_base_and_extent (realdecl, &bitpos, &bitsize, &maxsize,
+				       &reverse);
 	  if (!DECL_P (innerdecl)
 	      || DECL_IGNORED_P (innerdecl)
 	      || TREE_STATIC (innerdecl)
@@ -14431,12 +14432,12 @@ loc_list_for_address_of_addr_expr_of_ind
   tree obj, offset;
   HOST_WIDE_INT bitsize, bitpos, bytepos;
   machine_mode mode;
-  int unsignedp, volatilep = 0;
+  int unsignedp, reversep, volatilep = 0;
   dw_loc_list_ref list_ret = NULL, list_ret1 = NULL;
 
   obj = get_inner_reference (TREE_OPERAND (loc, 0),
 			     &bitsize, &bitpos, &offset, &mode,
-			     &unsignedp, &volatilep, false);
+			     &unsignedp, &reversep, &volatilep, false);
   STRIP_NOPS (obj);
   if (bitpos % BITS_PER_UNIT)
     {
@@ -14765,10 +14766,10 @@ loc_list_from_tree (tree loc, int want_a
 	tree obj, offset;
 	HOST_WIDE_INT bitsize, bitpos, bytepos;
 	machine_mode mode;
-	int unsignedp, volatilep = 0;
+	int unsignedp, reversep, volatilep = 0;
 
 	obj = get_inner_reference (loc, &bitsize, &bitpos, &offset, &mode,
-				   &unsignedp, &volatilep, false);
+				   &unsignedp, &reversep, &volatilep, false);
 
 	gcc_assert (obj != loc);
 
@@ -16068,7 +16069,7 @@ fortran_common (tree decl, HOST_WIDE_INT
   machine_mode mode;
   HOST_WIDE_INT bitsize, bitpos;
   tree offset;
-  int unsignedp, volatilep = 0;
+  int unsignedp, reversep, volatilep = 0;
 
   /* If the decl isn't a VAR_DECL, or if it isn't static, or if
      it does not have a value (the offset into the common area), or if it
@@ -16084,8 +16085,8 @@ fortran_common (tree decl, HOST_WIDE_INT
   if (TREE_CODE (val_expr) != COMPONENT_REF)
     return NULL_TREE;
 
-  cvar = get_inner_reference (val_expr, &bitsize, &bitpos, &offset,
-			      &mode, &unsignedp, &volatilep, true);
+  cvar = get_inner_reference (val_expr, &bitsize, &bitpos, &offset, &mode,
+			      &unsignedp, &reversep, &volatilep, true);
 
   if (cvar == NULL_TREE
       || TREE_CODE (cvar) != VAR_DECL
Index: gcc/tsan.c
===================================================================
--- gcc/tsan.c	(.../trunk)	(revision 224461)
+++ gcc/tsan.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -141,9 +141,9 @@ instrument_expr (gimple_stmt_iterator gs
   HOST_WIDE_INT bitsize, bitpos;
   tree offset;
   machine_mode mode;
-  int volatilep = 0, unsignedp = 0;
-  base = get_inner_reference (expr, &bitsize, &bitpos, &offset,
-			      &mode, &unsignedp, &volatilep, false);
+  int unsignedp, reversep, volatilep = 0;
+  base = get_inner_reference (expr, &bitsize, &bitpos, &offset, &mode,
+			      &unsignedp, &reversep, &volatilep, false);
 
   /* No need to instrument accesses to decls that don't escape,
      they can't escape to other threads then.  */
Index: gcc/asan.c
===================================================================
--- gcc/asan.c	(.../trunk)	(revision 224461)
+++ gcc/asan.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -1814,9 +1814,9 @@ instrument_derefs (gimple_stmt_iterator
   HOST_WIDE_INT bitsize, bitpos;
   tree offset;
   machine_mode mode;
-  int volatilep = 0, unsignedp = 0;
-  tree inner = get_inner_reference (t, &bitsize, &bitpos, &offset,
-				    &mode, &unsignedp, &volatilep, false);
+  int unsignedp, reversep, volatilep = 0;
+  tree inner = get_inner_reference (t, &bitsize, &bitpos, &offset, &mode,
+				    &unsignedp, &reversep, &volatilep, false);
 
   if (TREE_CODE (t) == COMPONENT_REF
       && DECL_BIT_FIELD_REPRESENTATIVE (TREE_OPERAND (t, 1)) != NULL_TREE)
Index: gcc/gimple-ssa-strength-reduction.c
===================================================================
--- gcc/gimple-ssa-strength-reduction.c	(.../trunk)	(revision 224461)
+++ gcc/gimple-ssa-strength-reduction.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -1001,7 +1001,7 @@ slsr_process_ref (gimple gs)
   tree ref_expr, base, offset, type;
   HOST_WIDE_INT bitsize, bitpos;
   machine_mode mode;
-  int unsignedp, volatilep;
+  int unsignedp, reversep, volatilep;
   slsr_cand_t c;
 
   if (gimple_vdef (gs))
@@ -1016,7 +1016,9 @@ slsr_process_ref (gimple gs)
     return;
 
   base = get_inner_reference (ref_expr, &bitsize, &bitpos, &offset, &mode,
-			      &unsignedp, &volatilep, false);
+			      &unsignedp, &reversep, &volatilep, false);
+  if (reversep)
+    return;
   widest_int index = bitpos;
 
   if (!restructure_reference (&base, &offset, &index, &type))
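[Not part of the patch: the slsr_process_ref hunk above bails out whenever the reference has reverse storage order. The reason is that such a scalar is byte-swapped relative to the target's native order, so the access is not a plain load that address-based candidates can model. A minimal, self-contained C sketch (illustrative only, not GCC code) of what a reverse-order load entails:]

```c
#include <assert.h>
#include <stdint.h>

/* A scalar stored in reverse (here: big-endian) order inside an
   aggregate must be reassembled byte by byte on load, rather than
   read with a single native-order memory access.  */
static uint32_t
load_reverse_u32 (const unsigned char *p)
{
  /* Assemble the value most-significant byte first.  */
  return ((uint32_t) p[0] << 24) | ((uint32_t) p[1] << 16)
	 | ((uint32_t) p[2] << 8) | (uint32_t) p[3];
}
```

This holds on any host: the result depends only on the byte layout in memory, not on the host's own endianness.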
Index: gcc/tree-data-ref.c
===================================================================
--- gcc/tree-data-ref.c	(.../trunk)	(revision 224461)
+++ gcc/tree-data-ref.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -636,11 +636,12 @@ split_constant_offset_1 (tree type, tree
 	tree base, poffset;
 	HOST_WIDE_INT pbitsize, pbitpos;
 	machine_mode pmode;
-	int punsignedp, pvolatilep;
+	int punsignedp, preversep, pvolatilep;
 
 	op0 = TREE_OPERAND (op0, 0);
-	base = get_inner_reference (op0, &pbitsize, &pbitpos, &poffset,
-				    &pmode, &punsignedp, &pvolatilep, false);
+	base
+	  = get_inner_reference (op0, &pbitsize, &pbitpos, &poffset, &pmode,
+				 &punsignedp, &preversep, &pvolatilep, false);
 
 	if (pbitpos % BITS_PER_UNIT != 0)
 	  return false;
@@ -784,7 +785,7 @@ dr_analyze_innermost (struct data_refere
   HOST_WIDE_INT pbitsize, pbitpos;
   tree base, poffset;
   machine_mode pmode;
-  int punsignedp, pvolatilep;
+  int punsignedp, preversep, pvolatilep;
   affine_iv base_iv, offset_iv;
   tree init, dinit, step;
   bool in_loop = (loop && loop->num);
@@ -792,8 +793,8 @@ dr_analyze_innermost (struct data_refere
   if (dump_file && (dump_flags & TDF_DETAILS))
     fprintf (dump_file, "analyze_innermost: ");
 
-  base = get_inner_reference (ref, &pbitsize, &pbitpos, &poffset,
-			      &pmode, &punsignedp, &pvolatilep, false);
+  base = get_inner_reference (ref, &pbitsize, &pbitpos, &poffset, &pmode,
+			      &punsignedp, &preversep, &pvolatilep, false);
   gcc_assert (base != NULL_TREE);
 
   if (pbitpos % BITS_PER_UNIT != 0)
@@ -803,6 +804,13 @@ dr_analyze_innermost (struct data_refere
       return false;
     }
 
+  if (preversep)
+    {
+      if (dump_file && (dump_flags & TDF_DETAILS))
+	fprintf (dump_file, "failed: reverse storage order.\n");
+      return false;
+    }
+
   if (TREE_CODE (base) == MEM_REF)
     {
       if (!integer_zerop (TREE_OPERAND (base, 1)))
Index: gcc/tree-affine.c
===================================================================
--- gcc/tree-affine.c	(.../trunk)	(revision 224461)
+++ gcc/tree-affine.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -283,7 +283,7 @@ tree_to_aff_combination (tree expr, tree
   tree cst, core, toffset;
   HOST_WIDE_INT bitpos, bitsize;
   machine_mode mode;
-  int unsignedp, volatilep;
+  int unsignedp, reversep, volatilep;
 
   STRIP_NOPS (expr);
 
@@ -340,8 +340,8 @@ tree_to_aff_combination (tree expr, tree
 	  return;
 	}
       core = get_inner_reference (TREE_OPERAND (expr, 0), &bitsize, &bitpos,
-				  &toffset, &mode, &unsignedp, &volatilep,
-				  false);
+				  &toffset, &mode, &unsignedp, &reversep,
+				  &volatilep, false);
       if (bitpos % BITS_PER_UNIT != 0)
 	break;
       aff_combination_const (comb, type, bitpos / BITS_PER_UNIT);
@@ -908,10 +908,10 @@ get_inner_reference_aff (tree ref, aff_t
   HOST_WIDE_INT bitsize, bitpos;
   tree toff;
   machine_mode mode;
-  int uns, vol;
+  int uns, rev, vol;
   aff_tree tmp;
   tree base = get_inner_reference (ref, &bitsize, &bitpos, &toff, &mode,
-				   &uns, &vol, false);
+				   &uns, &rev, &vol, false);
   tree base_addr = build_fold_addr_expr (base);
 
   /* ADDR = &BASE + TOFF + BITPOS / BITS_PER_UNIT.  */
Index: gcc/tree-vect-data-refs.c
===================================================================
--- gcc/tree-vect-data-refs.c	(.../trunk)	(revision 224461)
+++ gcc/tree-vect-data-refs.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -2974,7 +2974,7 @@ vect_check_gather (gimple stmt, loop_vec
   tree offtype = NULL_TREE;
   tree decl, base, off;
   machine_mode pmode;
-  int punsignedp, pvolatilep;
+  int punsignedp, reversep, pvolatilep = 0;
 
   base = DR_REF (dr);
   /* For masked loads/stores, DR_REF (dr) is an artificial MEM_REF,
@@ -3006,9 +3006,9 @@ vect_check_gather (gimple stmt, loop_vec
      vectorized.  The following code attempts to find such a preexistng
      SSA_NAME OFF and put the loop invariants into a tree BASE
      that can be gimplified before the loop.  */
-  base = get_inner_reference (base, &pbitsize, &pbitpos, &off,
-			      &pmode, &punsignedp, &pvolatilep, false);
-  gcc_assert (base != NULL_TREE && (pbitpos % BITS_PER_UNIT) == 0);
+  base = get_inner_reference (base, &pbitsize, &pbitpos, &off, &pmode,
+			      &punsignedp, &reversep, &pvolatilep, false);
+  gcc_assert (base && (pbitpos % BITS_PER_UNIT) == 0 && !reversep);
 
   if (TREE_CODE (base) == MEM_REF)
     {
@@ -3547,7 +3547,7 @@ again:
 	  HOST_WIDE_INT pbitsize, pbitpos;
 	  tree poffset;
 	  machine_mode pmode;
-	  int punsignedp, pvolatilep;
+	  int punsignedp, preversep, pvolatilep;
 	  affine_iv base_iv, offset_iv;
 	  tree dinit;
 
@@ -3566,7 +3566,8 @@ again:
 	    }
 
 	  outer_base = get_inner_reference (inner_base, &pbitsize, &pbitpos,
-		          &poffset, &pmode, &punsignedp, &pvolatilep, false);
+					    &poffset, &pmode, &punsignedp,
+					    &preversep, &pvolatilep, false);
 	  gcc_assert (outer_base != NULL_TREE);
 
 	  if (pbitpos % BITS_PER_UNIT != 0)
@@ -3577,6 +3578,14 @@ again:
 	      return false;
 	    }
 
+	  if (preversep)
+	    {
+	      if (dump_enabled_p ())
+		dump_printf_loc (MSG_MISSED_OPTIMIZATION, vect_location,
+				 "failed: reverse storage order.\n");
+	      return false;
+	    }
+
 	  outer_base = build_fold_addr_expr (outer_base);
 	  if (!simple_iv (loop, loop_containing_stmt (stmt), outer_base,
                           &base_iv, false))
Index: gcc/gimple-fold.c
===================================================================
--- gcc/gimple-fold.c	(.../trunk)	(revision 224461)
+++ gcc/gimple-fold.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -926,13 +926,14 @@ gimple_fold_builtin_memory_op (gimple_st
 	      HOST_WIDE_INT src_offset = 0, dest_offset = 0;
 	      HOST_WIDE_INT size = -1;
 	      HOST_WIDE_INT maxsize = -1;
+	      bool reverse;
 
 	      srcvar = TREE_OPERAND (src, 0);
 	      src_base = get_ref_base_and_extent (srcvar, &src_offset,
-						  &size, &maxsize);
+						  &size, &maxsize, &reverse);
 	      destvar = TREE_OPERAND (dest, 0);
 	      dest_base = get_ref_base_and_extent (destvar, &dest_offset,
-						   &size, &maxsize);
+						   &size, &maxsize, &reverse);
 	      if (tree_fits_uhwi_p (len))
 		maxsize = tree_to_uhwi (len);
 	      else
@@ -5219,6 +5220,8 @@ get_base_constructor (tree base, HOST_WI
 		      tree (*valueize)(tree))
 {
   HOST_WIDE_INT bit_offset2, size, max_size;
+  bool reverse;
+
   if (TREE_CODE (base) == MEM_REF)
     {
       if (!integer_zerop (TREE_OPERAND (base, 1)))
@@ -5259,7 +5262,8 @@ get_base_constructor (tree base, HOST_WI
 
     case ARRAY_REF:
     case COMPONENT_REF:
-      base = get_ref_base_and_extent (base, &bit_offset2, &size, &max_size);
+      base = get_ref_base_and_extent (base, &bit_offset2, &size, &max_size,
+				      &reverse);
       if (max_size == -1 || size != max_size)
 	return NULL_TREE;
       *bit_offset +=  bit_offset2;
@@ -5503,6 +5507,7 @@ fold_const_aggregate_ref_1 (tree t, tree
   tree ctor, idx, base;
   HOST_WIDE_INT offset, size, max_size;
   tree tem;
+  bool reverse;
 
   if (TREE_THIS_VOLATILE (t))
     return NULL_TREE;
@@ -5573,7 +5578,7 @@ fold_const_aggregate_ref_1 (tree t, tree
     case BIT_FIELD_REF:
     case TARGET_MEM_REF:
     case MEM_REF:
-      base = get_ref_base_and_extent (t, &offset, &size, &max_size);
+      base = get_ref_base_and_extent (t, &offset, &size, &max_size, &reverse);
       ctor = get_base_constructor (base, &offset, valueize);
 
       /* Empty constructor.  Always fold to 0.  */
Index: gcc/cfgexpand.c
===================================================================
--- gcc/cfgexpand.c	(.../trunk)	(revision 224461)
+++ gcc/cfgexpand.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -4121,9 +4121,10 @@ expand_debug_expr (tree exp)
 	machine_mode mode1;
 	HOST_WIDE_INT bitsize, bitpos;
 	tree offset;
-	int volatilep = 0;
-	tree tem = get_inner_reference (exp, &bitsize, &bitpos, &offset,
-					&mode1, &unsignedp, &volatilep, false);
+	int reversep, volatilep = 0;
+	tree tem
+	  = get_inner_reference (exp, &bitsize, &bitpos, &offset, &mode1,
+				 &unsignedp, &reversep, &volatilep, false);
 	rtx orig_op0;
 
 	if (bitsize == 0)
@@ -4548,9 +4549,10 @@ expand_debug_expr (tree exp)
 	  if (handled_component_p (TREE_OPERAND (exp, 0)))
 	    {
 	      HOST_WIDE_INT bitoffset, bitsize, maxsize;
+	      bool reverse;
 	      tree decl
-		= get_ref_base_and_extent (TREE_OPERAND (exp, 0),
-					   &bitoffset, &bitsize, &maxsize);
+		= get_ref_base_and_extent (TREE_OPERAND (exp, 0), &bitoffset,
+					   &bitsize, &maxsize, &reverse);
 	      if ((TREE_CODE (decl) == VAR_DECL
 		   || TREE_CODE (decl) == PARM_DECL
 		   || TREE_CODE (decl) == RESULT_DECL)
Index: gcc/simplify-rtx.c
===================================================================
--- gcc/simplify-rtx.c	(.../trunk)	(revision 224461)
+++ gcc/simplify-rtx.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -322,10 +322,11 @@ delegitimize_mem_from_attrs (rtx x)
 	  {
 	    HOST_WIDE_INT bitsize, bitpos;
 	    tree toffset;
-	    int unsignedp, volatilep = 0;
+	    int unsignedp, reversep, volatilep = 0;
 
-	    decl = get_inner_reference (decl, &bitsize, &bitpos, &toffset,
-					&mode, &unsignedp, &volatilep, false);
+	    decl
+	      = get_inner_reference (decl, &bitsize, &bitpos, &toffset, &mode,
+				     &unsignedp, &reversep, &volatilep, false);
 	    if (bitsize != GET_MODE_BITSIZE (mode)
 		|| (bitpos % BITS_PER_UNIT)
 		|| (toffset && !tree_fits_shwi_p (toffset)))
Index: gcc/tree-ssa-pre.c
===================================================================
--- gcc/tree-ssa-pre.c	(.../trunk)	(revision 224461)
+++ gcc/tree-ssa-pre.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -2552,7 +2552,9 @@ create_component_ref_by_pieces_1 (basic_
 						     off));
 	    baseop = build_fold_addr_expr (base);
 	  }
-	return fold_build2 (MEM_REF, currop->type, baseop, offset);
+	tree t = fold_build2 (MEM_REF, currop->type, baseop, offset);
+	REF_REVERSE_STORAGE_ORDER (t) = currop->reverse;
+	return t;
       }
 
     case TARGET_MEM_REF:
@@ -2617,7 +2619,9 @@ create_component_ref_by_pieces_1 (basic_
 	  return NULL_TREE;
 	tree op1 = currop->op0;
 	tree op2 = currop->op1;
-	return fold_build3 (BIT_FIELD_REF, currop->type, genop0, op1, op2);
+	tree t = build3 (BIT_FIELD_REF, currop->type, genop0, op1, op2);
+	REF_REVERSE_STORAGE_ORDER (t) = currop->reverse;
+	return fold (t);
       }
 
       /* For array ref vn_reference_op's, operand 1 of the array ref
Index: gcc/tree-sra.c
===================================================================
--- gcc/tree-sra.c	(.../trunk)	(revision 224461)
+++ gcc/tree-sra.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -207,12 +207,15 @@ struct access
      when grp_to_be_replaced flag is set.  */
   tree replacement_decl;
 
-  /* Is this particular access write access? */
-  unsigned write : 1;
-
   /* Is this access an access to a non-addressable field? */
   unsigned non_addressable : 1;
 
+  /* Is this access made in reverse storage order? */
+  unsigned reverse : 1;
+
+  /* Is this particular access write access? */
+  unsigned write : 1;
+
   /* Is this access currently in the work queue?  */
   unsigned grp_queued : 1;
 
@@ -483,6 +486,8 @@ dump_access (FILE *f, struct access *acc
   print_generic_expr (f, access->expr, 0);
   fprintf (f, ", type = ");
   print_generic_expr (f, access->type, 0);
+  fprintf (f, ", non_addressable = %d, reverse = %d",
+	   access->non_addressable, access->reverse);
   if (grp)
     fprintf (f, ", grp_read = %d, grp_write = %d, grp_assignment_read = %d, "
 	     "grp_assignment_write = %d, grp_scalar_read = %d, "
@@ -899,9 +904,9 @@ create_access (tree expr, gimple stmt, b
   struct access *access;
   HOST_WIDE_INT offset, size, max_size;
   tree base = expr;
-  bool ptr, unscalarizable_region = false;
+  bool reverse, ptr, unscalarizable_region = false;
 
-  base = get_ref_base_and_extent (expr, &offset, &size, &max_size);
+  base = get_ref_base_and_extent (expr, &offset, &size, &max_size, &reverse);
 
   if (sra_mode == SRA_MODE_EARLY_IPA
       && TREE_CODE (base) == MEM_REF)
@@ -955,6 +960,7 @@ create_access (tree expr, gimple stmt, b
   access->write = write;
   access->grp_unscalarizable_region = unscalarizable_region;
   access->stmt = stmt;
+  access->reverse = reverse;
 
   if (TREE_CODE (expr) == COMPONENT_REF
       && DECL_NONADDRESSABLE_P (TREE_OPERAND (expr, 1)))
@@ -1022,6 +1028,7 @@ completely_scalarize_record (tree base,
 	    access->expr = nref;
 	    access->type = ft;
 	    access->grp_total_scalarization = 1;
+	    access->reverse = TYPE_REVERSE_STORAGE_ORDER (decl_type);
 	    /* Accesses for intraprocedural SRA can have their stmt NULL.  */
 	  }
 	else
@@ -1101,7 +1108,7 @@ build_access_from_expr_1 (tree expr, gim
      and not the result type.  Ada produces such statements.  We are also
      capable of handling the topmost V_C_E but not any of those buried in other
      handled components.  */
-  if (TREE_CODE (expr) == VIEW_CONVERT_EXPR)
+  if (TREE_CODE (expr) == VIEW_CONVERT_EXPR && !storage_order_barrier_p (expr))
     expr = TREE_OPERAND (expr, 0);
 
   if (contains_view_convert_expr_p (expr))
@@ -1234,7 +1241,11 @@ build_accesses_from_assign (gimple stmt)
   lacc = build_access_from_expr_1 (lhs, stmt, true);
 
   if (lacc)
-    lacc->grp_assignment_write = 1;
+    {
+      lacc->grp_assignment_write = 1;
+      if (storage_order_barrier_p (rhs))
+	lacc->grp_unscalarizable_region = 1;
+    }
 
   if (racc)
     {
@@ -1242,6 +1253,8 @@ build_accesses_from_assign (gimple stmt)
       if (should_scalarize_away_bitmap && !gimple_has_volatile_ops (stmt)
 	  && !is_gimple_reg_type (racc->type))
 	bitmap_set_bit (should_scalarize_away_bitmap, DECL_UID (racc->base));
+      if (storage_order_barrier_p (lhs))
+	racc->grp_unscalarizable_region = 1;
     }
 
   if (lacc && racc
@@ -1549,17 +1562,15 @@ make_fancy_name (tree expr)
 }
 
 /* Construct a MEM_REF that would reference a part of aggregate BASE of type
-   EXP_TYPE at the given OFFSET.  If BASE is something for which
-   get_addr_base_and_unit_offset returns NULL, gsi must be non-NULL and is used
-   to insert new statements either before or below the current one as specified
-   by INSERT_AFTER.  This function is not capable of handling bitfields.
-
-   BASE must be either a declaration or a memory reference that has correct
-   alignment ifformation embeded in it (e.g. a pre-existing one in SRA).  */
+   EXP_TYPE at the given OFFSET and with storage order REVERSE.  If BASE is
+   something for which get_addr_base_and_unit_offset returns NULL, gsi must
+   be non-NULL and is used to insert new statements either before or below
+   the current one as specified by INSERT_AFTER.  This function is not capable
+   of handling bitfields.  */
 
 tree
 build_ref_for_offset (location_t loc, tree base, HOST_WIDE_INT offset,
-		      tree exp_type, gimple_stmt_iterator *gsi,
+		      bool reverse, tree exp_type, gimple_stmt_iterator *gsi,
 		      bool insert_after)
 {
   tree prev_base = base;
@@ -1616,6 +1627,7 @@ build_ref_for_offset (location_t loc, tr
     exp_type = build_aligned_type (exp_type, align);
 
   mem_ref = fold_build2_loc (loc, MEM_REF, exp_type, base, off);
+  REF_REVERSE_STORAGE_ORDER (mem_ref) = reverse;
   if (TREE_THIS_VOLATILE (prev_base))
     TREE_THIS_VOLATILE (mem_ref) = 1;
   if (TREE_SIDE_EFFECTS (prev_base))
@@ -1642,13 +1654,17 @@ build_ref_for_model (location_t loc, tre
 
       offset -= int_bit_position (fld);
       exp_type = TREE_TYPE (TREE_OPERAND (model->expr, 0));
-      t = build_ref_for_offset (loc, base, offset, exp_type, gsi, insert_after);
+      t = build_ref_for_offset (loc, base, offset, model->reverse, exp_type,
+				gsi, insert_after);
+      /* The flag will be set on the record type.  */
+      REF_REVERSE_STORAGE_ORDER (t) = 0;
       return fold_build3_loc (loc, COMPONENT_REF, TREE_TYPE (fld), t, fld,
 			      NULL_TREE);
     }
   else
-    return build_ref_for_offset (loc, base, offset, model->type,
-				 gsi, insert_after);
+    return
+      build_ref_for_offset (loc, base, offset, model->reverse, model->type,
+			    gsi, insert_after);
 }
 
 /* Attempt to build a memory reference that we could but into a gimple
@@ -2305,8 +2321,8 @@ analyze_access_subtree (struct access *r
 		      && (root->size % BITS_PER_UNIT) == 0);
 	  root->type = build_nonstandard_integer_type (root->size,
 						       TYPE_UNSIGNED (rt));
-	  root->expr = build_ref_for_offset (UNKNOWN_LOCATION,
-					     root->base, root->offset,
+	  root->expr = build_ref_for_offset (UNKNOWN_LOCATION, root->base,
+					     root->offset, root->reverse,
 					     root->type, NULL, false);
 
 	  if (dump_file && (dump_flags & TDF_DETAILS))
@@ -2804,6 +2820,7 @@ get_access_for_expr (tree expr)
 {
   HOST_WIDE_INT offset, size, max_size;
   tree base;
+  bool reverse;
 
   /* FIXME: This should not be necessary but Ada produces V_C_Es with a type of
      a different size than the size of its argument and we need the latter
@@ -2811,7 +2828,7 @@ get_access_for_expr (tree expr)
   if (TREE_CODE (expr) == VIEW_CONVERT_EXPR)
     expr = TREE_OPERAND (expr, 0);
 
-  base = get_ref_base_and_extent (expr, &offset, &size, &max_size);
+  base = get_ref_base_and_extent (expr, &offset, &size, &max_size, &reverse);
   if (max_size == -1 || !DECL_P (base))
     return NULL;
 
@@ -4444,6 +4461,7 @@ turn_representatives_into_adjustments (v
 	      adj.type = repr->type;
 	      adj.alias_ptr_type = reference_alias_ptr_type (repr->expr);
 	      adj.offset = repr->offset;
+	      adj.reverse = repr->reverse;
 	      adj.by_ref = (POINTER_TYPE_P (TREE_TYPE (repr->base))
 			    && (repr->grp_maybe_modified
 				|| repr->grp_not_necessarilly_dereferenced));
@@ -5070,9 +5088,9 @@ ipa_sra_check_caller (struct cgraph_node
 	  tree offset;
 	  HOST_WIDE_INT bitsize, bitpos;
 	  machine_mode mode;
-	  int unsignedp, volatilep = 0;
+	  int unsignedp, reversep, volatilep = 0;
 	  get_inner_reference (arg, &bitsize, &bitpos, &offset, &mode,
-			       &unsignedp, &volatilep, false);
+			       &unsignedp, &reversep, &volatilep, false);
 	  if (bitpos % BITS_PER_UNIT)
 	    {
 	      iscc->bad_arg_alignment = true;
Index: gcc/ubsan.c
===================================================================
--- gcc/ubsan.c	(.../trunk)	(revision 224461)
+++ gcc/ubsan.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -1398,9 +1398,9 @@ instrument_bool_enum_load (gimple_stmt_i
   HOST_WIDE_INT bitsize, bitpos;
   tree offset;
   machine_mode mode;
-  int volatilep = 0, unsignedp = 0;
+  int volatilep = 0, unsignedp = 0, reversep;
   tree base = get_inner_reference (rhs, &bitsize, &bitpos, &offset, &mode,
-				   &unsignedp, &volatilep, false);
+				   &unsignedp, &reversep, &volatilep, false);
   tree utype = build_nonstandard_integer_type (modebitsize, 1);
 
   if ((TREE_CODE (base) == VAR_DECL && DECL_HARD_REGISTER (base))
@@ -1783,9 +1783,9 @@ instrument_object_size (gimple_stmt_iter
   HOST_WIDE_INT bitsize, bitpos;
   tree offset;
   machine_mode mode;
-  int volatilep = 0, unsignedp = 0;
+  int volatilep = 0, unsignedp = 0, reversep;
   tree inner = get_inner_reference (t, &bitsize, &bitpos, &offset, &mode,
-				    &unsignedp, &volatilep, false);
+				    &unsignedp, &reversep, &volatilep, false);
 
   if (bitpos % BITS_PER_UNIT != 0
       || bitsize != size_in_bytes * BITS_PER_UNIT)
Index: gcc/lto/lto.c
===================================================================
--- gcc/lto/lto.c	(.../trunk)	(revision 224461)
+++ gcc/lto/lto.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -1031,7 +1031,10 @@ compare_tree_sccs_1 (tree t1, tree t2, t
   compare_values (TREE_DEPRECATED);
   if (TYPE_P (t1))
     {
-      compare_values (TYPE_SATURATING);
+      if (AGGREGATE_TYPE_P (t1))
+	compare_values (TYPE_REVERSE_STORAGE_ORDER);
+      else
+	compare_values (TYPE_SATURATING);
       compare_values (TYPE_ADDR_SPACE);
     }
   else if (code == SSA_NAME)
Index: gcc/tree-streamer-out.c
===================================================================
--- gcc/tree-streamer-out.c	(.../trunk)	(revision 224461)
+++ gcc/tree-streamer-out.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -126,9 +126,14 @@ pack_ts_base_value_fields (struct bitpac
   bp_pack_value (bp, TREE_DEPRECATED (expr), 1);
   if (TYPE_P (expr))
     {
-      bp_pack_value (bp, TYPE_SATURATING (expr), 1);
+      if (AGGREGATE_TYPE_P (expr))
+	bp_pack_value (bp, TYPE_REVERSE_STORAGE_ORDER (expr), 1);
+      else
+	bp_pack_value (bp, TYPE_SATURATING (expr), 1);
       bp_pack_value (bp, TYPE_ADDR_SPACE (expr), 8);
     }
+  else if (TREE_CODE (expr) == BIT_FIELD_REF || TREE_CODE (expr) == MEM_REF)
+    bp_pack_value (bp, REF_REVERSE_STORAGE_ORDER (expr), 1);
   else if (TREE_CODE (expr) == SSA_NAME)
     {
       bp_pack_value (bp, SSA_NAME_IS_DEFAULT_DEF (expr), 1);
Index: gcc/ipa-prop.c
===================================================================
--- gcc/ipa-prop.c	(.../trunk)	(revision 224461)
+++ gcc/ipa-prop.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -1037,7 +1037,9 @@ ipa_load_from_parm_agg_1 (struct func_bo
 {
   int index;
   HOST_WIDE_INT size, max_size;
-  tree base = get_ref_base_and_extent (op, offset_p, &size, &max_size);
+  bool reverse;
+  tree base
+    = get_ref_base_and_extent (op, offset_p, &size, &max_size, &reverse);
 
   if (max_size == -1 || max_size != size || *offset_p < 0)
     return false;
@@ -1174,6 +1176,7 @@ compute_complex_assign_jump_func (struct
 {
   HOST_WIDE_INT offset, size, max_size;
   tree op1, tc_ssa, base, ssa;
+  bool reverse;
   int index;
 
   op1 = gimple_assign_rhs1 (stmt);
@@ -1221,7 +1224,7 @@ compute_complex_assign_jump_func (struct
   op1 = TREE_OPERAND (op1, 0);
   if (TREE_CODE (TREE_TYPE (op1)) != RECORD_TYPE)
     return;
-  base = get_ref_base_and_extent (op1, &offset, &size, &max_size);
+  base = get_ref_base_and_extent (op1, &offset, &size, &max_size, &reverse);
   if (TREE_CODE (base) != MEM_REF
       /* If this is a varying address, punt.  */
       || max_size == -1
@@ -1257,6 +1260,7 @@ get_ancestor_addr_info (gimple assign, t
 {
   HOST_WIDE_INT size, max_size;
   tree expr, parm, obj;
+  bool reverse;
 
   if (!gimple_assign_single_p (assign))
     return NULL_TREE;
@@ -1266,7 +1270,7 @@ get_ancestor_addr_info (gimple assign, t
     return NULL_TREE;
   expr = TREE_OPERAND (expr, 0);
   obj = expr;
-  expr = get_ref_base_and_extent (expr, offset, &size, &max_size);
+  expr = get_ref_base_and_extent (expr, offset, &size, &max_size, &reverse);
 
   if (TREE_CODE (expr) != MEM_REF
       /* If this is a varying address, punt.  */
@@ -1532,10 +1536,11 @@ determine_locally_known_aggregate_parts
       else if (TREE_CODE (arg) == ADDR_EXPR)
 	{
 	  HOST_WIDE_INT arg_max_size;
+	  bool reverse;
 
 	  arg = TREE_OPERAND (arg, 0);
 	  arg_base = get_ref_base_and_extent (arg, &arg_offset, &arg_size,
-					  &arg_max_size);
+					      &arg_max_size, &reverse);
 	  if (arg_max_size == -1
 	      || arg_max_size != arg_size
 	      || arg_offset < 0)
@@ -1554,13 +1559,14 @@ determine_locally_known_aggregate_parts
   else
     {
       HOST_WIDE_INT arg_max_size;
+      bool reverse;
 
       gcc_checking_assert (AGGREGATE_TYPE_P (TREE_TYPE (arg)));
 
       by_ref = false;
       check_ref = false;
       arg_base = get_ref_base_and_extent (arg, &arg_offset, &arg_size,
-					  &arg_max_size);
+					  &arg_max_size, &reverse);
       if (arg_max_size == -1
 	  || arg_max_size != arg_size
 	  || arg_offset < 0)
@@ -1581,6 +1587,7 @@ determine_locally_known_aggregate_parts
       gimple stmt = gsi_stmt (gsi);
       HOST_WIDE_INT lhs_offset, lhs_size, lhs_max_size;
       tree lhs, rhs, lhs_base;
+      bool reverse;
 
       if (!stmt_may_clobber_ref_p_1 (stmt, &r))
 	continue;
@@ -1595,7 +1602,7 @@ determine_locally_known_aggregate_parts
 	break;
 
       lhs_base = get_ref_base_and_extent (lhs, &lhs_offset, &lhs_size,
-					  &lhs_max_size);
+					  &lhs_max_size, &reverse);
       if (lhs_max_size == -1
 	  || lhs_max_size != lhs_size)
 	break;
@@ -4060,6 +4067,7 @@ ipa_modify_call_arguments (struct cgraph
 	      base = force_gimple_operand_gsi (&gsi, base,
 					       true, NULL, true, GSI_SAME_STMT);
 	      expr = fold_build2_loc (loc, MEM_REF, type, base, off);
+	      REF_REVERSE_STORAGE_ORDER (expr) = adj->reverse;
 	      /* If expr is not a valid gimple call argument emit
 	         a load into a temporary.  */
 	      if (is_gimple_reg_type (TREE_TYPE (expr)))
@@ -4079,6 +4087,7 @@ ipa_modify_call_arguments (struct cgraph
 	  else
 	    {
 	      expr = fold_build2_loc (loc, MEM_REF, adj->type, base, off);
+	      REF_REVERSE_STORAGE_ORDER (expr) = adj->reverse;
 	      expr = build_fold_addr_expr (expr);
 	      expr = force_gimple_operand_gsi (&gsi, expr,
 					       true, NULL, true, GSI_SAME_STMT);
@@ -4250,7 +4259,9 @@ ipa_get_adjustment_candidate (tree **exp
     }
 
   HOST_WIDE_INT offset, size, max_size;
-  tree base = get_ref_base_and_extent (**expr, &offset, &size, &max_size);
+  bool reverse;
+  tree base
+    = get_ref_base_and_extent (**expr, &offset, &size, &max_size, &reverse);
   if (!base || size == -1 || max_size == -1)
     return NULL;
 
Index: gcc/ipa-prop.h
===================================================================
--- gcc/ipa-prop.h	(.../trunk)	(revision 224461)
+++ gcc/ipa-prop.h	(.../branches/scalar-storage-order)	(revision 224467)
@@ -683,6 +683,10 @@ struct ipa_parm_adjustment
      or one about to be removed.  */
   enum ipa_parm_op op;
 
+  /* Storage order of the original parameter (for the cases when the new
+     parameter is a component of an original one).  */
+  unsigned reverse : 1;
+
   /* The parameter is to be passed by reference.  */
   unsigned by_ref : 1;
 };
@@ -720,7 +724,7 @@ ipa_parm_adjustment *ipa_get_adjustment_
 
 
 /* From tree-sra.c:  */
-tree build_ref_for_offset (location_t, tree, HOST_WIDE_INT, tree,
+tree build_ref_for_offset (location_t, tree, HOST_WIDE_INT, bool, tree,
 			   gimple_stmt_iterator *, bool);
 
 /* In ipa-cp.c  */
Index: gcc/tree-ssa-dce.c
===================================================================
--- gcc/tree-ssa-dce.c	(.../trunk)	(revision 224461)
+++ gcc/tree-ssa-dce.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -499,8 +499,10 @@ mark_aliased_reaching_defs_necessary_1 (
     {
       tree base, lhs = gimple_get_lhs (def_stmt);
       HOST_WIDE_INT size, offset, max_size;
+      bool reverse;
       ao_ref_base (ref);
-      base = get_ref_base_and_extent (lhs, &offset, &size, &max_size);
+      base
+	= get_ref_base_and_extent (lhs, &offset, &size, &max_size, &reverse);
       /* We can get MEM[symbol: sZ, index: D.8862_1] here,
 	 so base == refd->base does not always hold.  */
       if (base == ref->base)
Index: gcc/tree-streamer-in.c
===================================================================
--- gcc/tree-streamer-in.c	(.../trunk)	(revision 224461)
+++ gcc/tree-streamer-in.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -155,9 +155,14 @@ unpack_ts_base_value_fields (struct bitp
   TREE_DEPRECATED (expr) = (unsigned) bp_unpack_value (bp, 1);
   if (TYPE_P (expr))
     {
-      TYPE_SATURATING (expr) = (unsigned) bp_unpack_value (bp, 1);
+      if (AGGREGATE_TYPE_P (expr))
+	TYPE_REVERSE_STORAGE_ORDER (expr) = (unsigned) bp_unpack_value (bp, 1);
+      else
+	TYPE_SATURATING (expr) = (unsigned) bp_unpack_value (bp, 1);
       TYPE_ADDR_SPACE (expr) = (unsigned) bp_unpack_value (bp, 8);
     }
+  else if (TREE_CODE (expr) == BIT_FIELD_REF || TREE_CODE (expr) == MEM_REF)
+    REF_REVERSE_STORAGE_ORDER (expr) = (unsigned) bp_unpack_value (bp, 1);
   else if (TREE_CODE (expr) == SSA_NAME)
     {
       SSA_NAME_IS_DEFAULT_DEF (expr) = (unsigned) bp_unpack_value (bp, 1);
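[Not part of the patch: the streamer hunks above reuse a single streamed bit for two mutually exclusive flags — TYPE_SATURATING applies only to non-aggregate types and TYPE_REVERSE_STORAGE_ORDER only to aggregates — with AGGREGATE_TYPE_P deciding which one is meant on both sides. A self-contained C sketch of the idea, using hypothetical names:]

```c
#include <assert.h>

/* One streamed bit serves two mutually exclusive flags; which flag it
   encodes depends on a property (is_aggregate) that both the writer
   and the reader can compute independently.  */
struct type_sketch
{
  int is_aggregate;
  int saturating;            /* meaningful only if !is_aggregate */
  int reverse_storage_order; /* meaningful only if is_aggregate */
};

static int
pack_shared_bit (const struct type_sketch *t)
{
  return t->is_aggregate ? t->reverse_storage_order : t->saturating;
}

static void
unpack_shared_bit (struct type_sketch *t, int bit)
{
  if (t->is_aggregate)
    t->reverse_storage_order = bit;
  else
    t->saturating = bit;
}
```

Round-tripping through pack/unpack preserves whichever flag is live for the type, without spending a second bit in the on-disk format.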
Index: gcc/var-tracking.c
===================================================================
--- gcc/var-tracking.c	(.../trunk)	(revision 224461)
+++ gcc/var-tracking.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -5165,9 +5165,10 @@ track_expr_p (tree expr, bool need_rtl)
 		  && TREE_CODE (TREE_OPERAND (realdecl, 0)) == ADDR_EXPR))
 	    {
 	      HOST_WIDE_INT bitsize, bitpos, maxsize;
+	      bool reverse;
 	      tree innerdecl
 		= get_ref_base_and_extent (realdecl, &bitpos, &bitsize,
-					   &maxsize);
+					   &maxsize, &reverse);
 	      if (!DECL_P (innerdecl)
 		  || DECL_IGNORED_P (innerdecl)
 		  /* Do not track declarations for parts of tracked parameters
Index: gcc/tree-ssa-structalias.c
===================================================================
--- gcc/tree-ssa-structalias.c	(.../trunk)	(revision 224461)
+++ gcc/tree-ssa-structalias.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -3192,6 +3192,7 @@ get_constraint_for_component_ref (tree t
   HOST_WIDE_INT bitsize = -1;
   HOST_WIDE_INT bitmaxsize = -1;
   HOST_WIDE_INT bitpos;
+  bool reverse;
   tree forzero;
 
   /* Some people like to do cute things like take the address of
@@ -3213,7 +3214,7 @@ get_constraint_for_component_ref (tree t
       return;
     }
 
-  t = get_ref_base_and_extent (t, &bitpos, &bitsize, &bitmaxsize);
+  t = get_ref_base_and_extent (t, &bitpos, &bitsize, &bitmaxsize, &reverse);
 
   /* Pretend to take the address of the base, we'll take care of
      adding the required subset of sub-fields below.  */
@@ -3639,9 +3640,12 @@ do_structure_copy (tree lhsop, tree rhso
     {
       HOST_WIDE_INT lhssize, lhsmaxsize, lhsoffset;
       HOST_WIDE_INT rhssize, rhsmaxsize, rhsoffset;
+      bool reverse;
       unsigned k = 0;
-      get_ref_base_and_extent (lhsop, &lhsoffset, &lhssize, &lhsmaxsize);
-      get_ref_base_and_extent (rhsop, &rhsoffset, &rhssize, &rhsmaxsize);
+      get_ref_base_and_extent (lhsop, &lhsoffset, &lhssize, &lhsmaxsize,
+			       &reverse);
+      get_ref_base_and_extent (rhsop, &rhsoffset, &rhssize, &rhsmaxsize,
+			       &reverse);
       for (j = 0; lhsc.iterate (j, &lhsp);)
 	{
 	  varinfo_t lhsv, rhsv;
Index: gcc/config/s390/s390.c
===================================================================
--- gcc/config/s390/s390.c	(.../trunk)	(revision 224461)
+++ gcc/config/s390/s390.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -8587,7 +8587,7 @@ s390_output_pool_entry (rtx exp, machine
       gcc_assert (GET_CODE (exp) == CONST_DOUBLE);
 
       REAL_VALUE_FROM_CONST_DOUBLE (r, exp);
-      assemble_real (r, mode, align);
+      assemble_real (r, mode, align, false);
       break;
 
     case MODE_INT:
Index: gcc/config/sh/sh.md
===================================================================
--- gcc/config/sh/sh.md	(.../trunk)	(revision 224461)
+++ gcc/config/sh/sh.md	(.../branches/scalar-storage-order)	(revision 224467)
@@ -12382,7 +12382,7 @@ (define_insn "consttable_sf"
     {
       REAL_VALUE_TYPE d;
       REAL_VALUE_FROM_CONST_DOUBLE (d, operands[0]);
-      assemble_real (d, SFmode, GET_MODE_ALIGNMENT (SFmode));
+      assemble_real (d, SFmode, GET_MODE_ALIGNMENT (SFmode), false);
     }
   return "";
 }
@@ -12400,7 +12400,7 @@ (define_insn "consttable_df"
     {
       REAL_VALUE_TYPE d;
       REAL_VALUE_FROM_CONST_DOUBLE (d, operands[0]);
-      assemble_real (d, DFmode, GET_MODE_ALIGNMENT (DFmode));
+      assemble_real (d, DFmode, GET_MODE_ALIGNMENT (DFmode), false);
     }
   return "";
 }
Index: gcc/config/arm/arm.c
===================================================================
--- gcc/config/arm/arm.c	(.../trunk)	(revision 224461)
+++ gcc/config/arm/arm.c	(.../branches/scalar-storage-order)	(revision 224467)
@@ -22325,7 +22325,7 @@ arm_assemble_integer (rtx x, unsigned in
 
             assemble_real
               (rval, GET_MODE_INNER (mode),
-              i == 0 ? BIGGEST_ALIGNMENT : size * BITS_PER_UNIT);
+              i == 0 ? BIGGEST_ALIGNMENT : size * BITS_PER_UNIT, false);
           }
 
       return true;
Index: gcc/config/arm/arm.md
===================================================================
--- gcc/config/arm/arm.md	(.../trunk)	(revision 224461)
+++ gcc/config/arm/arm.md	(.../branches/scalar-storage-order)	(revision 224467)
@@ -10743,7 +10743,7 @@ (define_insn "consttable_4"
 	{
 	  REAL_VALUE_TYPE r;
 	  REAL_VALUE_FROM_CONST_DOUBLE (r, x);
-	  assemble_real (r, GET_MODE (x), BITS_PER_WORD);
+	  assemble_real (r, GET_MODE (x), BITS_PER_WORD, false);
 	  break;
 	}
       default:
@@ -10776,7 +10776,7 @@ (define_insn "consttable_8"
         {
           REAL_VALUE_TYPE r;
           REAL_VALUE_FROM_CONST_DOUBLE (r, operands[0]);
-          assemble_real (r, GET_MODE (operands[0]), BITS_PER_WORD);
+          assemble_real (r, GET_MODE (operands[0]), BITS_PER_WORD, false);
           break;
         }
       default:
@@ -10801,7 +10801,7 @@ (define_insn "consttable_16"
         {
           REAL_VALUE_TYPE r;
           REAL_VALUE_FROM_CONST_DOUBLE (r, operands[0]);
-          assemble_real (r, GET_MODE (operands[0]), BITS_PER_WORD);
+          assemble_real (r, GET_MODE (operands[0]), BITS_PER_WORD, false);
           break;
         }
       default:
Index: gcc/config/mips/mips.md
===================================================================
--- gcc/config/mips/mips.md	(.../trunk)	(revision 224461)
+++ gcc/config/mips/mips.md	(.../branches/scalar-storage-order)	(revision 224467)
@@ -7214,7 +7214,7 @@ (define_insn "consttable_float"
   gcc_assert (GET_CODE (operands[0]) == CONST_DOUBLE);
   REAL_VALUE_FROM_CONST_DOUBLE (d, operands[0]);
   assemble_real (d, GET_MODE (operands[0]),
-		 GET_MODE_BITSIZE (GET_MODE (operands[0])));
+		 GET_MODE_BITSIZE (GET_MODE (operands[0])), false);
   return "";
 }
   [(set (attr "length")
