This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.



[PATCH] Fix PRs 52080, 52097 and 48124, rewrite bitfield expansion, enable the C++ memory model wrt bitfields everywhere


This fixes PR52080 (unexpected access to adjacent fields for bitfield
access), PR52097 (ICE with the C++ memory model enabled) and PR48124
(wrong code accessing adjacent objects).  It is a largely unchanged
re-post of the RFCs I sent out during stage4 (the difference being
that it now re-uses the pointer for DECL_QUALIFIER instead of
enlarging each FIELD_DECL).

The core of the patch is the computation of an underlying FIELD_DECL
that should be used for the access of bitfield FIELD_DECLs (or at least
constrain such access).  This computation is done when laying out
a RECORD_TYPE, for simplicity after it has been laid out.  This
FIELD_DECL is currently only used to constrain bitfield accesses,
so it employs a 1:1 mapping.  I plan to change this to a 1:n mapping
(one bitfield group, multiple possibly overlapping representatives)
when resuming work on lowering bitfield accesses on the GIMPLE level
(to address PR48696).

This patch also completely replaces get_bit_range (which is where
PR52097 ICEs) with a trivial implementation.

Because of PR52134 this patch causes one gnat regression.

Bootstrapped and tested on 
{x86_64,i586,s390,s390x,ia64,ppc,ppc64}-suse-linux-gnu.

If there are no serious objections (or reasonable requests), I plan
to go forward with this version early next week.

If you have an unusual target, it might be good to double-check
code-generation quality for bitfield stores (something you should
already have done when enabling the C++ memory model).

Thanks,
Richard.

2012-03-05  Richard Guenther  <rguenther@suse.de>

	* tree.h (DECL_BIT_FIELD_REPRESENTATIVE): New define.
	* stor-layout.c (start_bitfield_representative): New function.
	(finish_bitfield_representative): Likewise.
	(finish_bitfield_layout): Likewise.
	(finish_record_layout): Call finish_bitfield_layout.
	* tree-streamer-in.c (lto_input_ts_field_decl_tree_pointers):
	Stream DECL_BIT_FIELD_REPRESENTATIVE.
	* tree-streamer-out.c (write_ts_field_decl_tree_pointers): Likewise.

	PR middle-end/52080
	PR middle-end/52097
	PR middle-end/48124
	* expr.c (get_bit_range): Unconditionally extract bitrange
	from DECL_BIT_FIELD_REPRESENTATIVE.
	(expand_assignment): Adjust call to get_bit_range.

	* gimplify.c (gimplify_expr): Translate bitfield accesses
	to BIT_FIELD_REFs of the representative.

	* tree-sra.c (create_access_replacement): Only rename the
	replacement if we can rewrite it into SSA form.  Properly
	mark register typed replacements that we cannot rewrite
	with TREE_ADDRESSABLE.

	* gcc.dg/torture/pr48124-1.c: New testcase.
	* gcc.dg/torture/pr48124-2.c: Likewise.
	* gcc.dg/torture/pr48124-3.c: Likewise.
	* gcc.dg/torture/pr48124-4.c: Likewise.

Index: gcc/stor-layout.c
===================================================================
*** gcc/stor-layout.c.orig	2012-02-21 12:59:35.000000000 +0100
--- gcc/stor-layout.c	2012-02-24 12:53:11.000000000 +0100
*************** finalize_type_size (tree type)
*** 1722,1727 ****
--- 1722,1908 ----
      }
  }
  
+ /* Return a new underlying object for a bitfield started with FIELD.  */
+ 
+ static tree
+ start_bitfield_representative (tree field)
+ {
+   tree repr = make_node (FIELD_DECL);
+   DECL_FIELD_OFFSET (repr) = DECL_FIELD_OFFSET (field);
+   /* Force the representative to begin at a BITS_PER_UNIT aligned
+      boundary - C++ may use tail-padding of a base object to
+      continue packing bits so the bitfield region does not start
+      at bit zero (see g++.dg/abi/bitfield5.C for example).
+      Unallocated bits may happen for other reasons as well,
+      for example Ada which allows explicit bit-granular structure layout.  */
+   DECL_FIELD_BIT_OFFSET (repr)
+     = size_binop (BIT_AND_EXPR,
+ 		  DECL_FIELD_BIT_OFFSET (field),
+ 		  bitsize_int (~(BITS_PER_UNIT - 1)));
+   SET_DECL_OFFSET_ALIGN (repr, DECL_OFFSET_ALIGN (field));
+   DECL_SIZE (repr) = DECL_SIZE (field);
+   DECL_PACKED (repr) = DECL_PACKED (field);
+   DECL_CONTEXT (repr) = DECL_CONTEXT (field);
+   return repr;
+ }
+ 
+ /* Finish up a bitfield group that was started by creating the underlying
+    object REPR with the last field in the bitfield group FIELD.  */
+ 
+ static void
+ finish_bitfield_representative (tree repr, tree field)
+ {
+   unsigned HOST_WIDE_INT bitsize, maxbitsize, modesize;
+   enum machine_mode mode = BLKmode;
+   tree nextf, size;
+ 
+   size = size_diffop (DECL_FIELD_OFFSET (field),
+ 		      DECL_FIELD_OFFSET (repr));
+   gcc_assert (host_integerp (size, 1));
+   bitsize = (tree_low_cst (size, 1) * BITS_PER_UNIT
+ 	     + tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 1)
+ 	     - tree_low_cst (DECL_FIELD_BIT_OFFSET (repr), 1)
+ 	     + tree_low_cst (DECL_SIZE (field), 1));
+ 
+   /* Now nothing tells us how to pad out bitsize ...  */
+   nextf = DECL_CHAIN (field);
+   while (nextf && TREE_CODE (nextf) != FIELD_DECL)
+     nextf = DECL_CHAIN (nextf);
+   if (nextf)
+     {
+       tree maxsize;
+       /* If there was an error, the field may not be laid out
+          correctly.  Don't bother to do anything.  */
+       if (TREE_TYPE (nextf) == error_mark_node)
+ 	return;
+       maxsize = size_diffop (DECL_FIELD_OFFSET (nextf),
+ 			     DECL_FIELD_OFFSET (repr));
+       gcc_assert (host_integerp (maxsize, 1));
+       maxbitsize = (tree_low_cst (maxsize, 1) * BITS_PER_UNIT
+ 		    + tree_low_cst (DECL_FIELD_BIT_OFFSET (nextf), 1)
+ 		    - tree_low_cst (DECL_FIELD_BIT_OFFSET (repr), 1));
+     }
+   else
+     {
+       /* ???  If you consider that tail-padding of this struct might be
+          re-used when deriving from it, we cannot really do the following
+ 	 and thus need to set maxsize to bitsize?  */
+       tree maxsize = size_diffop (TYPE_SIZE_UNIT (DECL_CONTEXT (field)),
+ 				  DECL_FIELD_OFFSET (repr));
+       gcc_assert (host_integerp (maxsize, 1));
+       maxbitsize = (tree_low_cst (maxsize, 1) * BITS_PER_UNIT
+ 		    - tree_low_cst (DECL_FIELD_BIT_OFFSET (repr), 1));
+     }
+ 
+   /* Only if we don't artificially break up the representative in
+      the middle of a large bitfield with different possibly
+      overlapping representatives.  And all representatives start
+      at a byte offset.  */
+   gcc_assert (maxbitsize % BITS_PER_UNIT == 0);
+ 
+   /* Round up bitsize to multiples of BITS_PER_UNIT.  */
+   bitsize = (bitsize + BITS_PER_UNIT - 1) & ~(BITS_PER_UNIT - 1);
+ 
+   /* Find the smallest nice mode to use.
+      ???  Possibly use get_best_mode with appropriate arguments instead
+      (which would eventually require splitting representatives here).  */
+   for (modesize = bitsize; modesize <= maxbitsize; modesize += BITS_PER_UNIT)
+     {
+       mode = mode_for_size (modesize, MODE_INT, 1);
+       if (mode != BLKmode)
+ 	break;
+     }
+ 
+   if (mode == BLKmode)
+     {
+       /* We really want a BLKmode representative only as a last resort,
+          considering the member b in
+ 	   struct { int a : 7; int b : 17; int c; } __attribute__((packed));
+ 	 Otherwise we simply want to split the representative up
+ 	 allowing for overlaps within the bitfield region as required for
+ 	   struct { int a : 7; int b : 7; int c : 10; int d; } __attribute__((packed));
+ 	 [0, 15] HImode for a and b, [8, 23] HImode for c.  */
+       DECL_SIZE (repr) = bitsize_int (bitsize);
+       DECL_SIZE_UNIT (repr) = size_int (bitsize / BITS_PER_UNIT);
+       DECL_MODE (repr) = BLKmode;
+       TREE_TYPE (repr) = build_array_type_nelts (unsigned_char_type_node,
+ 						 bitsize / BITS_PER_UNIT);
+     }
+   else
+     {
+       DECL_SIZE (repr) = bitsize_int (modesize);
+       DECL_SIZE_UNIT (repr) = size_int (modesize / BITS_PER_UNIT);
+       DECL_MODE (repr) = mode;
+       TREE_TYPE (repr) = lang_hooks.types.type_for_mode (mode, 1);
+     }
+ 
+   /* Remember whether the bitfield group is at the end of the
+      structure or not.  */
+   DECL_CHAIN (repr) = nextf;
+ }
+ 
+ /* Compute and set FIELD_DECLs for the underlying objects we should
+    use for bitfield access for the structure laid out with RLI.  */
+ 
+ static void
+ finish_bitfield_layout (record_layout_info rli)
+ {
+   tree field, prev;
+   tree repr = NULL_TREE;
+ 
+   /* Unions would be special: for the ease of type-punning optimizations
+      we could use the underlying type as hint for the representative
+      if the bitfield would fit and the representative would not exceed
+      the union in size.  */
+   if (TREE_CODE (rli->t) != RECORD_TYPE)
+     return;
+ 
+   for (prev = NULL_TREE, field = TYPE_FIELDS (rli->t);
+        field; field = DECL_CHAIN (field))
+     {
+       if (TREE_CODE (field) != FIELD_DECL)
+ 	continue;
+ 
+       /* In the C++ memory model, consecutive bit fields in a structure are
+ 	 considered one memory location and updating a memory location
+ 	 may not store into adjacent memory locations.  */
+       if (!repr
+ 	  && DECL_BIT_FIELD_TYPE (field))
+ 	{
+ 	  /* Start new representative.  */
+ 	  repr = start_bitfield_representative (field);
+ 	}
+       else if (repr
+ 	       && ! DECL_BIT_FIELD_TYPE (field))
+ 	{
+ 	  /* Finish off new representative.  */
+ 	  finish_bitfield_representative (repr, prev);
+ 	  repr = NULL_TREE;
+ 	}
+       else if (DECL_BIT_FIELD_TYPE (field))
+ 	{
+ 	  /* Zero-size bitfields finish off a representative and
+ 	     do not have a representative themselves.  This is
+ 	     required by the C++ memory model.  */
+ 	  if (integer_zerop (DECL_SIZE (field)))
+ 	    {
+ 	      finish_bitfield_representative (repr, prev);
+ 	      repr = NULL_TREE;
+ 	    }
+ 	}
+       else
+ 	continue;
+ 
+       if (repr)
+ 	DECL_BIT_FIELD_REPRESENTATIVE (field) = repr;
+ 
+       prev = field;
+     }
+ 
+   if (repr)
+     finish_bitfield_representative (repr, prev);
+ }
+ 
  /* Do all of the work required to layout the type indicated by RLI,
     once the fields have been laid out.  This function will call `free'
     for RLI, unless FREE_P is false.  Passing a value other than false
*************** finish_record_layout (record_layout_info
*** 1742,1747 ****
--- 1923,1931 ----
    /* Perform any last tweaks to the TYPE_SIZE, etc.  */
    finalize_type_size (rli->t);
  
+   /* Compute bitfield representatives.  */
+   finish_bitfield_layout (rli);
+ 
    /* Propagate TYPE_PACKED to variants.  With C++ templates,
       handle_packed_attribute is too early to do this.  */
    for (variant = TYPE_NEXT_VARIANT (rli->t); variant;
Index: gcc/tree.h
===================================================================
*** gcc/tree.h.orig	2012-02-21 12:59:35.000000000 +0100
--- gcc/tree.h	2012-02-24 12:52:39.000000000 +0100
*************** struct GTY(()) tree_decl_with_rtl {
*** 3021,3026 ****
--- 3021,3031 ----
  #define DECL_BIT_FIELD_TYPE(NODE) \
    (FIELD_DECL_CHECK (NODE)->field_decl.bit_field_type)
  
+ /* In a FIELD_DECL of a RECORD_TYPE, this is a pointer to the storage
+    representative FIELD_DECL.  */
+ #define DECL_BIT_FIELD_REPRESENTATIVE(NODE) \
+   (FIELD_DECL_CHECK (NODE)->field_decl.qualifier)
+ 
  /* For a FIELD_DECL in a QUAL_UNION_TYPE, records the expression, which
     if nonzero, indicates that the field occupies the type.  */
  #define DECL_QUALIFIER(NODE) (FIELD_DECL_CHECK (NODE)->field_decl.qualifier)
Index: gcc/tree-sra.c
===================================================================
*** gcc/tree-sra.c.orig	2012-02-21 12:59:35.000000000 +0100
--- gcc/tree-sra.c	2012-02-24 12:52:39.000000000 +0100
*************** create_access_replacement (struct access
*** 1922,1934 ****
  
    repl = create_tmp_var (access->type, "SR");
    add_referenced_var (repl);
!   if (rename)
      mark_sym_for_renaming (repl);
  
!   if (!access->grp_partial_lhs
!       && (TREE_CODE (access->type) == COMPLEX_TYPE
! 	  || TREE_CODE (access->type) == VECTOR_TYPE))
!     DECL_GIMPLE_REG_P (repl) = 1;
  
    DECL_SOURCE_LOCATION (repl) = DECL_SOURCE_LOCATION (access->base);
    DECL_ARTIFICIAL (repl) = 1;
--- 1922,1940 ----
  
    repl = create_tmp_var (access->type, "SR");
    add_referenced_var (repl);
!   if (!access->grp_partial_lhs
!       && rename)
      mark_sym_for_renaming (repl);
  
!   if (TREE_CODE (access->type) == COMPLEX_TYPE
!       || TREE_CODE (access->type) == VECTOR_TYPE)
!     {
!       if (!access->grp_partial_lhs)
! 	DECL_GIMPLE_REG_P (repl) = 1;
!     }
!   else if (access->grp_partial_lhs
! 	   && is_gimple_reg_type (access->type))
!     TREE_ADDRESSABLE (repl) = 1;
  
    DECL_SOURCE_LOCATION (repl) = DECL_SOURCE_LOCATION (access->base);
    DECL_ARTIFICIAL (repl) = 1;
Index: gcc/tree-cfg.c
===================================================================
*** gcc/tree-cfg.c.orig	2012-02-24 12:29:18.000000000 +0100
--- gcc/tree-cfg.c	2012-02-24 12:52:39.000000000 +0100
*************** verify_expr (tree *tp, int *walk_subtree
*** 2879,2895 ****
  		  error ("invalid position or size operand to BIT_FIELD_REF");
  		  return t;
  		}
! 	      else if (INTEGRAL_TYPE_P (TREE_TYPE (t))
! 		       && (TYPE_PRECISION (TREE_TYPE (t))
! 			   != TREE_INT_CST_LOW (TREE_OPERAND (t, 1))))
  		{
  		  error ("integral result type precision does not match "
  			 "field size of BIT_FIELD_REF");
  		  return t;
  		}
! 	      if (!INTEGRAL_TYPE_P (TREE_TYPE (t))
! 		  && (GET_MODE_PRECISION (TYPE_MODE (TREE_TYPE (t)))
! 		      != TREE_INT_CST_LOW (TREE_OPERAND (t, 1))))
  		{
  		  error ("mode precision of non-integral result does not "
  			 "match field size of BIT_FIELD_REF");
--- 2879,2897 ----
  		  error ("invalid position or size operand to BIT_FIELD_REF");
  		  return t;
  		}
! 	      if (INTEGRAL_TYPE_P (TREE_TYPE (t))
! 		  && (TYPE_PRECISION (TREE_TYPE (t))
! 		      != TREE_INT_CST_LOW (TREE_OPERAND (t, 1))))
  		{
  		  error ("integral result type precision does not match "
  			 "field size of BIT_FIELD_REF");
  		  return t;
  		}
! 	      else if (!INTEGRAL_TYPE_P (TREE_TYPE (t))
! 		       && !AGGREGATE_TYPE_P (TREE_TYPE (t))
! 		       && TYPE_MODE (TREE_TYPE (t)) != BLKmode
! 		       && (GET_MODE_PRECISION (TYPE_MODE (TREE_TYPE (t)))
! 			   != TREE_INT_CST_LOW (TREE_OPERAND (t, 1))))
  		{
  		  error ("mode precision of non-integral result does not "
  			 "match field size of BIT_FIELD_REF");
Index: gcc/expr.c
===================================================================
*** gcc/expr.c.orig	2012-02-21 12:59:35.000000000 +0100
--- gcc/expr.c	2012-02-24 12:52:39.000000000 +0100
*************** optimize_bitfield_assignment_op (unsigne
*** 4439,4551 ****
  /* In the C++ memory model, consecutive bit fields in a structure are
     considered one memory location.
  
!    Given a COMPONENT_REF, this function returns the bit range of
!    consecutive bits in which this COMPONENT_REF belongs in.  The
!    values are returned in *BITSTART and *BITEND.  If either the C++
!    memory model is not activated, or this memory access is not thread
!    visible, 0 is returned in *BITSTART and *BITEND.
! 
!    EXP is the COMPONENT_REF.
!    INNERDECL is the actual object being referenced.
!    BITPOS is the position in bits where the bit starts within the structure.
!    BITSIZE is size in bits of the field being referenced in EXP.
! 
!    For example, while storing into FOO.A here...
! 
!       struct {
!         BIT 0:
!           unsigned int a : 4;
! 	  unsigned int b : 1;
! 	BIT 8:
! 	  unsigned char c;
! 	  unsigned int d : 6;
!       } foo;
! 
!    ...we are not allowed to store past <b>, so for the layout above, a
!    range of 0..7 (because no one cares if we store into the
!    padding).  */
  
  static void
  get_bit_range (unsigned HOST_WIDE_INT *bitstart,
  	       unsigned HOST_WIDE_INT *bitend,
! 	       tree exp, tree innerdecl,
! 	       HOST_WIDE_INT bitpos, HOST_WIDE_INT bitsize)
  {
!   tree field, record_type, fld;
!   bool found_field = false;
!   bool prev_field_is_bitfield;
  
    gcc_assert (TREE_CODE (exp) == COMPONENT_REF);
  
!   /* If other threads can't see this value, no need to restrict stores.  */
!   if (ALLOW_STORE_DATA_RACES
!       || ((TREE_CODE (innerdecl) == MEM_REF
! 	   || TREE_CODE (innerdecl) == TARGET_MEM_REF)
! 	  && !ptr_deref_may_alias_global_p (TREE_OPERAND (innerdecl, 0)))
!       || (DECL_P (innerdecl)
! 	  && ((TREE_CODE (innerdecl) == VAR_DECL
! 	       && DECL_THREAD_LOCAL_P (innerdecl))
! 	      || !TREE_STATIC (innerdecl))))
      {
        *bitstart = *bitend = 0;
        return;
      }
  
!   /* Bit field we're storing into.  */
!   field = TREE_OPERAND (exp, 1);
!   record_type = DECL_FIELD_CONTEXT (field);
! 
!   /* Count the contiguous bitfields for the memory location that
!      contains FIELD.  */
!   *bitstart = 0;
!   prev_field_is_bitfield = true;
!   for (fld = TYPE_FIELDS (record_type); fld; fld = DECL_CHAIN (fld))
!     {
!       tree t, offset;
!       enum machine_mode mode;
!       int unsignedp, volatilep;
! 
!       if (TREE_CODE (fld) != FIELD_DECL)
! 	continue;
! 
!       t = build3 (COMPONENT_REF, TREE_TYPE (exp),
! 		  unshare_expr (TREE_OPERAND (exp, 0)),
! 		  fld, NULL_TREE);
!       get_inner_reference (t, &bitsize, &bitpos, &offset,
! 			   &mode, &unsignedp, &volatilep, true);
! 
!       if (field == fld)
! 	found_field = true;
! 
!       if (DECL_BIT_FIELD_TYPE (fld) && bitsize > 0)
! 	{
! 	  if (prev_field_is_bitfield == false)
! 	    {
! 	      *bitstart = bitpos;
! 	      prev_field_is_bitfield = true;
! 	    }
! 	}
!       else
! 	{
! 	  prev_field_is_bitfield = false;
! 	  if (found_field)
! 	    break;
! 	}
!     }
!   gcc_assert (found_field);
  
!   if (fld)
!     {
!       /* We found the end of the bit field sequence.  Include the
! 	 padding up to the next field and be done.  */
!       *bitend = bitpos - 1;
!     }
!   else
!     {
!       /* If this is the last element in the structure, include the padding
! 	 at the end of structure.  */
!       *bitend = TREE_INT_CST_LOW (TYPE_SIZE (record_type)) - 1;
!     }
  }
  
  /* Returns true if the MEM_REF REF refers to an object that does not
--- 4439,4481 ----
  /* In the C++ memory model, consecutive bit fields in a structure are
     considered one memory location.
  
!    Given a COMPONENT_REF EXP at bit position BITPOS, this function
!    returns the bit range of consecutive bits in which this COMPONENT_REF
!    belongs.  The values are returned in *BITSTART and *BITEND.
!    If the access does not need to be restricted, 0 is returned in
!    *BITSTART and *BITEND.  */
  
  static void
  get_bit_range (unsigned HOST_WIDE_INT *bitstart,
  	       unsigned HOST_WIDE_INT *bitend,
! 	       tree exp,
! 	       HOST_WIDE_INT bitpos)
  {
!   unsigned HOST_WIDE_INT bitoffset;
!   tree field, repr, offset;
  
    gcc_assert (TREE_CODE (exp) == COMPONENT_REF);
  
!   field = TREE_OPERAND (exp, 1);
!   repr = DECL_BIT_FIELD_REPRESENTATIVE (field);
!   /* If we do not have a DECL_BIT_FIELD_REPRESENTATIVE there is no
!      need to limit the range we can access.  */
!   if (!repr)
      {
        *bitstart = *bitend = 0;
        return;
      }
  
!   /* Compute the adjustment to bitpos from the offset of the field
!      relative to the representative.  */
!   offset = size_diffop (DECL_FIELD_OFFSET (field),
! 			DECL_FIELD_OFFSET (repr));
!   bitoffset = (tree_low_cst (offset, 1) * BITS_PER_UNIT
! 	       + tree_low_cst (DECL_FIELD_BIT_OFFSET (field), 1)
! 	       - tree_low_cst (DECL_FIELD_BIT_OFFSET (repr), 1));
  
!   *bitstart = bitpos - bitoffset;
!   *bitend = *bitstart + tree_low_cst (DECL_SIZE (repr), 1) - 1;
  }
  
  /* Returns true if the MEM_REF REF refers to an object that does not
*************** expand_assignment (tree to, tree from, b
*** 4673,4680 ****
  
        if (TREE_CODE (to) == COMPONENT_REF
  	  && DECL_BIT_FIELD_TYPE (TREE_OPERAND (to, 1)))
! 	get_bit_range (&bitregion_start, &bitregion_end,
! 		       to, tem, bitpos, bitsize);
  
        /* If we are going to use store_bit_field and extract_bit_field,
  	 make sure to_rtx will be safe for multiple use.  */
--- 4603,4609 ----
  
        if (TREE_CODE (to) == COMPONENT_REF
  	  && DECL_BIT_FIELD_TYPE (TREE_OPERAND (to, 1)))
! 	get_bit_range (&bitregion_start, &bitregion_end, to, bitpos);
  
        /* If we are going to use store_bit_field and extract_bit_field,
  	 make sure to_rtx will be safe for multiple use.  */
Index: gcc/testsuite/gcc.dg/torture/pr48124-1.c
===================================================================
*** /dev/null	1970-01-01 00:00:00.000000000 +0000
--- gcc/testsuite/gcc.dg/torture/pr48124-1.c	2012-02-24 12:52:39.000000000 +0100
***************
*** 0 ****
--- 1,33 ----
+ /* { dg-do run } */
+ /* { dg-options "-fno-toplevel-reorder" } */
+ 
+ extern void abort (void);
+ 
+ struct S
+ {
+   signed a : 26;
+   signed b : 16;
+   signed c : 10;
+   volatile signed d : 14;
+ };
+ 
+ static struct S e = { 0, 0, 0, 1 };
+ static int f = 1;
+ 
+ void __attribute__((noinline))
+ foo (void)
+ {
+   e.d = 0;
+   f = 2;
+ }
+ 
+ int
+ main ()
+ {
+   if (e.a || e.b || e.c || e.d != 1 || f != 1)
+     abort ();
+   foo ();
+   if (e.a || e.b || e.c || e.d || f != 2)
+     abort ();
+   return 0;
+ }
Index: gcc/testsuite/gcc.dg/torture/pr48124-2.c
===================================================================
*** /dev/null	1970-01-01 00:00:00.000000000 +0000
--- gcc/testsuite/gcc.dg/torture/pr48124-2.c	2012-02-24 12:52:39.000000000 +0100
***************
*** 0 ****
--- 1,27 ----
+ /* { dg-do run } */
+ 
+ extern void abort (void);
+ 
+ static volatile struct S0 {
+     short f3[9];
+     unsigned f8 : 15;
+ } s = {1};
+ static unsigned short sh = 0x1234;
+ 
+ struct S0 a, b;
+ int vi = 0;
+ 
+ void func_4()
+ {
+   s.f8 |= 1;
+   sh = 15;
+   if (vi) a = b;
+ }
+ 
+ int main()
+ {
+   func_4();
+   if (sh != 15)
+     abort ();
+   return 0;
+ }
Index: gcc/testsuite/gcc.dg/torture/pr48124-3.c
===================================================================
*** /dev/null	1970-01-01 00:00:00.000000000 +0000
--- gcc/testsuite/gcc.dg/torture/pr48124-3.c	2012-02-24 12:52:39.000000000 +0100
***************
*** 0 ****
--- 1,32 ----
+ /* { dg-do run } */
+ 
+ extern void abort (void);
+ struct S1
+ {
+   int f0;
+   int:1;
+   int f3;
+   int:1;
+   int:0;
+   int f6:1;
+ };
+ int g_13 = 1;
+ volatile struct S1 g_118 = {
+     1
+ };
+ 
+ void __attribute__((noinline))
+ func_46 ()
+ {
+   for (g_13 = 0; g_13 >= 0; g_13 -= 1)
+     g_118.f6 = 0;
+ }
+ 
+ int
+ main ()
+ {
+   func_46 ();
+   if (g_13 != -1)
+     abort ();
+   return 0;
+ }
Index: gcc/testsuite/gcc.dg/torture/pr48124-4.c
===================================================================
*** /dev/null	1970-01-01 00:00:00.000000000 +0000
--- gcc/testsuite/gcc.dg/torture/pr48124-4.c	2012-02-24 12:52:39.000000000 +0100
***************
*** 0 ****
--- 1,28 ----
+ /* { dg-do run } */
+ 
+ extern void abort (void);
+ struct S1 {
+     unsigned f0, f1;
+     unsigned short f2, f3;
+     unsigned f4 : 16;
+     unsigned f5, f6;
+     volatile unsigned f7 : 28;
+ };
+ static struct S1 g_76;
+ static struct S1 g_245 = {0,0,0,0,0,0,0,1};
+ static signed char g_323 = 0x80;
+ static void func_1(void)
+ {
+   g_245.f7 &= 1;
+   for (g_323 = 0; g_323 <= -1; g_323 -= 2) {
+       g_76 = g_76;
+       g_76.f4 ^= 11;
+   }
+ }
+ int main()
+ {
+   func_1();
+   if (g_323 != 0 || g_245.f7 != 1)
+     abort ();
+   return 0;
+ }
Index: gcc/tree-streamer-in.c
===================================================================
*** gcc/tree-streamer-in.c.orig	2012-02-21 12:59:35.000000000 +0100
--- gcc/tree-streamer-in.c	2012-02-24 12:52:39.000000000 +0100
*************** lto_input_ts_field_decl_tree_pointers (s
*** 640,646 ****
  {
    DECL_FIELD_OFFSET (expr) = stream_read_tree (ib, data_in);
    DECL_BIT_FIELD_TYPE (expr) = stream_read_tree (ib, data_in);
!   /* Do not stream DECL_QUALIFIER, it is useless after gimplification.  */
    DECL_FIELD_BIT_OFFSET (expr) = stream_read_tree (ib, data_in);
    DECL_FCONTEXT (expr) = stream_read_tree (ib, data_in);
  }
--- 640,646 ----
  {
    DECL_FIELD_OFFSET (expr) = stream_read_tree (ib, data_in);
    DECL_BIT_FIELD_TYPE (expr) = stream_read_tree (ib, data_in);
!   DECL_BIT_FIELD_REPRESENTATIVE (expr) = stream_read_tree (ib, data_in);
    DECL_FIELD_BIT_OFFSET (expr) = stream_read_tree (ib, data_in);
    DECL_FCONTEXT (expr) = stream_read_tree (ib, data_in);
  }
Index: gcc/tree-streamer-out.c
===================================================================
*** gcc/tree-streamer-out.c.orig	2012-02-21 12:59:35.000000000 +0100
--- gcc/tree-streamer-out.c	2012-02-24 12:52:39.000000000 +0100
*************** write_ts_field_decl_tree_pointers (struc
*** 552,558 ****
  {
    stream_write_tree (ob, DECL_FIELD_OFFSET (expr), ref_p);
    stream_write_tree (ob, DECL_BIT_FIELD_TYPE (expr), ref_p);
!   /* Do not stream DECL_QUALIFIER, it is useless after gimplification.  */
    stream_write_tree (ob, DECL_FIELD_BIT_OFFSET (expr), ref_p);
    stream_write_tree (ob, DECL_FCONTEXT (expr), ref_p);
  }
--- 552,558 ----
  {
    stream_write_tree (ob, DECL_FIELD_OFFSET (expr), ref_p);
    stream_write_tree (ob, DECL_BIT_FIELD_TYPE (expr), ref_p);
!   stream_write_tree (ob, DECL_BIT_FIELD_REPRESENTATIVE (expr), ref_p);
    stream_write_tree (ob, DECL_FIELD_BIT_OFFSET (expr), ref_p);
    stream_write_tree (ob, DECL_FCONTEXT (expr), ref_p);
  }

