This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.
Refine check for using bitfield instructions
- From: kenner at vlsi1 dot ultra dot nyu dot edu (Richard Kenner)
- To: gcc-patches at gcc dot gnu dot org
- Date: Mon, 5 May 03 14:02:28 EDT
- Subject: Refine check for using bitfield instructions
Unfortunately, I can't find the test case for this change: it was
part of other changes that I split off.
Tested on i686-pc-linux-gnu.
Mon Jun 17 16:02:24 2002 Olivier Hainque <hainque@act-europe.fr>
* expr.c (expand_expr, case BIT_FIELD_REF): Refine the test forcing
usage of bitfield instructions for mode1 != BLKmode, only ignoring
SLOW_UNALIGNED_ACCESS if the field is not byte aligned.
(store_field): Likewise.
*** gcc/expr.c.ori Mon Jun 17 09:17:05 2002
--- gcc/expr.c Mon Jun 17 09:15:35 2002
*************** store_field (target, bitsize, bitpos, mo
*** 5110,5118 ****
/* If the field isn't aligned enough to store as an ordinary memref,
store it as a bit field. */
|| (mode != BLKmode
! && ((SLOW_UNALIGNED_ACCESS (mode, MEM_ALIGN (target))
! && (MEM_ALIGN (target) < GET_MODE_ALIGNMENT (mode)))
! || bitpos % GET_MODE_ALIGNMENT (mode)))
/* If the RHS and field are a constant size and the size of the
RHS isn't the same size as the bitfield, we must use bitfield
operations. */
--- 5110,5119 ----
/* If the field isn't aligned enough to store as an ordinary memref,
store it as a bit field. */
|| (mode != BLKmode
! && ((((MEM_ALIGN (target) < GET_MODE_ALIGNMENT (mode))
! || bitpos % GET_MODE_ALIGNMENT (mode))
! && SLOW_UNALIGNED_ACCESS (mode, MEM_ALIGN (target)))
! || (bitpos % BITS_PER_UNIT != 0)))
/* If the RHS and field are a constant size and the size of the
RHS isn't the same size as the bitfield, we must use bitfield
operations. */
*************** expand_expr (exp, target, tmode, modifie
*** 7009,7017 ****
/* If the field isn't aligned enough to fetch as a memref,
fetch it as a bit field. */
|| (mode1 != BLKmode
! && ((TYPE_ALIGN (TREE_TYPE (tem)) < GET_MODE_ALIGNMENT (mode)
&& SLOW_UNALIGNED_ACCESS (mode1, MEM_ALIGN (op0)))
! || (bitpos % GET_MODE_ALIGNMENT (mode) != 0)))
/* If the type and the field are a constant size and the
size of the type isn't the same size as the bitfield,
we must use bitfield operations. */
--- 7010,7019 ----
/* If the field isn't aligned enough to fetch as a memref,
fetch it as a bit field. */
|| (mode1 != BLKmode
! && (((TYPE_ALIGN (TREE_TYPE (tem)) < GET_MODE_ALIGNMENT (mode)
! || (bitpos % GET_MODE_ALIGNMENT (mode) != 0))
&& SLOW_UNALIGNED_ACCESS (mode1, MEM_ALIGN (op0)))
! || (bitpos % BITS_PER_UNIT != 0)))
/* If the type and the field are a constant size and the
size of the type isn't the same size as the bitfield,
we must use bitfield operations. */