This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.
[PATCH][12/n] Remove GENERIC stmt combining from SCCVN
- From: Richard Biener <rguenther at suse dot de>
- To: gcc-patches at gcc dot gnu dot org
- Date: Thu, 2 Jul 2015 16:08:52 +0200 (CEST)
- Subject: [PATCH][12/n] Remove GENERIC stmt combining from SCCVN
- Authentication-results: sourceware.org; auth=none
This moves the alignment folding to a match.pd pattern (it's
surprising how often the old one triggered via SCCVN stmt combining).
Bootstrap and regtest running on x86_64-unknown-linux-gnu.
Richard.
2015-07-02 Richard Biener <rguenther@suse.de>
* fold-const.c (fold_binary_loc): Move (T)ptr & CST folding...
* match.pd: ... here.
Index: gcc/fold-const.c
===================================================================
--- gcc/fold-const.c (revision 225312)
+++ gcc/fold-const.c (working copy)
@@ -11069,25 +10729,6 @@ fold_binary_loc (location_t loc,
fold_convert_loc (loc, type, TREE_OPERAND (arg0, 0));
}
- /* If arg0 is derived from the address of an object or function, we may
- be able to fold this expression using the object or function's
- alignment. */
- if (POINTER_TYPE_P (TREE_TYPE (arg0)) && TREE_CODE (arg1) == INTEGER_CST)
- {
- unsigned int align;
- unsigned HOST_WIDE_INT bitpos;
-
- get_pointer_alignment_1 (arg0, &align, &bitpos);
-
- /* This works because modulus is a power of 2. If this weren't the
- case, we'd have to replace it by its greatest power-of-2
- divisor: modulus & -modulus. */
- if (wi::ltu_p (arg1, align / BITS_PER_UNIT))
- return wide_int_to_tree (type,
- wi::bit_and (arg1,
- bitpos / BITS_PER_UNIT));
- }
-
goto associate;
case RDIV_EXPR:
Index: gcc/match.pd
===================================================================
--- gcc/match.pd (revision 225312)
+++ gcc/match.pd (working copy)
@@ -668,6 +688,21 @@ (define_operator_list swapped_tcc_compar
(if (ptr_difference_const (@0, @1, &diff))
{ build_int_cst_type (type, diff); }))))
+/* If arg0 is derived from the address of an object or function, we may
+ be able to fold this expression using the object or function's
+ alignment. */
+(simplify
+ (bit_and (convert? @0) INTEGER_CST@1)
+ (if (POINTER_TYPE_P (TREE_TYPE (@0))
+ && tree_nop_conversion_p (type, TREE_TYPE (@0)))
+ (with
+ {
+ unsigned int align;
+ unsigned HOST_WIDE_INT bitpos;
+ get_pointer_alignment_1 (@0, &align, &bitpos);
+ }
+ (if (wi::ltu_p (@1, align / BITS_PER_UNIT))
+ { wide_int_to_tree (type, wi::bit_and (@1, bitpos / BITS_PER_UNIT)); }))))
/* We can't reassociate at all for saturating types. */