Speedup int_bit_from_pos

Jan Hubicka hubicka@ucw.cz
Sat Sep 20 04:04:00 GMT 2014

int_bit_position is used by ipa-devirt's type walking code.  It has been a bottleneck
since I introduced speculation into contexts (I plan to solve this by changing the
way I cache results).  But this patch seems to make sense anyway: we do not need to go
through the folding done by bit_from_pos:
tree
bit_from_pos (tree offset, tree bitpos)
{
  if (TREE_CODE (offset) == PLUS_EXPR)
    offset = size_binop (PLUS_EXPR,
			 fold_convert (bitsizetype, TREE_OPERAND (offset, 0)),
			 fold_convert (bitsizetype, TREE_OPERAND (offset, 1)));
  else
    offset = fold_convert (bitsizetype, offset);
  return size_binop (PLUS_EXPR, bitpos,
		     size_binop (MULT_EXPR, offset, bitsize_unit_node));
}

All the callers that matter only deal with constant offsets, and they all go through
int_bit_position, which already expects the result to fit in a HOST_WIDE_INT, so it
makes sense to implement a quick path for that case that avoids fold_convert entirely.

Bootstrap/regtest x86_64 in progress, OK?


	* stor-layout.c (int_bit_from_pos): New function.
	* stor-layout.h (int_bit_from_pos): Declare.
	* tree.c (int_bit_position): Use it.
Index: stor-layout.c
--- stor-layout.c	(revision 215409)
+++ stor-layout.c	(working copy)
@@ -858,6 +858,20 @@
 		     size_binop (MULT_EXPR, offset, bitsize_unit_node));
+/* Like bit_from_pos, but return the bit position as HOST_WIDE_INT.
+   OFFSET and BITPOS must be constants.  */
+
+HOST_WIDE_INT
+int_bit_from_pos (tree offset, tree bitpos)
+{
+  HOST_WIDE_INT off;
+
+  if (TREE_CODE (offset) == PLUS_EXPR)
+    off = tree_to_shwi (TREE_OPERAND (offset, 0))
+	  + tree_to_shwi (TREE_OPERAND (offset, 1));
+  else
+    off = tree_to_shwi (offset);
+  return off * BITS_PER_UNIT + tree_to_shwi (bitpos);
+}
+
 /* Return the combined truncated byte position for the byte offset OFFSET and
    the bit position BITPOS.  */
Index: stor-layout.h
--- stor-layout.h	(revision 215409)
+++ stor-layout.h	(working copy)
@@ -27,6 +27,7 @@
                                                 unsigned int);
 extern record_layout_info start_record_layout (tree);
 extern tree bit_from_pos (tree, tree);
+extern HOST_WIDE_INT int_bit_from_pos (tree, tree);
 extern tree byte_from_pos (tree, tree);
 extern void pos_from_bit (tree *, tree *, unsigned int, tree);
 extern void normalize_offset (tree *, tree *, unsigned int);
Index: tree.c
--- tree.c	(revision 215409)
+++ tree.c	(working copy)
@@ -2839,7 +2839,8 @@
 int_bit_position (const_tree field)
 {
-  return tree_to_shwi (bit_position (field));
+  return int_bit_from_pos (DECL_FIELD_OFFSET (field),
+			   DECL_FIELD_BIT_OFFSET (field));
 }
 /* Return the byte position of FIELD, in bytes from the start of the record.