Re: [patch] TARGET_MEM_REF (resent 2)
Hello,
here is the new version of the patch, reflecting the comments I have received.
Since Richard asked Diego to review the part of the patch dealing with
virtual operands on TARGET_MEM_REFs, and since I had to resolve a conflict
with his recent patch to ivopts, I am adding him to the CC.
Given that mainline currently does not build on any architecture I have
tried, I could not test the patch properly; I only bootstrapped it
on i686. Also, the patch causes some regressions in the testsuite at the
moment -- it exposes a latent bug in the updating of virtual operands in the
vectorizer, probably introduced during the merge from tree-cleanup-branch.
Of course I must fix that before I can submit the patch properly, but I
would like to get some feedback on the changes I have made, and to answer
your questions.
> On Mon, Apr 04, 2005 at 10:25:04PM +0200, Zdenek Dvorak wrote:
> > *************** gimplify_addr_expr (tree *expr_p, tree *
> > *** 3270,3275 ****
> > --- 3270,3280 ----
> > ret = GS_OK;
> > break;
> >
> > + case TARGET_MEM_REF:
> > + *expr_p = tree_mem_ref_addr (TREE_TYPE (expr), op0);
> > + ret = GS_OK;
> > + break;
>
> Presumably this is here to deal with force_gimple_operands?
This leaked in from another version of the patch, in which it really
could happen that we took the address of a TARGET_MEM_REF and called
force_gimple_operand on it. That does not happen in this version
of the patch, so I have dropped the chunk.
> > *************** tree_could_trap_p (tree expr)
> > *** 1739,1744 ****
> > --- 1739,1751 ----
> > restart:
> > switch (code)
> > {
> > + case TARGET_MEM_REF:
> > + /* For TARGET_MEM_REFs use the information based on the original
> > + reference. */
> > + expr = tmr_ann (expr)->original;
> > + code = TREE_CODE (expr);
>
> So tmr annotations are non-optional? Why, then, are they annotations
> instead of direct operands?
More or less for consistency; the annotations contain the tag for the
virtual operands of the TARGET_MEM_REF, and such information is stored in
the annotations for all other memory reference types. Also, neither
the tag nor the original reference looks like an "expression operand" to
me. Of course, if you prefer, I will turn them into operands of
the TARGET_MEM_REF.
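To make the intended use concrete, here is a minimal sketch of how a pass
fills in the annotation (annotate_tmr is a made-up name, for illustration
only; in the patch itself this is done by copy_ref_info and get_ref_tag in
tree-ssa-loop-ivopts.c):

  /* Sketch only: record the alias tag and the original reference in the
     annotation of a freshly built TARGET_MEM_REF.  */
  static void
  annotate_tmr (tree tmr, tree tag, tree orig_ref)
  {
    tmr_ann_t ann = get_tmr_ann (tmr);  /* Creates the annotation if needed.  */

    ann->tag = tag;            /* Determines the virtual operands.  */
    ann->original = orig_ref;  /* Used e.g. by tree_could_trap_p.  */
  }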
> > + sym = gen_rtx_SYMBOL_REF (Pmode, ggc_strdup ("test_symbol"));
>
> Problems: There are at least 5 different kinds of symbols that
> you can run up against:
>
> (1) binds_local_p, small data area.
> (2) binds_local_p, eg local statics
> (3) !binds_local_p, eg global variables
> (4) thread local, local_exec
> (5) thread local, !local_exec
>
> Now, (1) won't appear often in an array context, but it certainly can.
> All you have to do is set -GN high enough, or explicitly mark any
> random object __attribute__((section (".sdata"))).
>
> All of these affect whether or not a symbol is in fact a valid address.
> The only one you've tested here is (3). And that result may very well
> be incorrect for (4) or (5).
>
> Fortunately, an incorrect result here doesn't appear to yield incorrect
> results out the back end, because it would seem that the expr.c expander
> would wind up validizing the address anyway.
>
> But it is something you ought to be thinking about if you're intending
> to be accurately modeling target addresses.
I have added this as a comment to tree-ssa-address.c. There should not be
any correctness problems, as the expr.c expander calls memory_address on the
result.
> > + if (symbol)
> > + {
> > + act_elem = symbol;
> > + if (offset)
> > + {
> > + act_elem = gen_rtx_fmt_e (CONST, Pmode,
> > + gen_rtx_fmt_ee (PLUS, Pmode,
> > + act_elem, offset));
> > + if (offset_p)
> > + *offset_p = &XEXP (XEXP (act_elem, 0), 1);
> > + }
> > +
> > + if (*addr)
> > + *addr = gen_rtx_fmt_ee (PLUS, Pmode, *addr, act_elem);
> > + else
> > + *addr = act_elem;
> > + }
>
> What is the purpose of having symbol+offset be different fields,
> when they are both simply addends? And apparently constant ones
> at that? We are talking about "sym+off(base, index, step)" type
> of x86 addressing modes, are we not?
Yes. As far as I understand, the canonical format of the address in
rtl is

  (PLUS (PLUS (MULT index step) base) (CONST (PLUS symbol offset)))

when both symbol and offset are present,

  (PLUS (PLUS (MULT index step) base) symbol)

when only the symbol is present, and

  (PLUS (PLUS (MULT index step) base) offset)

when only the offset is present.

As for why there are separate fields for symbol and offset -- it is
easier to work with, and it saves one PLUS_EXPR and one ADDR_EXPR. Also,
some architectures allow adding only a (small) constant offset, but not a
symbol, to some addressing modes, so it makes more sense to keep them
separate to match the semantics of such modes more precisely.
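Just to illustrate the shape, here is a minimal sketch of how that canonical
form is built with the RTL constructors (this mirrors what gen_addr_rtx in
the new tree-ssa-address.c does; build_canonical_address is a made-up name
and assumes all five components are present):

  /* Sketch only: build
     (PLUS (PLUS (MULT index step) base) (CONST (PLUS symbol offset))).  */
  static rtx
  build_canonical_address (rtx symbol, rtx base, rtx index, rtx step,
                           rtx offset)
  {
    rtx addr = gen_rtx_PLUS (Pmode, gen_rtx_MULT (Pmode, index, step), base);
    /* The symbolic part is wrapped in CONST so that it stays a single
       link-time constant term.  */
    rtx sym_off = gen_rtx_CONST (Pmode,
                                 gen_rtx_PLUS (Pmode, symbol, offset));

    return gen_rtx_PLUS (Pmode, addr, sym_off);
  }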
> > + create_mem_ref (block_stmt_iterator *bsi, tree type, tree addr)
> > + {
> > + tree mem_ref, tmp;
> > + tree addr_type = build_pointer_type (type);
> > + struct mem_address parts;
> > +
> > + addr_to_parts (addr, &parts);
>
> You're being exceedingly heroic decomposing these arbitrarily complex
> addresses into something intelligable. Is there any way you could avoid
> building up these complex expressions only to have to decompose them
> again? Further...
>
> > + /* The expression is too complicated. Try making it simpler. */
>
> And then still more work simplifying the expression until its valid. It
> would be a Good Thing if you remembered what kinds of addresses are not
> valid, and don't try to generate them any more.
>
> Combining this with the previous step, you might halt construction of
> the complex address much earlier and thus do less work.
>
> The poster child here is going to be ia64, in which *all* of this work
> is pointless. The question becomes how soon do you figure out to stop.
The problem here is that the work has to be done somewhere. Even when
the result of addr_to_parts cannot be mapped to an addressing mode of
the architecture, it is quite often very useful, since it simplifies
the expression. For example, ivopts sometimes produces expressions of
the form (base + x + y) - x + idx. Just folding this expression does
not necessarily simplify it to base + y + idx, since fold scans the
expression only to a limited depth (and this should not be changed, for
performance reasons), and it also sometimes gets confused by type
conversions. The code in addr_to_parts quite often catches such cases
where fold has failed.
Now, folding of such expressions has to be done somewhere (otherwise the
produced code is really ugly), and the code performing this operation
would not get much simpler than what addr_to_parts does. I have a
"/* TODO -- produce TARGET_MEM_REF directly. */" comment in
rewrite_use_address, but doing that would not make the code simpler, since
this type of folding would have to be done in tree-ssa-loop-ivopts anyway.
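As a sketch of how that is used (lower_address is a made-up helper, not part
of the patch): a pass can hand such an unfolded address directly to
create_mem_ref; addr_to_parts cancels the +x/-x pair while decomposing it,
so the resulting reference comes out roughly as MEM[base: base + y,
index: idx] even though fold left the expression alone:

  /* Sketch only; assumes the operands already have type ptr_type_node.  */
  static tree
  lower_address (block_stmt_iterator *bsi, tree type,
                 tree base, tree x, tree y, tree idx)
  {
    /* Build (base + x + y) - x + idx without folding, the way ivopts
       may produce it.  */
    tree ptr = ptr_type_node;
    tree addr = build2 (PLUS_EXPR, ptr,
                        build2 (MINUS_EXPR, ptr,
                                build2 (PLUS_EXPR, ptr,
                                        build2 (PLUS_EXPR, ptr, base, x), y),
                                x),
                        idx);

    /* create_mem_ref decomposes ADDR via addr_to_parts, gimplifies the
       parts in front of BSI and builds the TARGET_MEM_REF.  */
    return create_mem_ref (bsi, type, addr);
  }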
> > ! /* Extract the alias analysis info for the memory reference REF. There are
> > ! several ways how this information may be stored and what precisely is
> > ! its semantics depending on the type of the reference, but there always is
> > ! somewhere hidden one _DECL node that is used to determine the set of
> > ! virtual operands for the reference. The code below deciphers this jungle
> > ! and extracts this single useful piece of information. */
> >
> > ! static tree
> > ! get_ref_tag (tree ref)
>
> I have no clue whether or not this is correct. You'll have to get a
> ruling from Diego on this.
Zdenek
* tree-ssa-address.c: New file.
* Makefile.in (tree-ssa-address.o): Add.
* expr.c (expand_expr_real_1): Do not handle REF_ORIGINAL on
INDIRECT_REFs. Handle TARGET_MEM_REFs.
* tree-dfa.c (create_tmr_ann): New function.
* tree-eh.c (tree_could_trap_p): Handle TARGET_MEM_REFs.
* tree-flow-inline.h (tmr_ann, get_tmr_ann): New functions.
* tree-flow.h (enum tree_ann_type): Add TMR_ANN.
(struct tmr_ann_d, tmr_ann_t): New.
(union tree_ann_d): Add tmr.
(tmr_ann, get_tmr_ann, create_tmr_ann): Declare.
(struct mem_address): New.
(create_mem_ref, addr_for_mem_ref, get_address_description,
maybe_fold_tmr): Declare.
* tree-mudflap.c (mf_xform_derefs_1): Handle TARGET_MEM_REFs.
* tree-pretty-print.c (dump_generic_node): Ditto.
* tree-ssa-loop-im.c (for_each_index): Ditto.
* tree-ssa-loop-ivopts.c (may_be_unaligned_p,
find_interesting_uses_address): Ditto.
(rewrite_address_base): Removed.
(get_ref_tag, copy_ref_info): New functions.
(rewrite_use_address): Produce TARGET_MEM_REFs.
(tree_ssa_iv_optimize): Do not call update_ssa
and rewrite_into_loop_closed_ssa.
* tree-ssa-operands.c (get_tmr_operands): New function.
(get_expr_operands): Handle TARGET_MEM_REFs.
* tree.c (copy_node_stat): Copy annotations for TARGET_MEM_REFs.
(build): Handle 5 arguments.
(build5_stat): New function.
* tree.def (TARGET_MEM_REF): New.
* tree.h (REF_ORIGINAL): Removed.
(TMR_SYMBOL, TMR_BASE, TMR_INDEX, TMR_STEP, TMR_OFFSET, build5): New
macros.
(build5_stat, tree_mem_ref_addr, copy_mem_ref_info): Declare.
* tree-ssa-ccp.c (fold_stmt_r): Call maybe_fold_tmr.
* doc/c-tree.texi: Document TARGET_MEM_REF.
* doc/tree-ssa.texi: Add TARGET_MEM_REF to gimple grammar.
Index: Makefile.in
===================================================================
RCS file: /cvs/gcc/gcc/gcc/Makefile.in,v
retrieving revision 1.1468
diff -c -3 -p -r1.1468 Makefile.in
*** Makefile.in 11 Apr 2005 20:17:38 -0000 1.1468
--- Makefile.in 12 Apr 2005 12:38:26 -0000
*************** OBJS-common = \
*** 924,930 ****
tree-chrec.o tree-scalar-evolution.o tree-data-ref.o \
tree-cfg.o tree-dfa.o tree-eh.o tree-ssa.o tree-optimize.o tree-gimple.o \
gimplify.o tree-pretty-print.o tree-into-ssa.o \
! tree-outof-ssa.o tree-ssa-ccp.o tree-vn.o \
tree-ssa-dce.o tree-ssa-copy.o tree-nrv.o tree-ssa-copyrename.o \
tree-ssa-pre.o tree-ssa-live.o tree-ssa-operands.o tree-ssa-alias.o \
tree-ssa-phiopt.o tree-ssa-forwprop.o tree-nested.o tree-ssa-dse.o \
--- 924,930 ----
tree-chrec.o tree-scalar-evolution.o tree-data-ref.o \
tree-cfg.o tree-dfa.o tree-eh.o tree-ssa.o tree-optimize.o tree-gimple.o \
gimplify.o tree-pretty-print.o tree-into-ssa.o \
! tree-outof-ssa.o tree-ssa-ccp.o tree-vn.o tree-ssa-address.o \
tree-ssa-dce.o tree-ssa-copy.o tree-nrv.o tree-ssa-copyrename.o \
tree-ssa-pre.o tree-ssa-live.o tree-ssa-operands.o tree-ssa-alias.o \
tree-ssa-phiopt.o tree-ssa-forwprop.o tree-nested.o tree-ssa-dse.o \
*************** tree-ssa-loop.o : tree-ssa-loop.c $(TREE
*** 1733,1738 ****
--- 1733,1743 ----
$(SYSTEM_H) $(RTL_H) $(TREE_H) $(TM_P_H) $(CFGLOOP_H) \
output.h diagnostic.h $(TIMEVAR_H) $(TM_H) coretypes.h $(TREE_DUMP_H) \
tree-pass.h $(FLAGS_H) tree-inline.h $(SCEV_H)
+ tree-ssa-address.o : tree-ssa-address.c $(TREE_FLOW_H) $(CONFIG_H) \
+ $(SYSTEM_H) $(RTL_H) $(TREE_H) $(TM_P_H) \
+ output.h diagnostic.h $(TIMEVAR_H) $(TM_H) coretypes.h $(TREE_DUMP_H) \
+ tree-pass.h $(FLAGS_H) tree-inline.h $(RECOG_H) insn-config.h $(EXPR_H) \
+ gt-tree-ssa-address.h $(GGC_H)
tree-ssa-loop-unswitch.o : tree-ssa-loop-unswitch.c $(TREE_FLOW_H) $(CONFIG_H) \
$(SYSTEM_H) $(RTL_H) $(TREE_H) $(TM_P_H) $(CFGLOOP_H) domwalk.h $(PARAMS_H)\
output.h diagnostic.h $(TIMEVAR_H) $(TM_H) coretypes.h $(TREE_DUMP_H) \
*************** GTFILES = $(srcdir)/input.h $(srcdir)/co
*** 2466,2472 ****
$(srcdir)/stringpool.c $(srcdir)/tree.c $(srcdir)/varasm.c \
$(srcdir)/tree-mudflap.c $(srcdir)/tree-flow.h \
$(srcdir)/c-objc-common.c $(srcdir)/c-common.c $(srcdir)/c-parser.c \
! $(srcdir)/tree-ssanames.c $(srcdir)/tree-eh.c \
$(srcdir)/tree-phinodes.c $(srcdir)/tree-cfg.c \
$(srcdir)/tree-dfa.c $(srcdir)/tree-ssa-propagate.c \
$(srcdir)/tree-iterator.c $(srcdir)/gimplify.c \
--- 2471,2477 ----
$(srcdir)/stringpool.c $(srcdir)/tree.c $(srcdir)/varasm.c \
$(srcdir)/tree-mudflap.c $(srcdir)/tree-flow.h \
$(srcdir)/c-objc-common.c $(srcdir)/c-common.c $(srcdir)/c-parser.c \
! $(srcdir)/tree-ssanames.c $(srcdir)/tree-eh.c $(srcdir)/tree-ssa-address.c \
$(srcdir)/tree-phinodes.c $(srcdir)/tree-cfg.c \
$(srcdir)/tree-dfa.c $(srcdir)/tree-ssa-propagate.c \
$(srcdir)/tree-iterator.c $(srcdir)/gimplify.c \
*************** gt-dwarf2out.h gt-reg-stack.h gt-dwarf2a
*** 2490,2496 ****
gt-dbxout.h gt-c-common.h gt-c-decl.h gt-c-parser.h \
gt-c-pragma.h gtype-c.h gt-cfglayout.h \
gt-tree-mudflap.h gt-tree-complex.h \
! gt-tree-eh.h \
gt-tree-ssanames.h gt-tree-iterator.h gt-gimplify.h \
gt-tree-phinodes.h gt-tree-cfg.h gt-tree-nested.h \
gt-tree-ssa-operands.h gt-tree-ssa-propagate.h \
--- 2495,2501 ----
gt-dbxout.h gt-c-common.h gt-c-decl.h gt-c-parser.h \
gt-c-pragma.h gtype-c.h gt-cfglayout.h \
gt-tree-mudflap.h gt-tree-complex.h \
! gt-tree-eh.h gt-tree-ssa-address.h \
gt-tree-ssanames.h gt-tree-iterator.h gt-gimplify.h \
gt-tree-phinodes.h gt-tree-cfg.h gt-tree-nested.h \
gt-tree-ssa-operands.h gt-tree-ssa-propagate.h \
Index: expr.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/expr.c,v
retrieving revision 1.783
diff -c -3 -p -r1.783 expr.c
*** expr.c 30 Mar 2005 21:34:23 -0000 1.783
--- expr.c 12 Apr 2005 12:38:26 -0000
*************** expand_expr_real_1 (tree exp, rtx target
*** 6822,6828 ****
case INDIRECT_REF:
{
tree exp1 = TREE_OPERAND (exp, 0);
- tree orig;
if (modifier != EXPAND_WRITE)
{
--- 6822,6827 ----
*************** expand_expr_real_1 (tree exp, rtx target
*** 6845,6854 ****
temp = gen_rtx_MEM (mode, op0);
! orig = REF_ORIGINAL (exp);
! if (!orig)
! orig = exp;
! set_mem_attributes (temp, orig, 0);
/* Resolve the misalignment now, so that we don't have to remember
to resolve it later. Of course, this only works for reads. */
--- 6844,6850 ----
temp = gen_rtx_MEM (mode, op0);
! set_mem_attributes (temp, exp, 0);
/* Resolve the misalignment now, so that we don't have to remember
to resolve it later. Of course, this only works for reads. */
*************** expand_expr_real_1 (tree exp, rtx target
*** 6880,6885 ****
--- 6876,6893 ----
return temp;
}
+ case TARGET_MEM_REF:
+ {
+ struct mem_address addr;
+
+ get_address_description (exp, &addr);
+ op0 = addr_for_mem_ref (&addr, true);
+ op0 = memory_address (mode, op0);
+ temp = gen_rtx_MEM (mode, op0);
+ set_mem_attributes (temp, tmr_ann (exp)->original, 0);
+ }
+ return temp;
+
case ARRAY_REF:
{
Index: tree-dfa.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-dfa.c,v
retrieving revision 2.52
diff -c -3 -p -r2.52 tree-dfa.c
*** tree-dfa.c 9 Apr 2005 01:37:23 -0000 2.52
--- tree-dfa.c 12 Apr 2005 12:38:26 -0000
*************** create_stmt_ann (tree t)
*** 186,191 ****
--- 186,210 ----
return ann;
}
+ /* Create a new annotation for a TARGET_MEM_REF node T. */
+
+ tmr_ann_t
+ create_tmr_ann (tree t)
+ {
+ tmr_ann_t ann;
+
+ gcc_assert (TREE_CODE (t) == TARGET_MEM_REF);
+ gcc_assert (!t->common.ann || t->common.ann->common.type == TMR_ANN);
+
+ ann = ggc_alloc (sizeof (*ann));
+ memset ((void *) ann, 0, sizeof (*ann));
+
+ ann->common.type = TMR_ANN;
+
+ t->common.ann = (tree_ann_t) ann;
+
+ return ann;
+ }
/* Create a new annotation for a tree T. */
Index: tree-eh.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-eh.c,v
retrieving revision 2.29
diff -c -3 -p -r2.29 tree-eh.c
*** tree-eh.c 4 Apr 2005 22:50:53 -0000 2.29
--- tree-eh.c 12 Apr 2005 12:38:27 -0000
*************** tree_could_trap_p (tree expr)
*** 1739,1744 ****
--- 1739,1751 ----
restart:
switch (code)
{
+ case TARGET_MEM_REF:
+ /* For TARGET_MEM_REFs use the information based on the original
+ reference. */
+ expr = tmr_ann (expr)->original;
+ code = TREE_CODE (expr);
+ goto restart;
+
case COMPONENT_REF:
case REALPART_EXPR:
case IMAGPART_EXPR:
Index: tree-flow-inline.h
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-flow-inline.h,v
retrieving revision 2.37
diff -c -3 -p -r2.37 tree-flow-inline.h
*** tree-flow-inline.h 9 Apr 2005 01:37:23 -0000 2.37
--- tree-flow-inline.h 12 Apr 2005 12:38:27 -0000
*************** get_stmt_ann (tree stmt)
*** 67,72 ****
--- 67,92 ----
}
+ /* Return the annotation for T, which must be a TARGET_MEM_REF
+ node. Return NULL if the annotation doesn't exist. */
+ static inline tmr_ann_t
+ tmr_ann (tree t)
+ {
+ #ifdef ENABLE_CHECKING
+ gcc_assert (TREE_CODE (t) == TARGET_MEM_REF);
+ #endif
+ return (tmr_ann_t) t->common.ann;
+ }
+
+ /* Return the annotation for T, which must be a TARGET_MEM_REF
+ node. Create the annotation if it doesn't exist. */
+ static inline tmr_ann_t
+ get_tmr_ann (tree t)
+ {
+ tmr_ann_t ann = tmr_ann (t);
+ return ann ? ann : create_tmr_ann (t);
+ }
+
/* Return the annotation type for annotation ANN. */
static inline enum tree_ann_type
ann_type (tree_ann_t ann)
Index: tree-flow.h
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-flow.h,v
retrieving revision 2.93
diff -c -3 -p -r2.93 tree-flow.h
*** tree-flow.h 9 Apr 2005 01:37:23 -0000 2.93
--- tree-flow.h 12 Apr 2005 12:38:27 -0000
*************** typedef struct value_range_def value_ran
*** 111,117 ****
/*---------------------------------------------------------------------------
Tree annotations stored in tree_common.ann
---------------------------------------------------------------------------*/
! enum tree_ann_type { TREE_ANN_COMMON, VAR_ANN, STMT_ANN };
struct tree_ann_common_d GTY(())
{
--- 111,117 ----
/*---------------------------------------------------------------------------
Tree annotations stored in tree_common.ann
---------------------------------------------------------------------------*/
! enum tree_ann_type { TREE_ANN_COMMON, VAR_ANN, STMT_ANN, TMR_ANN };
struct tree_ann_common_d GTY(())
{
*************** struct stmt_ann_d GTY(())
*** 333,343 ****
--- 333,358 ----
void * GTY ((skip (""))) histograms;
};
+ /* Annotation for TARGET_MEM_REFs. */
+
+ struct tmr_ann_d GTY(())
+ {
+ struct tree_ann_common_d common;
+
+ /* Name tag, type tag, or base object, whichever is applicable
+ for the reference from which the TARGET_MEM_REF was created. */
+ tree tag;
+
+ /* Original memory reference corresponding to the TARGET_MEM_REF. */
+ tree original;
+ };
+
union tree_ann_d GTY((desc ("ann_type ((tree_ann_t)&%h)")))
{
struct tree_ann_common_d GTY((tag ("TREE_ANN_COMMON"))) common;
struct var_ann_d GTY((tag ("VAR_ANN"))) decl;
struct stmt_ann_d GTY((tag ("STMT_ANN"))) stmt;
+ struct tmr_ann_d GTY((tag ("TMR_ANN"))) tmr;
};
extern GTY(()) VEC(tree) *modified_noreturn_calls;
*************** extern GTY(()) VEC(tree) *modified_noret
*** 345,350 ****
--- 360,366 ----
typedef union tree_ann_d *tree_ann_t;
typedef struct var_ann_d *var_ann_t;
typedef struct stmt_ann_d *stmt_ann_t;
+ typedef struct tmr_ann_d *tmr_ann_t;
static inline tree_ann_t tree_ann (tree);
static inline tree_ann_t get_tree_ann (tree);
*************** static inline var_ann_t var_ann (tree);
*** 352,357 ****
--- 368,375 ----
static inline var_ann_t get_var_ann (tree);
static inline stmt_ann_t stmt_ann (tree);
static inline stmt_ann_t get_stmt_ann (tree);
+ static inline tmr_ann_t tmr_ann (tree);
+ static inline tmr_ann_t get_tmr_ann (tree);
static inline enum tree_ann_type ann_type (tree_ann_t);
static inline basic_block bb_for_stmt (tree);
extern void set_bb_for_stmt (tree, basic_block);
*************** extern void dump_generic_bb (FILE *, bas
*** 556,561 ****
--- 574,580 ----
/* In tree-dfa.c */
extern var_ann_t create_var_ann (tree);
extern stmt_ann_t create_stmt_ann (tree);
+ extern tmr_ann_t create_tmr_ann (tree);
extern tree_ann_t create_tree_ann (tree);
extern void reserve_phi_args_for_new_edge (basic_block);
extern tree create_phi_node (tree, basic_block);
*************** extern bool expr_invariant_in_loop_p (st
*** 803,808 ****
--- 822,841 ----
tree force_gimple_operand (tree, tree *, bool, tree);
+ /* In tree-ssa-address.c */
+
+ /* Description of a memory address. */
+
+ struct mem_address
+ {
+ tree symbol, base, index, step, offset;
+ };
+
+ tree create_mem_ref (block_stmt_iterator *, tree, tree);
+ rtx addr_for_mem_ref (struct mem_address *, bool);
+ void get_address_description (tree, struct mem_address *);
+ tree maybe_fold_tmr (tree);
+
#include "tree-flow-inline.h"
#endif /* _TREE_FLOW_H */
Index: tree-mudflap.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-mudflap.c,v
retrieving revision 2.36
diff -c -3 -p -r2.36 tree-mudflap.c
*** tree-mudflap.c 11 Mar 2005 09:05:08 -0000 2.36
--- tree-mudflap.c 12 Apr 2005 12:38:27 -0000
*************** mf_xform_derefs_1 (block_stmt_iterator *
*** 853,858 ****
--- 853,866 ----
integer_one_node));
break;
+ case TARGET_MEM_REF:
+ addr = tree_mem_ref_addr (ptr_type_node, t);
+ base = addr;
+ limit = fold_build2 (MINUS_EXPR, ptr_type_node,
+ fold_build2 (PLUS_EXPR, ptr_type_node, base, size),
+ build_int_cst_type (ptr_type_node, 1));
+ break;
+
case ARRAY_RANGE_REF:
warning ("mudflap checking not yet implemented for ARRAY_RANGE_REF");
return;
Index: tree-pretty-print.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-pretty-print.c,v
retrieving revision 2.57
diff -c -3 -p -r2.57 tree-pretty-print.c
*** tree-pretty-print.c 11 Apr 2005 15:05:30 -0000 2.57
--- tree-pretty-print.c 12 Apr 2005 12:38:27 -0000
*************** dump_generic_node (pretty_printer *buffe
*** 444,449 ****
--- 444,507 ----
pp_string (buffer, "::");
break;
+ case TARGET_MEM_REF:
+ {
+ const char *sep = "";
+ tree tmp;
+
+ pp_string (buffer, "MEM[");
+
+ tmp = TMR_SYMBOL (node);
+ if (tmp)
+ {
+ pp_string (buffer, sep);
+ sep = ", ";
+ pp_string (buffer, "symbol: ");
+ dump_generic_node (buffer, tmp, spc, flags, false);
+ }
+ tmp = TMR_BASE (node);
+ if (tmp)
+ {
+ pp_string (buffer, sep);
+ sep = ", ";
+ pp_string (buffer, "base: ");
+ dump_generic_node (buffer, tmp, spc, flags, false);
+ }
+ tmp = TMR_INDEX (node);
+ if (tmp)
+ {
+ pp_string (buffer, sep);
+ sep = ", ";
+ pp_string (buffer, "index: ");
+ dump_generic_node (buffer, tmp, spc, flags, false);
+ }
+ tmp = TMR_STEP (node);
+ if (tmp)
+ {
+ pp_string (buffer, sep);
+ sep = ", ";
+ pp_string (buffer, "step: ");
+ dump_generic_node (buffer, tmp, spc, flags, false);
+ }
+ tmp = TMR_OFFSET (node);
+ if (tmp)
+ {
+ pp_string (buffer, sep);
+ sep = ", ";
+ pp_string (buffer, "offset: ");
+ dump_generic_node (buffer, tmp, spc, flags, false);
+ }
+ pp_string (buffer, "]");
+ if (flags & TDF_DETAILS)
+ {
+ pp_string (buffer, "{");
+ dump_generic_node (buffer, tmr_ann (node)->original, spc, flags,
+ false);
+ pp_string (buffer, "}");
+ }
+ }
+ break;
+
case ARRAY_TYPE:
{
tree tmp;
Index: tree-ssa-address.c
===================================================================
RCS file: tree-ssa-address.c
diff -N tree-ssa-address.c
*** /dev/null 1 Jan 1970 00:00:00 -0000
--- tree-ssa-address.c 12 Apr 2005 12:38:27 -0000
***************
*** 0 ****
--- 1,960 ----
+ /* Memory address lowering and addressing mode selection.
+ Copyright (C) 2004 Free Software Foundation, Inc.
+
+ This file is part of GCC.
+
+ GCC is free software; you can redistribute it and/or modify it
+ under the terms of the GNU General Public License as published by the
+ Free Software Foundation; either version 2, or (at your option) any
+ later version.
+
+ GCC is distributed in the hope that it will be useful, but WITHOUT
+ ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
+ FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
+ for more details.
+
+ You should have received a copy of the GNU General Public License
+ along with GCC; see the file COPYING. If not, write to the Free
+ Software Foundation, 59 Temple Place - Suite 330, Boston, MA
+ 02111-1307, USA. */
+
+ /* Utility functions for manipulation with TARGET_MEM_REFs -- tree expressions
+ that directly map to addressing modes of the target. */
+
+ #include "config.h"
+ #include "system.h"
+ #include "coretypes.h"
+ #include "tm.h"
+ #include "tree.h"
+ #include "rtl.h"
+ #include "tm_p.h"
+ #include "hard-reg-set.h"
+ #include "basic-block.h"
+ #include "output.h"
+ #include "diagnostic.h"
+ #include "tree-flow.h"
+ #include "tree-dump.h"
+ #include "tree-pass.h"
+ #include "timevar.h"
+ #include "flags.h"
+ #include "tree-inline.h"
+ #include "insn-config.h"
+ #include "recog.h"
+ #include "expr.h"
+ #include "ggc.h"
+
+ /* TODO -- handling of symbols (according to Richard Henderson's
+ comments, http://gcc.gnu.org/ml/gcc-patches/2005-04/msg00949.html):
+
+ There are at least 5 different kinds of symbols that we can run up against:
+
+ (1) binds_local_p, small data area.
+ (2) binds_local_p, eg local statics
+ (3) !binds_local_p, eg global variables
+ (4) thread local, local_exec
+ (5) thread local, !local_exec
+
+ Now, (1) won't appear often in an array context, but it certainly can.
+ All you have to do is set -GN high enough, or explicitly mark any
+ random object __attribute__((section (".sdata"))).
+
+ All of these affect whether or not a symbol is in fact a valid address.
+ The only one tested here is (3). And that result may very well
+ be incorrect for (4) or (5).
+
+ An incorrect result here does not cause incorrect results out the
+ back end, because the expander in expr.c validizes the address. However
+ it would be nice to improve the handling here in order to produce more
+ precise results. */
+
+ /* A "template" for memory address, used to determine whether the address is
+ valid for mode. */
+
+ struct mem_addr_template GTY (())
+ {
+ rtx ref; /* The template. */
+ rtx * GTY ((skip)) step_p; /* The point in template where the step should be
+ filled in. */
+ rtx * GTY ((skip)) off_p; /* The point in template where the offset should
+ be filled in. */
+ };
+
+ /* The templates. Each of the five bits of the index corresponds to one
+ component of TARGET_MEM_REF being present, see TEMPL_IDX. */
+
+ static GTY (()) struct mem_addr_template templates[32];
+
+ #define TEMPL_IDX(SYMBOL, BASE, INDEX, STEP, OFFSET) \
+ (((SYMBOL != 0) << 4) \
+ | ((BASE != 0) << 3) \
+ | ((INDEX != 0) << 2) \
+ | ((STEP != 0) << 1) \
+ | (OFFSET != 0))
+
+ /* Invokes force_gimple_operand for EXPR with parameters SIMPLE_P and VAR. If
+ some statements are produced, emits them before BSI. */
+
+ static tree
+ force_gimple_operand_bsi (block_stmt_iterator *bsi, tree expr,
+ bool simple_p, tree var)
+ {
+ tree stmts;
+
+ expr = force_gimple_operand (unshare_expr (expr), &stmts, simple_p, var);
+ if (stmts)
+ bsi_insert_before (bsi, stmts, BSI_SAME_STMT);
+
+ return expr;
+ }
+
+ /* Stores address for memory reference with parameters SYMBOL, BASE, INDEX,
+ STEP and OFFSET to *ADDR. Stores pointers to where step is placed to
+ *STEP_P and offset to *OFFSET_P. */
+
+ static void
+ gen_addr_rtx (rtx symbol, rtx base, rtx index, rtx step, rtx offset,
+ rtx *addr, rtx **step_p, rtx **offset_p)
+ {
+ rtx act_elem;
+
+ *addr = NULL_RTX;
+ if (step_p)
+ *step_p = NULL;
+ if (offset_p)
+ *offset_p = NULL;
+
+ if (index)
+ {
+ act_elem = index;
+ if (step)
+ {
+ act_elem = gen_rtx_MULT (Pmode, act_elem, step);
+
+ if (step_p)
+ *step_p = &XEXP (act_elem, 1);
+ }
+
+ *addr = act_elem;
+ }
+
+ if (base)
+ {
+ if (*addr)
+ *addr = gen_rtx_PLUS (Pmode, *addr, base);
+ else
+ *addr = base;
+ }
+
+ if (symbol)
+ {
+ act_elem = symbol;
+ if (offset)
+ {
+ act_elem = gen_rtx_CONST (Pmode,
+ gen_rtx_PLUS (Pmode, act_elem, offset));
+ if (offset_p)
+ *offset_p = &XEXP (XEXP (act_elem, 0), 1);
+ }
+
+ if (*addr)
+ *addr = gen_rtx_PLUS (Pmode, *addr, act_elem);
+ else
+ *addr = act_elem;
+ }
+ else if (offset)
+ {
+ if (*addr)
+ {
+ *addr = gen_rtx_PLUS (Pmode, *addr, offset);
+ if (offset_p)
+ *offset_p = &XEXP (*addr, 1);
+ }
+ else
+ {
+ *addr = offset;
+ if (offset_p)
+ *offset_p = addr;
+ }
+ }
+
+ if (!*addr)
+ *addr = const0_rtx;
+ }
+
+ /* Returns address for TARGET_MEM_REF with parameters given by ADDR.
+ If REALLY_EXPAND is false, just make fake registers instead
+ of really expanding the operands, and perform the expansion in-place
+ by using one of the "templates". */
+
+ rtx
+ addr_for_mem_ref (struct mem_address *addr, bool really_expand)
+ {
+ rtx address, sym, bse, idx, st, off;
+ static bool templates_initialized = false;
+ struct mem_addr_template *templ;
+
+ if (addr->step && !integer_onep (addr->step))
+ st = immed_double_const (TREE_INT_CST_LOW (addr->step),
+ TREE_INT_CST_HIGH (addr->step), Pmode);
+ else
+ st = NULL_RTX;
+
+ if (addr->offset && !integer_zerop (addr->offset))
+ off = immed_double_const (TREE_INT_CST_LOW (addr->offset),
+ TREE_INT_CST_HIGH (addr->offset), Pmode);
+ else
+ off = NULL_RTX;
+
+ if (!really_expand)
+ {
+ /* Reuse the templates for addresses, so that we do not waste memory. */
+ if (!templates_initialized)
+ {
+ unsigned i;
+
+ templates_initialized = true;
+ sym = gen_rtx_SYMBOL_REF (Pmode, ggc_strdup ("test_symbol"));
+ bse = gen_raw_REG (Pmode, FIRST_PSEUDO_REGISTER);
+ idx = gen_raw_REG (Pmode, FIRST_PSEUDO_REGISTER + 1);
+
+ for (i = 0; i < 32; i++)
+ gen_addr_rtx ((i & 16 ? sym : NULL_RTX),
+ (i & 8 ? bse : NULL_RTX),
+ (i & 4 ? idx : NULL_RTX),
+ (i & 2 ? const0_rtx : NULL_RTX),
+ (i & 1 ? const0_rtx : NULL_RTX),
+ &templates[i].ref,
+ &templates[i].step_p,
+ &templates[i].off_p);
+ }
+
+ templ = templates + TEMPL_IDX (addr->symbol, addr->base, addr->index,
+ st, off);
+ if (st)
+ *templ->step_p = st;
+ if (off)
+ *templ->off_p = off;
+
+ return templ->ref;
+ }
+
+ /* Otherwise really expand the expressions. */
+ sym = (addr->symbol
+ ? expand_expr (build_addr (addr->symbol), NULL_RTX, Pmode, EXPAND_NORMAL)
+ : NULL_RTX);
+ bse = (addr->base
+ ? expand_expr (addr->base, NULL_RTX, Pmode, EXPAND_NORMAL)
+ : NULL_RTX);
+ idx = (addr->index
+ ? expand_expr (addr->index, NULL_RTX, Pmode, EXPAND_NORMAL)
+ : NULL_RTX);
+
+ gen_addr_rtx (sym, bse, idx, st, off, &address, NULL, NULL);
+ return address;
+ }
+
+ /* Returns address of MEM_REF in TYPE. */
+
+ tree
+ tree_mem_ref_addr (tree type, tree mem_ref)
+ {
+ tree addr = NULL_TREE;
+ tree act_elem;
+ tree step = TMR_STEP (mem_ref), offset = TMR_OFFSET (mem_ref);
+
+ act_elem = TMR_INDEX (mem_ref);
+ if (act_elem)
+ {
+ act_elem = fold_convert (type, act_elem);
+
+ if (step)
+ act_elem = fold_build2 (MULT_EXPR, type, act_elem,
+ fold_convert (type, step));
+ addr = act_elem;
+ }
+
+ act_elem = TMR_BASE (mem_ref);
+ if (act_elem)
+ {
+ act_elem = fold_convert (type, act_elem);
+
+ if (addr)
+ addr = fold_build2 (PLUS_EXPR, type, addr, act_elem);
+ else
+ addr = act_elem;
+ }
+
+ act_elem = TMR_SYMBOL (mem_ref);
+ if (act_elem)
+ {
+ act_elem = fold_convert (type, build_addr (act_elem));
+ if (addr)
+ addr = fold_build2 (PLUS_EXPR, type, addr, act_elem);
+ else
+ addr = act_elem;
+ }
+
+ if (!zero_p (offset))
+ {
+ act_elem = fold_convert (type, offset);
+
+ if (addr)
+ addr = fold_build2 (PLUS_EXPR, type, addr, act_elem);
+ else
+ addr = act_elem;
+ }
+
+ if (!addr)
+ addr = build_int_cst (type, 0);
+
+ return addr;
+ }
+
+ /* Returns true if a memory reference in MODE and with parameters given by
+ ADDR is valid on the current target. */
+
+ static bool
+ valid_mem_ref_p (enum machine_mode mode, struct mem_address *addr)
+ {
+ rtx address;
+
+ address = addr_for_mem_ref (addr, false);
+ if (!address)
+ return false;
+
+ return memory_address_p (mode, address);
+ }
+
+ /* Checks whether a TARGET_MEM_REF with type TYPE and parameters given by ADDR
+ is valid on the current target and if so, creates and returns the
+ TARGET_MEM_REF. */
+
+ static tree
+ create_mem_ref_raw (tree type, struct mem_address *addr)
+ {
+ if (!valid_mem_ref_p (TYPE_MODE (type), addr))
+ return NULL_TREE;
+
+ if (addr->step && integer_onep (addr->step))
+ addr->step = NULL_TREE;
+
+ if (addr->offset && zero_p (addr->offset))
+ addr->offset = NULL_TREE;
+
+ return build5 (TARGET_MEM_REF, type,
+ addr->symbol, addr->base, addr->index,
+ addr->step, addr->offset);
+ }
+
+ /* Adds register REG to the address PARTS, multiplied by MUL. */
+
+ static void
+ add_reg_to_parts (struct mem_address *parts, tree reg, tree mul)
+ {
+ tree tmp;
+
+ if (zero_p (mul))
+ return;
+ mul = fold_convert (ptr_type_node, mul);
+
+ /* If REG is INDEX already, just adjust the step. */
+ if (parts->index
+ && operand_equal_p (parts->index, reg, 0))
+ {
+ if (parts->step)
+ parts->step
+ = fold_binary_to_constant (PLUS_EXPR, ptr_type_node,
+ parts->step, mul);
+ else
+ parts->step
+ = fold_binary_to_constant (PLUS_EXPR, ptr_type_node,
+ mul,
+ build_int_cst_type (ptr_type_node,
+ 1));
+
+ if (zero_p (parts->step))
+ {
+ parts->step = NULL_TREE;
+ parts->index = NULL_TREE;
+ }
+
+ return;
+ }
+
+ /* REG equals BASE; try putting it into INDEX with an appropriate step. */
+ if (parts->base
+ && operand_equal_p (parts->base, reg, 0))
+ {
+ mul = fold_binary_to_constant (PLUS_EXPR, ptr_type_node, mul,
+ build_int_cst_type (ptr_type_node, 1));
+ if (zero_p (mul))
+ {
+ parts->base = NULL_TREE;
+ return;
+ }
+
+ if (!parts->step)
+ {
+ parts->base = parts->index;
+ parts->index = fold_convert (ptr_type_node, reg);
+ parts->step = mul;
+ return;
+ }
+
+ /* MUL * REG + STEP * INDEX. If MUL == STEP, just add REG to INDEX. */
+ if (operand_equal_p (parts->step, mul, 0))
+ {
+ parts->base = NULL_TREE;
+ parts->index = fold_build2 (PLUS_EXPR, ptr_type_node,
+ fold_convert (ptr_type_node, reg),
+ parts->index);
+ return;
+ }
+
+ /* Replace base by MUL * BASE. */
+ parts->base = fold_build2 (MULT_EXPR, ptr_type_node,
+ fold_convert (ptr_type_node, reg), mul);
+ return;
+ }
+
+ /* If MUL is 1 and BASE is free, store REG in BASE. */
+ if (integer_onep (mul) && !parts->base)
+ {
+ parts->base = reg;
+ return;
+ }
+
+ /* Maybe we can move value from INDEX to BASE, thus enabling us to
+ put REG to index. */
+ if (parts->index
+ && !parts->step
+ && !parts->base)
+ {
+ parts->base = parts->index;
+ parts->index = NULL_TREE;
+ }
+
+ /* Free INDEX. */
+ if (!parts->index)
+ {
+ parts->index = fold_convert (ptr_type_node, reg);
+ parts->step = mul;
+ return;
+ }
+
+ /* Both BASE and INDEX are full, or STEP != 1 and MUL != 1, and
+ BASE and INDEX is different from REG. Check whether
+ step is the same as MUL. */
+ if (!integer_onep (mul)
+ && parts->step
+ && operand_equal_p (parts->step, mul, 0))
+ {
+ parts->index = fold_build2 (PLUS_EXPR, ptr_type_node,
+ fold_convert (ptr_type_node, reg),
+ parts->index);
+ return;
+ }
+
+ /* Otherwise add MUL * REG to BASE. */
+ if (integer_onep (mul))
+ tmp = fold_convert (ptr_type_node, reg);
+ else
+ tmp = fold_build2 (MULT_EXPR, ptr_type_node,
+ fold_convert (ptr_type_node, reg), mul);
+
+ if (parts->base)
+ tmp = fold_build2 (PLUS_EXPR, ptr_type_node,
+ tmp, parts->base);
+ parts->base = tmp;
+ }
+
+ /* Returns true if OBJ is an object whose address is a link time constant. */
+
+ static bool
+ fixed_address_object_p (tree obj)
+ {
+ return (TREE_CODE (obj) == VAR_DECL
+ && (TREE_STATIC (obj)
+ || DECL_EXTERNAL (obj)));
+ }
+
+ /* Adds symbol SYM to the address PARTS. */
+
+ static void
+ add_symbol_to_parts (struct mem_address *parts, tree sym)
+ {
+ if (!parts->symbol
+ && fixed_address_object_p (sym))
+ {
+ parts->symbol = sym;
+ return;
+ }
+
+ add_reg_to_parts (parts, build_addr (sym),
+ build_int_cst_type (ptr_type_node, 1));
+ }
+
+ /* Adds offset OFF to the address PARTS. */
+
+ static void
+ add_offset_to_parts (struct mem_address *parts, tree off)
+ {
+ off = fold_convert (ptr_type_node, off);
+
+ if (!parts->offset)
+ parts->offset = off;
+ else
+ parts->offset
+ = fold_binary_to_constant (PLUS_EXPR, ptr_type_node,
+ parts->offset, off);
+ }
+
+ /* Sets the address PARTS to sum of addresses P0 and P1. */
+
+ static void
+ add_parts (struct mem_address *p0, struct mem_address *p1,
+ struct mem_address *parts)
+ {
+ *parts = *p0;
+
+ if (p1->offset)
+ add_offset_to_parts (parts, p1->offset);
+
+ if (p1->index)
+ {
+ tree mul = p1->step;
+
+ if (!mul)
+ mul = build_int_cst_type (ptr_type_node, 1);
+ add_reg_to_parts (parts, p1->index, mul);
+ }
+
+ if (p1->base)
+ add_reg_to_parts (parts, p1->base, build_int_cst_type (ptr_type_node, 1));
+
+ if (p1->symbol)
+ add_symbol_to_parts (parts, p1->symbol);
+ }
+
+ /* Sets the address PARTS to the difference of addresses P0 and P1. */
+
+ static void
+ rem_parts (struct mem_address *p0, struct mem_address *p1,
+ struct mem_address *parts)
+ {
+ tree tmp;
+
+ *parts = *p0;
+
+ if (p1->offset)
+ {
+ tmp = fold_unary_to_constant (NEGATE_EXPR, ptr_type_node, p1->offset);
+ add_offset_to_parts (parts, tmp);
+ }
+
+ if (p1->index)
+ {
+ if (p1->step)
+ tmp = fold_unary_to_constant (NEGATE_EXPR, ptr_type_node,
+ p1->step);
+ else
+ tmp = build_int_cst_type (ptr_type_node, -1);
+ add_reg_to_parts (parts, p1->index, tmp);
+ }
+
+ if (p1->base)
+ add_reg_to_parts (parts, p1->base,
+ build_int_cst_type (ptr_type_node, -1));
+
+ if (p1->symbol)
+ {
+ /* If the symbols are equal, cancel them; otherwise,
+ simply subtract the address of the symbol of P1. */
+ if (p0->symbol
+ && operand_equal_p (p0->symbol, p1->symbol, 0))
+ parts->symbol = NULL_TREE;
+ else
+ {
+ add_reg_to_parts (parts,
+ build_addr (p1->symbol),
+ build_int_cst_type (ptr_type_node, -1));
+ }
+ }
+ }
+
+ /* Sets the address PARTS to MUL times the address P0. */
+
+ static void
+ mul_parts (struct mem_address *p0,
+ tree mul, struct mem_address *parts)
+ {
+ parts->symbol = NULL_TREE;
+ parts->base = NULL_TREE;
+
+ if (p0->offset)
+ parts->offset = fold_binary_to_constant (MULT_EXPR, ptr_type_node, mul,
+ p0->offset);
+ else
+ parts->offset = NULL_TREE;
+
+ if (p0->index)
+ {
+ parts->index = p0->index;
+ if (p0->step)
+ parts->step = fold_binary_to_constant (MULT_EXPR, ptr_type_node, mul,
+ p0->step);
+ else
+ parts->step = mul;
+ }
+ else
+ {
+ parts->index = NULL_TREE;
+ parts->step = NULL_TREE;
+ }
+
+ if (p0->base)
+ add_reg_to_parts (parts, p0->base, mul);
+
+ if (p0->symbol)
+ add_reg_to_parts (parts, build_addr (p0->symbol), mul);
+ }
+
+ /* Splits address ADDR into PARTS. */
+
+ static void
+ addr_to_parts (tree addr, struct mem_address *parts)
+ {
+ tree op0, op1;
+ enum tree_code code;
+ struct mem_address p0, p1;
+
+ STRIP_NOPS (addr);
+ code = TREE_CODE (addr);
+
+ switch (code)
+ {
+ case SSA_NAME:
+ parts->symbol = NULL_TREE;
+ parts->base = fold_convert (ptr_type_node, addr);
+ parts->index = NULL_TREE;
+ parts->step = NULL_TREE;
+ parts->offset = NULL_TREE;
+ return;
+
+ case INTEGER_CST:
+ parts->symbol = NULL_TREE;
+ parts->base = NULL_TREE;
+ parts->index = NULL_TREE;
+ parts->step = NULL_TREE;
+ parts->offset = fold_convert (ptr_type_node, addr);
+ return;
+
+ case PLUS_EXPR:
+ case MINUS_EXPR:
+ op0 = TREE_OPERAND (addr, 0);
+ op1 = TREE_OPERAND (addr, 1);
+ addr_to_parts (op0, &p0);
+ addr_to_parts (op1, &p1);
+
+ if (code == PLUS_EXPR)
+ add_parts (&p0, &p1, parts);
+ else
+ rem_parts (&p0, &p1, parts);
+ return;
+
+ case MULT_EXPR:
+ op0 = TREE_OPERAND (addr, 0);
+ op1 = TREE_OPERAND (addr, 1);
+
+ if (TREE_CODE (op1) != INTEGER_CST)
+ break;
+ addr_to_parts (op0, &p0);
+ mul_parts (&p0, op1, parts);
+ return;
+
+ case ADDR_EXPR:
+ {
+ tree core, off;
+ enum machine_mode mode;
+ int unsignedp, volatilep;
+ HOST_WIDE_INT bitsize, bitoff;
+
+ core = get_inner_reference (TREE_OPERAND (addr, 0), &bitsize, &bitoff,
+ &off, &mode, &unsignedp, &volatilep,
+ false);
+ gcc_assert (bitoff % BITS_PER_UNIT == 0);
+
+ if (off)
+ addr_to_parts (off, &p0);
+ else
+ {
+ p0.symbol = NULL_TREE;
+ p0.base = NULL_TREE;
+ p0.index = NULL_TREE;
+ p0.step = NULL_TREE;
+ p0.offset = NULL_TREE;
+ }
+
+ if (bitoff)
+ add_offset_to_parts (&p0,
+ build_int_cst_type (ptr_type_node,
+ bitoff / BITS_PER_UNIT));
+
+ if (TREE_CODE (core) == INDIRECT_REF)
+ {
+ addr_to_parts (TREE_OPERAND (core, 0), &p1);
+ add_parts (&p0, &p1, parts);
+ }
+ else
+ {
+ add_symbol_to_parts (&p0, core);
+ *parts = p0;
+ }
+ return;
+ }
+
+ default:
+ break;
+ }
+
+ parts->symbol = NULL_TREE;
+ parts->base = fold_convert (ptr_type_node, addr);
+ parts->index = NULL_TREE;
+ parts->step = NULL_TREE;
+ parts->offset = NULL_TREE;
+ return;
+ }
+
+ /* Force the PARTS into registers. */
+
+ static void
+ gimplify_mem_ref_parts (block_stmt_iterator *bsi, struct mem_address *parts)
+ {
+ if (parts->base)
+ parts->base = force_gimple_operand_bsi (bsi, parts->base,
+ true, NULL_TREE);
+ if (parts->index)
+ parts->index = force_gimple_operand_bsi (bsi, parts->index,
+ true, NULL_TREE);
+ }
+
+ /* Creates and returns a TARGET_MEM_REF for address ADDR. If necessary
+ computations are emitted in front of BSI. TYPE is the mode
+ of created memory reference. */
+
+ tree
+ create_mem_ref (block_stmt_iterator *bsi, tree type, tree addr)
+ {
+ tree mem_ref, tmp;
+ tree addr_type = build_pointer_type (type);
+ struct mem_address parts;
+
+ addr_to_parts (addr, &parts);
+ gimplify_mem_ref_parts (bsi, &parts);
+ mem_ref = create_mem_ref_raw (type, &parts);
+ if (mem_ref)
+ return mem_ref;
+
+ /* The expression is too complicated. Try making it simpler. */
+
+ if (parts.step && !integer_onep (parts.step))
+ {
+ /* Move the multiplication to index. */
+ gcc_assert (parts.index);
+ parts.index = force_gimple_operand_bsi (bsi,
+ build2 (MULT_EXPR, addr_type,
+ parts.index, parts.step),
+ true, NULL_TREE);
+ parts.step = NULL_TREE;
+
+ mem_ref = create_mem_ref_raw (type, &parts);
+ if (mem_ref)
+ return mem_ref;
+ }
+
+ if (parts.symbol)
+ {
+ tmp = build_addr (parts.symbol);
+
+ /* Add the symbol to BASE, possibly forcing it into a register. */
+ if (parts.base)
+ parts.base = force_gimple_operand_bsi (bsi,
+ build2 (PLUS_EXPR, addr_type,
+ parts.base, tmp),
+ true, NULL_TREE);
+ else
+ parts.base = tmp;
+ parts.symbol = NULL_TREE;
+
+ mem_ref = create_mem_ref_raw (type, &parts);
+ if (mem_ref)
+ return mem_ref;
+ }
+
+ if (parts.base)
+ {
+ /* Add base to index. */
+ if (parts.index)
+ parts.index = force_gimple_operand_bsi (bsi,
+ build2 (PLUS_EXPR, addr_type,
+ parts.base,
+ parts.index),
+ true, NULL_TREE);
+ else
+ parts.index = parts.base;
+ parts.base = NULL_TREE;
+
+ mem_ref = create_mem_ref_raw (type, &parts);
+ if (mem_ref)
+ return mem_ref;
+ }
+
+ if (parts.offset && !integer_zerop (parts.offset))
+ {
+ /* Try adding offset to index. */
+ if (parts.index)
+ parts.index = force_gimple_operand_bsi (bsi,
+ build2 (PLUS_EXPR, addr_type,
+ parts.index,
+ parts.offset),
+ true, NULL_TREE);
+ else
+ parts.index = parts.offset;
+
+ parts.offset = NULL_TREE;
+
+ mem_ref = create_mem_ref_raw (type, &parts);
+ if (mem_ref)
+ return mem_ref;
+ }
+
+ /* Verify that the address is in the simplest possible shape
+ (only a register). If we cannot create such a memory reference,
+ something is really wrong. */
+ gcc_assert (parts.symbol == NULL_TREE);
+ gcc_assert (parts.base == NULL_TREE);
+ gcc_assert (!parts.step || integer_onep (parts.step));
+ gcc_assert (!parts.offset || integer_zerop (parts.offset));
+ gcc_unreachable ();
+ }
+
+ /* Copies components of the address from OP to ADDR. */
+
+ void
+ get_address_description (tree op, struct mem_address *addr)
+ {
+ addr->symbol = TMR_SYMBOL (op);
+ addr->base = TMR_BASE (op);
+ addr->index = TMR_INDEX (op);
+ addr->step = TMR_STEP (op);
+ addr->offset = TMR_OFFSET (op);
+ }
+
+ /* Copies the additional information attached to target_mem_ref FROM to TO. */
+
+ void
+ copy_mem_ref_info (tree to, tree from)
+ {
+ /* Copy the annotation, to preserve the aliasing information. */
+ get_tmr_ann (to)->tag = tmr_ann (from)->tag;
+
+ /* And the info about the original reference. */
+ tmr_ann (to)->original = tmr_ann (from)->original;
+ }
+
+ /* Move constants in target_mem_ref REF to offset. Returns the new target
+ mem ref if anything changes, NULL_TREE otherwise. */
+
+ tree
+ maybe_fold_tmr (tree ref)
+ {
+ struct mem_address addr;
+ bool changed = false;
+ tree ret, off;
+
+ get_address_description (ref, &addr);
+
+ if (addr.base && TREE_CODE (addr.base) == INTEGER_CST)
+ {
+ if (addr.offset)
+ addr.offset = fold_binary_to_constant (PLUS_EXPR, ptr_type_node,
+ addr.offset, addr.base);
+ else
+ addr.offset = addr.base;
+
+ addr.base = NULL_TREE;
+ changed = true;
+ }
+
+ if (addr.index && TREE_CODE (addr.index) == INTEGER_CST)
+ {
+ off = addr.index;
+ if (addr.step)
+ {
+ off = fold_binary_to_constant (MULT_EXPR, ptr_type_node,
+ off, addr.step);
+ addr.step = NULL_TREE;
+ }
+
+ if (addr.offset)
+ {
+ addr.offset = fold_binary_to_constant (PLUS_EXPR, ptr_type_node,
+ addr.offset, off);
+ }
+ else
+ addr.offset = off;
+
+ addr.index = NULL_TREE;
+ changed = true;
+ }
+
+ if (!changed)
+ return NULL_TREE;
+
+ ret = create_mem_ref_raw (TREE_TYPE (ref), &addr);
+ if (!ret)
+ return NULL_TREE;
+
+ copy_mem_ref_info (ret, ref);
+ return ret;
+ }
+
+ /* Dump PARTS to FILE. */
+
+ extern void dump_mem_address (FILE *, struct mem_address *);
+ void
+ dump_mem_address (FILE *file, struct mem_address *parts)
+ {
+ if (parts->symbol)
+ {
+ fprintf (file, "symbol: ");
+ print_generic_expr (file, parts->symbol, TDF_SLIM);
+ fprintf (file, "\n");
+ }
+ if (parts->base)
+ {
+ fprintf (file, "base: ");
+ print_generic_expr (file, parts->base, TDF_SLIM);
+ fprintf (file, "\n");
+ }
+ if (parts->index)
+ {
+ fprintf (file, "index: ");
+ print_generic_expr (file, parts->index, TDF_SLIM);
+ fprintf (file, "\n");
+ }
+ if (parts->step)
+ {
+ fprintf (file, "step: ");
+ print_generic_expr (file, parts->step, TDF_SLIM);
+ fprintf (file, "\n");
+ }
+ if (parts->offset)
+ {
+ fprintf (file, "offset: ");
+ print_generic_expr (file, parts->offset, TDF_SLIM);
+ fprintf (file, "\n");
+ }
+ }
+
+ #include "gt-tree-ssa-address.h"
Index: tree-ssa-ccp.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-ssa-ccp.c,v
retrieving revision 2.64
diff -c -3 -p -r2.64 tree-ssa-ccp.c
*** tree-ssa-ccp.c 9 Apr 2005 01:37:23 -0000 2.64
--- tree-ssa-ccp.c 12 Apr 2005 12:38:27 -0000
*************** maybe_fold_stmt_addition (tree expr)
*** 1803,1809 ****
return t;
}
-
/* Subroutine of fold_stmt called via walk_tree. We perform several
simplifications of EXPR_P, mostly having to do with pointer arithmetic. */
--- 1803,1808 ----
*************** fold_stmt_r (tree *expr_p, int *walk_sub
*** 1877,1882 ****
--- 1876,1885 ----
}
break;
+ case TARGET_MEM_REF:
+ t = maybe_fold_tmr (expr);
+ break;
+
default:
return NULL_TREE;
}
Index: tree-ssa-loop-im.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-ssa-loop-im.c,v
retrieving revision 2.36
diff -c -3 -p -r2.36 tree-ssa-loop-im.c
*** tree-ssa-loop-im.c 11 Apr 2005 20:17:38 -0000 2.36
--- tree-ssa-loop-im.c 12 Apr 2005 12:38:27 -0000
*************** for_each_index (tree *addr_p, bool (*cbc
*** 197,202 ****
--- 197,213 ----
case RESULT_DECL:
return true;
+ case TARGET_MEM_REF:
+ idx = &TMR_BASE (*addr_p);
+ if (*idx
+ && !cbck (*addr_p, idx, data))
+ return false;
+ idx = &TMR_INDEX (*addr_p);
+ if (*idx
+ && !cbck (*addr_p, idx, data))
+ return false;
+ return true;
+
default:
gcc_unreachable ();
}
Index: tree-ssa-loop-ivopts.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-ssa-loop-ivopts.c,v
retrieving revision 2.59
diff -c -3 -p -r2.59 tree-ssa-loop-ivopts.c
*** tree-ssa-loop-ivopts.c 9 Apr 2005 01:37:24 -0000 2.59
--- tree-ssa-loop-ivopts.c 12 Apr 2005 12:38:27 -0000
*************** may_be_unaligned_p (tree ref)
*** 1481,1486 ****
--- 1481,1491 ----
int unsignedp, volatilep;
unsigned base_align;
+ /* TARGET_MEM_REFs are translated directly to valid MEMs on the target,
+ thus they are not misaligned. */
+ if (TREE_CODE (ref) == TARGET_MEM_REF)
+ return false;
+
/* The test below is basically copy of what expr.c:normal_inner_ref
does to check whether the object must be loaded by parts when
STRICT_ALIGNMENT is true. */
*************** may_be_unaligned_p (tree ref)
*** 1503,1509 ****
static void
find_interesting_uses_address (struct ivopts_data *data, tree stmt, tree *op_p)
{
! tree base = unshare_expr (*op_p), step = NULL;
struct iv *civ;
struct ifs_ivopts_data ifs_ivopts_data;
--- 1508,1514 ----
static void
find_interesting_uses_address (struct ivopts_data *data, tree stmt, tree *op_p)
{
! tree base = *op_p, step = NULL;
struct iv *civ;
struct ifs_ivopts_data ifs_ivopts_data;
*************** find_interesting_uses_address (struct iv
*** 1517,1536 ****
&& may_be_unaligned_p (base))
goto fail;
! ifs_ivopts_data.ivopts_data = data;
! ifs_ivopts_data.stmt = stmt;
! ifs_ivopts_data.step_p = &step;
! if (!for_each_index (&base, idx_find_step, &ifs_ivopts_data)
! || zero_p (step))
! goto fail;
! gcc_assert (TREE_CODE (base) != ALIGN_INDIRECT_REF);
! gcc_assert (TREE_CODE (base) != MISALIGNED_INDIRECT_REF);
! if (TREE_CODE (base) == INDIRECT_REF)
! base = TREE_OPERAND (base, 0);
else
! base = build_addr (base);
civ = alloc_iv (base, step);
record_use (data, op_p, civ, stmt, USE_ADDRESS);
--- 1522,1587 ----
&& may_be_unaligned_p (base))
goto fail;
! base = unshare_expr (base);
!
! if (TREE_CODE (base) == TARGET_MEM_REF)
! {
! tree type = build_pointer_type (TREE_TYPE (base));
! tree astep;
!
! if (TMR_BASE (base)
! && TREE_CODE (TMR_BASE (base)) == SSA_NAME)
! {
! civ = get_iv (data, TMR_BASE (base));
! if (!civ)
! goto fail;
! TMR_BASE (base) = civ->base;
! step = civ->step;
! }
! if (TMR_INDEX (base)
! && TREE_CODE (TMR_INDEX (base)) == SSA_NAME)
! {
! civ = get_iv (data, TMR_INDEX (base));
! if (!civ)
! goto fail;
! TMR_INDEX (base) = civ->base;
! astep = civ->step;
!
! if (astep)
! {
! if (TMR_STEP (base))
! astep = fold_build2 (MULT_EXPR, type, TMR_STEP (base), astep);
!
! if (step)
! step = fold_build2 (PLUS_EXPR, type, step, astep);
! else
! step = astep;
! }
! }
!
! if (zero_p (step))
! goto fail;
! base = tree_mem_ref_addr (type, base);
! }
else
! {
! ifs_ivopts_data.ivopts_data = data;
! ifs_ivopts_data.stmt = stmt;
! ifs_ivopts_data.step_p = &step;
! if (!for_each_index (&base, idx_find_step, &ifs_ivopts_data)
! || zero_p (step))
! goto fail;
!
! gcc_assert (TREE_CODE (base) != ALIGN_INDIRECT_REF);
! gcc_assert (TREE_CODE (base) != MISALIGNED_INDIRECT_REF);
!
! if (TREE_CODE (base) == INDIRECT_REF)
! base = TREE_OPERAND (base, 0);
! else
! base = build_addr (base);
! }
civ = alloc_iv (base, step);
record_use (data, op_p, civ, stmt, USE_ADDRESS);
*************** unshare_and_remove_ssa_names (tree ref)
*** 4804,4882 ****
return ref;
}
! /* Rewrites base of memory access OP with expression WITH in statement
! pointed to by BSI. */
! static void
! rewrite_address_base (block_stmt_iterator *bsi, tree *op, tree with)
{
! tree bvar, var, new_name, copy, name;
! tree orig;
! var = bvar = get_base_address (*op);
!
! if (!var || TREE_CODE (with) != SSA_NAME)
! goto do_rewrite;
- gcc_assert (TREE_CODE (var) != ALIGN_INDIRECT_REF);
- gcc_assert (TREE_CODE (var) != MISALIGNED_INDIRECT_REF);
if (TREE_CODE (var) == INDIRECT_REF)
var = TREE_OPERAND (var, 0);
if (TREE_CODE (var) == SSA_NAME)
{
! name = var;
var = SSA_NAME_VAR (var);
}
! else if (DECL_P (var))
! name = NULL_TREE;
! else
! goto do_rewrite;
!
! /* We need to add a memory tag for the variable. But we do not want
! to add it to the temporary used for the computations, since this leads
! to problems in redundancy elimination when there are common parts
! in two computations referring to the different arrays. So we copy
! the variable to a new temporary. */
! copy = build2 (MODIFY_EXPR, void_type_node, NULL_TREE, with);
!
! if (name)
! new_name = duplicate_ssa_name (name, copy);
! else
{
! tree tag = var_ann (var)->type_mem_tag;
! tree new_ptr = create_tmp_var (TREE_TYPE (with), "ruatmp");
! add_referenced_tmp_var (new_ptr);
if (tag)
! var_ann (new_ptr)->type_mem_tag = tag;
! else
! add_type_alias (new_ptr, var);
! new_name = make_ssa_name (new_ptr, copy);
! }
!
! TREE_OPERAND (copy, 0) = new_name;
! update_stmt (copy);
! bsi_insert_before (bsi, copy, BSI_SAME_STMT);
! with = new_name;
! do_rewrite:
!
! orig = NULL_TREE;
! gcc_assert (TREE_CODE (*op) != ALIGN_INDIRECT_REF);
! gcc_assert (TREE_CODE (*op) != MISALIGNED_INDIRECT_REF);
!
! if (TREE_CODE (*op) == INDIRECT_REF)
! orig = REF_ORIGINAL (*op);
! if (!orig)
! orig = unshare_and_remove_ssa_names (*op);
! *op = build1 (INDIRECT_REF, TREE_TYPE (*op), with);
! /* Record the original reference, for purposes of alias analysis. */
! REF_ORIGINAL (*op) = orig;
! /* Virtual operands in the original statement may have to be renamed
! because of the replacement. */
! mark_new_vars_to_rename (bsi_stmt (*bsi));
}
/* Rewrites USE (address that is an iv) using candidate CAND. */
--- 4855,4914 ----
return ref;
}
! /* Extract the alias analysis info for the memory reference REF. There are
! several ways how this information may be stored and what precisely is
! its semantics depending on the type of the reference, but there always is
! somewhere hidden one _DECL node that is used to determine the set of
! virtual operands for the reference. The code below deciphers this jungle
! and extracts this single useful piece of information. */
! static tree
! get_ref_tag (tree ref)
{
! tree var = get_base_address (ref);
! tree tag;
! if (!var)
! return NULL_TREE;
if (TREE_CODE (var) == INDIRECT_REF)
var = TREE_OPERAND (var, 0);
if (TREE_CODE (var) == SSA_NAME)
{
! if (SSA_NAME_PTR_INFO (var))
! {
! tag = SSA_NAME_PTR_INFO (var)->name_mem_tag;
! if (tag)
! return tag;
! }
!
var = SSA_NAME_VAR (var);
}
!
! if (DECL_P (var))
{
! tag = var_ann (var)->type_mem_tag;
if (tag)
! return tag;
! return var;
! }
! return NULL_TREE;
! }
! /* Copies the reference information from OLD_REF to NEW_REF. */
! static void
! copy_ref_info (tree new_ref, tree old_ref)
! {
! if (TREE_CODE (old_ref) == TARGET_MEM_REF)
! new_ref->common.ann = tree_ann (old_ref);
! else
! {
! get_tmr_ann (new_ref)->tag = get_ref_tag (old_ref);
! tmr_ann (new_ref)->original = unshare_and_remove_ssa_names (old_ref);
! }
}
/* Rewrites USE (address that is an iv) using candidate CAND. */
*************** static void
*** 4885,4900 ****
rewrite_use_address (struct ivopts_data *data,
struct iv_use *use, struct iv_cand *cand)
{
tree comp = unshare_expr (get_computation (data->current_loop,
use, cand));
block_stmt_iterator bsi = bsi_for_stmt (use->stmt);
! tree stmts;
! tree op = force_gimple_operand (comp, &stmts, true, NULL_TREE);
! if (stmts)
! bsi_insert_before (&bsi, stmts, BSI_SAME_STMT);
!
! rewrite_address_base (&bsi, use->op_p, op);
}
/* Rewrites USE (the condition such that one of the arguments is an iv) using
--- 4917,4930 ----
rewrite_use_address (struct ivopts_data *data,
struct iv_use *use, struct iv_cand *cand)
{
+ /* TODO -- produce TARGET_MEM_REF directly. */
tree comp = unshare_expr (get_computation (data->current_loop,
use, cand));
block_stmt_iterator bsi = bsi_for_stmt (use->stmt);
! tree ref = create_mem_ref (&bsi, TREE_TYPE (*use->op_p), comp);
! copy_ref_info (ref, *use->op_p);
! *use->op_p = ref;
}
/* Rewrites USE (the condition such that one of the arguments is an iv) using
*************** tree_ssa_iv_optimize (struct loops *loop
*** 5402,5428 ****
loop = loop->outer;
}
- /* FIXME. IV opts introduces new aliases and call-clobbered
- variables, which need to be renamed. However, when we call the
- renamer, not all statements will be scanned for operands. In
- particular, the newly introduced aliases may appear in statements
- that are considered "unmodified", so the renamer will not get a
- chance to rename those operands.
-
- Work around this problem by forcing an operand re-scan on every
- statement. This will not be necessary once the new operand
- scanner is implemented. */
- if (need_ssa_update_p ())
- {
- basic_block bb;
- block_stmt_iterator si;
- FOR_EACH_BB (bb)
- for (si = bsi_start (bb); !bsi_end_p (si); bsi_next (&si))
- update_stmt (bsi_stmt (si));
-
- update_ssa (TODO_update_ssa);
- }
-
- rewrite_into_loop_closed_ssa (NULL);
tree_ssa_iv_optimize_finalize (loops, &data);
}
--- 5432,5436 ----
Index: tree-ssa-operands.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree-ssa-operands.c,v
retrieving revision 2.76
diff -c -3 -p -r2.76 tree-ssa-operands.c
*** tree-ssa-operands.c 9 Apr 2005 16:43:28 -0000 2.76
--- tree-ssa-operands.c 12 Apr 2005 12:38:27 -0000
*************** static void note_addressable (tree, stmt
*** 135,140 ****
--- 135,141 ----
static void get_expr_operands (tree, tree *, int);
static void get_asm_expr_operands (tree);
static void get_indirect_ref_operands (tree, tree, int);
+ static void get_tmr_operands (tree, tree, int);
static void get_call_expr_operands (tree, tree);
static inline void append_def (tree *);
static inline void append_use (tree *);
*************** get_expr_operands (tree stmt, tree *expr
*** 1333,1338 ****
--- 1334,1343 ----
get_indirect_ref_operands (stmt, expr, flags);
return;
+ case TARGET_MEM_REF:
+ get_tmr_operands (stmt, expr, flags);
+ return;
+
case ARRAY_REF:
case ARRAY_RANGE_REF:
/* Treat array references as references to the virtual variable
*************** get_indirect_ref_operands (tree stmt, tr
*** 1714,1719 ****
--- 1719,1748 ----
get_expr_operands (stmt, pptr, opf_none);
}
+ /* A subroutine of get_expr_operands to handle TARGET_MEM_REF. */
+
+ static void
+ get_tmr_operands (tree stmt, tree expr, int flags)
+ {
+ tmr_ann_t ann = tmr_ann (expr);
+
+ /* First record the real operands. */
+ get_expr_operands (stmt, &TMR_BASE (expr), opf_none);
+ get_expr_operands (stmt, &TMR_INDEX (expr), opf_none);
+
+ /* MEM_REFs should never be killing. */
+ flags &= ~opf_kill_def;
+
+ if (TMR_SYMBOL (expr))
+ note_addressable (TMR_SYMBOL (expr), stmt_ann (stmt));
+
+ if (ann->tag)
+ add_stmt_operand (&ann->tag, stmt_ann (stmt), flags);
+ else
+ /* Something weird, so ensure we will be careful. */
+ stmt_ann (stmt)->has_volatile_ops = true;
+ }
+
/* A subroutine of get_expr_operands to handle CALL_EXPR. */
static void
Index: tree.c
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree.c,v
retrieving revision 1.472
diff -c -3 -p -r1.472 tree.c
*** tree.c 30 Mar 2005 21:34:29 -0000 1.472
--- tree.c 12 Apr 2005 12:38:27 -0000
*************** copy_node_stat (tree node MEM_STAT_DECL)
*** 441,446 ****
--- 441,449 ----
TREE_VISITED (t) = 0;
t->common.ann = 0;
+ if (code == TARGET_MEM_REF)
+ copy_mem_ref_info (t, node);
+
if (TREE_CODE_CLASS (code) == tcc_declaration)
DECL_UID (t) = next_decl_uid++;
else if (TREE_CODE_CLASS (code) == tcc_type)
*************** build4_stat (enum tree_code code, tree t
*** 2704,2715 ****
return t;
}
/* Backup definition for non-gcc build compilers. */
tree
(build) (enum tree_code code, tree tt, ...)
{
! tree t, arg0, arg1, arg2, arg3;
int length = TREE_CODE_LENGTH (code);
va_list p;
--- 2707,2748 ----
return t;
}
+ tree
+ build5_stat (enum tree_code code, tree tt, tree arg0, tree arg1,
+ tree arg2, tree arg3, tree arg4 MEM_STAT_DECL)
+ {
+ bool constant, read_only, side_effects, invariant;
+ tree t;
+
+ gcc_assert (TREE_CODE_LENGTH (code) == 5);
+
+ t = make_node_stat (code PASS_MEM_STAT);
+ TREE_TYPE (t) = tt;
+
+ side_effects = TREE_SIDE_EFFECTS (t);
+
+ PROCESS_ARG(0);
+ PROCESS_ARG(1);
+ PROCESS_ARG(2);
+ PROCESS_ARG(3);
+ PROCESS_ARG(4);
+
+ TREE_SIDE_EFFECTS (t) = side_effects;
+
+ TREE_THIS_VOLATILE (t)
+ = (code != TARGET_MEM_REF
+ && TREE_CODE_CLASS (code) == tcc_reference
+ && arg0 && TREE_THIS_VOLATILE (arg0));
+
+ return t;
+ }
+
/* Backup definition for non-gcc build compilers. */
tree
(build) (enum tree_code code, tree tt, ...)
{
! tree t, arg0, arg1, arg2, arg3, arg4;
int length = TREE_CODE_LENGTH (code);
va_list p;
*************** tree
*** 2741,2746 ****
--- 2774,2787 ----
arg3 = va_arg (p, tree);
t = build4 (code, tt, arg0, arg1, arg2, arg3);
break;
+ case 5:
+ arg0 = va_arg (p, tree);
+ arg1 = va_arg (p, tree);
+ arg2 = va_arg (p, tree);
+ arg3 = va_arg (p, tree);
+ arg4 = va_arg (p, tree);
+ t = build5 (code, tt, arg0, arg1, arg2, arg3, arg4);
+ break;
default:
gcc_unreachable ();
}
Index: tree.def
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree.def,v
retrieving revision 1.115
diff -c -3 -p -r1.115 tree.def
*** tree.def 9 Apr 2005 01:37:28 -0000 1.115
--- tree.def 12 Apr 2005 12:38:27 -0000
*************** DEFTREECODE (WITH_SIZE_EXPR, "with_size_
*** 935,940 ****
--- 935,948 ----
generated by the builtin targetm.vectorize.mask_for_load_builtin_decl. */
DEFTREECODE (REALIGN_LOAD_EXPR, "realign_load", tcc_expression, 3)
+ /* Low-level memory addressing. Operands are SYMBOL (static or global
+ variable), BASE (register), INDEX (register), STEP (integer constant),
+ OFFSET (integer constant). Corresponding address is
+ SYMBOL + BASE + STEP * INDEX + OFFSET. Only variations and values valid on
+ the target are allowed. */
+
+ DEFTREECODE (TARGET_MEM_REF, "target_mem_ref", tcc_reference, 5)
+
/*
Local variables:
mode:c
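
As an illustration only (not part of the patch): with the build5 builder and
the TMR_* accessors added to tree.h below, a reference whose address is
&a + i * 4 could be put together roughly like this, where "elem_type", "a"
(a VAR_DECL with a fixed address) and "i" (an SSA name holding the index)
are placeholders:

  /* TARGET_MEM_REF <symbol: a, base: NULL, index: i, step: 4, offset: NULL>,
     i.e. the access at address &a + i * 4; components that do not appear
     in the address are simply NULL_TREE.  */
  tree ref = build5 (TARGET_MEM_REF, elem_type,
                     a,                                     /* TMR_SYMBOL */
                     NULL_TREE,                             /* TMR_BASE */
                     i,                                     /* TMR_INDEX */
                     build_int_cst (integer_type_node, 4),  /* TMR_STEP */
                     NULL_TREE);                            /* TMR_OFFSET */

In the patch itself such nodes are produced by create_mem_ref in
tree-ssa-address.c rather than built by hand, so the sketch above only shows
the operand layout.
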
Index: tree.h
===================================================================
RCS file: /cvs/gcc/gcc/gcc/tree.h,v
retrieving revision 1.714
diff -c -3 -p -r1.714 tree.h
*** tree.h 9 Apr 2005 17:19:03 -0000 1.714
--- tree.h 12 Apr 2005 12:38:27 -0000
*************** struct tree_vec GTY(())
*** 1166,1175 ****
#define TREE_OPERAND(NODE, I) TREE_OPERAND_CHECK (NODE, I)
#define TREE_COMPLEXITY(NODE) (EXPR_CHECK (NODE)->exp.complexity)
- /* In INDIRECT_REF, ALIGN_INDIRECT_REF, MISALIGNED_INDIRECT_REF. */
- #define REF_ORIGINAL(NODE) TREE_CHAIN (TREE_CHECK3 (NODE, \
- INDIRECT_REF, ALIGN_INDIRECT_REF, MISALIGNED_INDIRECT_REF))
-
/* In a LOOP_EXPR node. */
#define LOOP_EXPR_BODY(NODE) TREE_OPERAND_CHECK_CODE (NODE, LOOP_EXPR, 0)
--- 1166,1171 ----
*************** struct tree_vec GTY(())
*** 1235,1240 ****
--- 1231,1243 ----
#define CASE_HIGH(NODE) TREE_OPERAND (CASE_LABEL_EXPR_CHECK (NODE), 1)
#define CASE_LABEL(NODE) TREE_OPERAND (CASE_LABEL_EXPR_CHECK (NODE), 2)
+ /* The operands of a TARGET_MEM_REF. */
+ #define TMR_SYMBOL(NODE) (TREE_OPERAND (TARGET_MEM_REF_CHECK (NODE), 0))
+ #define TMR_BASE(NODE) (TREE_OPERAND (TARGET_MEM_REF_CHECK (NODE), 1))
+ #define TMR_INDEX(NODE) (TREE_OPERAND (TARGET_MEM_REF_CHECK (NODE), 2))
+ #define TMR_STEP(NODE) (TREE_OPERAND (TARGET_MEM_REF_CHECK (NODE), 3))
+ #define TMR_OFFSET(NODE) (TREE_OPERAND (TARGET_MEM_REF_CHECK (NODE), 4))
+
/* The operands of a BIND_EXPR. */
#define BIND_EXPR_VARS(NODE) (TREE_OPERAND (BIND_EXPR_CHECK (NODE), 0))
#define BIND_EXPR_BODY(NODE) (TREE_OPERAND (BIND_EXPR_CHECK (NODE), 1))
*************** extern tree build3_stat (enum tree_code,
*** 2890,2895 ****
--- 2893,2901 ----
extern tree build4_stat (enum tree_code, tree, tree, tree, tree,
tree MEM_STAT_DECL);
#define build4(c,t1,t2,t3,t4,t5) build4_stat (c,t1,t2,t3,t4,t5 MEM_STAT_INFO)
+ extern tree build5_stat (enum tree_code, tree, tree, tree, tree, tree,
+ tree MEM_STAT_DECL);
+ #define build5(c,t1,t2,t3,t4,t5,t6) build5_stat (c,t1,t2,t3,t4,t5,t6 MEM_STAT_INFO)
extern tree build_int_cst (tree, HOST_WIDE_INT);
extern tree build_int_cst_type (tree, HOST_WIDE_INT);
*************** extern tree get_base_address (tree t);
*** 3975,3978 ****
--- 3981,3988 ----
/* In tree-vectorizer.c. */
extern void vect_set_verbosity_level (const char *);
+ /* In tree-ssa-address.c. */
+ extern tree tree_mem_ref_addr (tree, tree);
+ extern void copy_mem_ref_info (tree, tree);
+
#endif /* GCC_TREE_H */
Index: doc/c-tree.texi
===================================================================
RCS file: /cvs/gcc/gcc/gcc/doc/c-tree.texi,v
retrieving revision 1.72
diff -c -3 -p -r1.72 c-tree.texi
*** doc/c-tree.texi 5 Mar 2005 19:56:27 -0000 1.72
--- doc/c-tree.texi 12 Apr 2005 12:38:29 -0000
*************** This macro returns the attributes on the
*** 1712,1717 ****
--- 1712,1718 ----
@tindex EXACT_DIV_EXPR
@tindex ARRAY_REF
@tindex ARRAY_RANGE_REF
+ @tindex TARGET_MEM_REF
@tindex LT_EXPR
@tindex LE_EXPR
@tindex GT_EXPR
*************** meanings. The type of these expressions
*** 2103,2108 ****
--- 2104,2125 ----
type is the same as that of the first operand. The range of that array
type determines the amount of data these expressions access.
+ @item TARGET_MEM_REF
+ These nodes represent memory accesses whose address maps directly to
+ an addressing mode of the target architecture. The first argument
+ is @code{TMR_SYMBOL} and must be a @code{VAR_DECL} of an object with
+ a fixed address. The second argument is @code{TMR_BASE} and the
+ third one is @code{TMR_INDEX}. The fourth argument is
+ @code{TMR_STEP} and must be an @code{INTEGER_CST}. The fifth
+ argument is @code{TMR_OFFSET} and must be an @code{INTEGER_CST}.
+ Any of the arguments may be NULL if the appropriate component
+ does not appear in the address.  The address of the
+ @code{TARGET_MEM_REF} is determined in the following way.
+
+ @smallexample
+ &TMR_SYMBOL + TMR_BASE + TMR_INDEX * TMR_STEP + TMR_OFFSET
+ @end smallexample
+
@item LT_EXPR
@itemx LE_EXPR
@itemx GT_EXPR
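
To make the operand roles concrete (explanatory example only, not part of the
patch): for "static int a[100];" and an index "i", the access a[i + 3] on a
target with sym + reg * scale + disp addressing could, assuming 4-byte int,
be represented with

  TMR_SYMBOL = a, TMR_BASE = NULL, TMR_INDEX = i,
  TMR_STEP = 4, TMR_OFFSET = 12

which yields the address &a + i * 4 + 12 according to the formula above.
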
Index: doc/tree-ssa.texi
===================================================================
RCS file: /cvs/gcc/gcc/gcc/doc/tree-ssa.texi,v
retrieving revision 1.22
diff -c -3 -p -r1.22 tree-ssa.texi
*** doc/tree-ssa.texi 10 Apr 2005 17:26:03 -0000 1.22
--- doc/tree-ssa.texi 12 Apr 2005 12:38:29 -0000
*************** void f()
*** 632,637 ****
--- 632,643 ----
op2 -> var
compref : inner-compref
+ | TARGET_MEM_REF
+ op0 -> ID
+ op1 -> val
+ op2 -> val
+ op3 -> CONST
+ op4 -> CONST
| REALPART_EXPR
op0 -> inner-compref
| IMAGPART_EXPR