This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.



Re: Incorrect bitfield aliasing with Tree SSA


> How does this get a different result for trees than RTL?
> 
> As I've explained, we rely on the property of the TBAA forest that, given
> 
>          struct foo (set 1)
>         /                  \
>   int :31 (set 2)     short :31 (set 3)
> 
> the sets for int :31 and short :31 are strict subsets of the set for struct foo.
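
(For concreteness, a hypothetical C analogue of the record in that diagram
would look something like the code below.  The field widths are made up,
since 31-bit short bitfields are not valid C and the original report was
about Ada records; this only illustrates the kind of code at issue.)

/* Hypothetical analogue of the structure in the diagram.  */
struct foo {            /* alias set 1                        */
  int i : 3;            /* accesses get a subset of set 1     */
  short s : 5;          /* likewise                           */
};

int
f (struct foo *p, struct foo *q)
{
  p->i = 1;             /* p and q may point to the same object, so this   */
  q->s = 2;             /* store and the previous one can touch the same   */
                        /* byte (bitfield stores are read-modify-write)    */
  return p->i;          /* so this load must not be moved above q->s = 2   */
}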

Where I'm lost here is why you need to do anything whatsoever!  If you ask
for the alias set of something, get_alias_set *already* does the right
thing.  Or is the issue that you're calling get_alias_set on a FIELD_DECL?
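
(For reference, a query at the tree level looks roughly like the sketch
below.  The wrapper function and its name are invented here just to show
the calls; get_alias_set and alias_sets_conflict_p are the interfaces in
alias.c, though the exact types in the sources of this era may differ.)

/* Hypothetical helper: ask for the alias sets of two memory references
   (e.g. a COMPONENT_REF for a bitfield and an INDIRECT_REF for the whole
   object) and let alias.c decide whether they may conflict.  */
static int
refs_may_conflict_p (tree ref1, tree ref2)
{
  HOST_WIDE_INT set1 = get_alias_set (ref1);
  HOST_WIDE_INT set2 = get_alias_set (ref2);

  /* Set 0 conflicts with everything; subset relationships recorded
     with record_alias_subset are honored here as well.  */
  return alias_sets_conflict_p (set1, set2);
}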

I'm not sure what a "TBAA forest" is, but keep in mind that, at least in
Ada, we have many different types (meaning different tree nodes) that have
the same alias set and we really do mean that they are to conflict.
This is why it's a concern to me if we're not uniformly using the same
functions at the tree and RTL level to compute alias sets.

But there's also an implementation issue: where do you propose to *store*
this fake alias set?  There is no such field as DECL_ALIAS_SET.

