[Bug c/28921] vector of a typedef applies the vector to the innermost type instead of erroring/warning out that vector does not apply
egallager at gcc dot gnu.org
gcc-bugzilla@gcc.gnu.org
Tue Jul 25 00:21:00 GMT 2017
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=28921
Eric Gallager <egallager at gcc dot gnu.org> changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|UNCONFIRMED                 |NEW
   Last reconfirmed|                            |2017-07-25
                 CC|                            |egallager at gcc dot gnu.org
     Ever confirmed|0                           |1
      Known to fail|                            |8.0
--- Comment #3 from Eric Gallager <egallager at gcc dot gnu.org> ---
(In reply to Andrew Pinski from comment #0)
> Testcase:
> typedef char *cptr;
>
> char *a;
>
> __attribute__ ((vector_size(16))) cptr t;
>
> int f(void)
> {
>
> __attribute__ ((vector_size(16))) int t1 =
> (__attribute__ ((vector_size(16))) int )t;
> }
>
> We get an error about converting t to a vector int, but t looks to me like a
> vector of a char pointer. This happens with both the C and C++ front-ends.
Confirmed.