[Bug tree-optimization/93843] [10 Regression] wrong code at -O3 on x86_64-linux-gnu

jakub at gcc dot gnu.org gcc-bugzilla@gcc.gnu.org
Mon Feb 24 11:24:00 GMT 2020


https://gcc.gnu.org/bugzilla/show_bug.cgi?id=93843

Jakub Jelinek <jakub at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |rguenth at gcc dot gnu.org

--- Comment #3 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
The problem is that we have a signed char -> short int cast, and
vectorizable_conversion sees vectype_in as a 2xQI vector with HImode TYPE_MODE
and vectype_out as a 2xHI vector with SImode TYPE_MODE.  As both vector types
have 2 units, modifier is NONE and we create a NOP_EXPR from the 2xQI vector
to the 2xHI vector, where we actually need to sign-extend both vector elements.
The big question is whether such a NOP_EXPR is valid GIMPLE (I'd hope not).
If it is, we'd need to tweak the expansion so that it emits something that
actually performs the per-element extensions; if it is not, we need to punt or
handle it some different way in vectorizable_conversion.  Because right now we
actually emit just a scalar sign-extension, which means the first element
contains both original elements and the second element is just 0 or -1,
depending on the sign of the original second element.
