Bug 94122 - Wrong optimization: reading value of a decimal FP variable changes its representation for optimizer
Summary: Wrong optimization: reading value of a decimal FP variable changes its representation for optimizer
Status: NEW
Alias: None
Product: gcc
Classification: Unclassified
Component: middle-end
Version: 10.0
Importance: P3 normal
Target Milestone: ---
Assignee: Not yet assigned to anyone
URL:
Keywords: wrong-code
Depends on:
Blocks:
 
Reported: 2020-03-10 13:39 UTC by Alexander Cherepanov
Modified: 2021-09-13 01:40 UTC
CC: 1 user

See Also:
Host:
Target:
Build:
Known to work:
Known to fail:
Last reconfirmed: 2020-03-10 00:00:00


Description Alexander Cherepanov 2020-03-10 13:39:34 UTC
Split from bug 94103, comment 1.

It seems the optimizer sometimes computes the representation of a variable from its value instead of tracking it directly. This is wrong when the value admits more than one representation.
(Given that the value is used, the representation should be valid, i.e. non-trapping.)

Example with decimal floating-point:

----------------------------------------------------------------------
#include <string.h>
#include <stdio.h>

int main()
{
    _Decimal32 x = 9999999; // maximum significand
    unsigned u;

    memcpy(&u, &x, sizeof u);   // dump the object representation of x
    printf("%#08x\n", u);

    ++*(unsigned char *)&x; // create non-canonical representation of 0
    (void)-x;               // use the value of x

    memcpy(&u, &x, sizeof u);   // dump the representation again
    printf("%#08x\n", u);
}
----------------------------------------------------------------------
$ gcc -std=c2x -pedantic -Wall -Wextra test.c && ./a.out
0x6cb8967f
0x6cb89680
$ gcc -std=c2x -pedantic -Wall -Wextra -O3 test.c && ./a.out
0x6cb8967f
0x32800000
----------------------------------------------------------------------
gcc x86-64 version: gcc (GCC) 10.0.1 20200305 (experimental)
----------------------------------------------------------------------

The unoptimized results are right; the optimized ones are wrong.
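
For reference, the two encodings seen in the output both represent the decimal value 0: 0x6cb89680 is the non-canonical one created by the byte increment (its significand, 10000000, exceeds the maximum 9999999), while 0x32800000 is the canonical encoding of 0 that the optimized build substitutes. A minimal sketch checking this, assuming the BID encoding of _Decimal32 that GCC uses on x86-64:

----------------------------------------------------------------------
#include <string.h>
#include <stdio.h>

int main(void)
{
    /* 0x32800000 is the canonical BID encoding of the value 0 (significand 0,
       biased exponent 101, i.e. 0E0).  0x6cb89680 has significand 10000000,
       which exceeds the maximum 9999999, so it is a non-canonical encoding
       that also denotes the value 0. */
    unsigned canonical = 0x32800000;
    unsigned noncanonical = 0x6cb89680;
    _Decimal32 a, b;

    memcpy(&a, &canonical, sizeof a);
    memcpy(&b, &noncanonical, sizeof b);

    printf("same value: %d\n", a == b);   /* expected: 1 */
    printf("same representation: %d\n",
           memcmp(&canonical, &noncanonical, sizeof canonical) == 0);  /* 0 */
}
----------------------------------------------------------------------

So the same value has two distinct object representations, and the optimized build replaces the stored bytes with a representation re-derived from the value.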
Comment 1 Richard Biener 2020-03-10 14:24:23 UTC
This works for me with GCC 9.  On trunk this is wrong FRE:

Value numbering stmt = _3 = MEM[(unsigned char *)&x];
Setting value number of _3 to 127 (changed)
Value numbering stmt = _4 = _3 + 1;
Match-and-simplified _3 + 1 to 128
RHS _3 + 1 simplified to 128
Setting value number of _4 to 128 (changed)
Value numbering stmt = MEM[(unsigned char *)&x] = _4;
No store match
Value numbering store MEM[(unsigned char *)&x] to 128
Setting value number of .MEM_12 to .MEM_12 (changed)
Value numbering stmt = x.2_5 = x;
Successfully combined 2 partial definitions
Setting value number of x.2_5 to 0 (changed)
Value numbering stmt = _13 = MEM <unsigned int> [(char * {ref-all})&x];
Setting value number of _13 to 847249408 (changed)

possibly caused by real_{to,from}_target doing normalization during
encoding/decoding? (that would IMHO be wrong?)