gcc seems to wrongly infer provenance of a pointer expression of the form
`p + (q1 - q2)` when the following conditions hold:

- the provenance of the pointer `p` couldn't be tracked;
- the provenance of `q1` or `q2` is known;
- `q1 - q2` couldn't be simplified to get rid of pointers.

----------------------------------------------------------------------
#include <stdio.h>

__attribute__((noipa)) // imagine it in a separate TU
static int *opaque(int *p) { return p; }

int main()
{
    static int x, y;
    int *r = opaque(&x) + (opaque(&y) - &y);
    x = 1;
    *r = 2;
    printf("x = %d\n", x);
}
----------------------------------------------------------------------
$ gcc -std=c11 -pedantic -Wall -Wextra test.c && ./a.out
x = 2
$ gcc -std=c11 -pedantic -Wall -Wextra -O3 test.c && ./a.out
x = 1
----------------------------------------------------------------------
gcc x86-64 version: gcc (GCC) 10.0.0 20191230 (experimental)
----------------------------------------------------------------------
The problem is similar to pr49330.

Analysis by Alexander Monakov via bug 49330, comment 29: "It's a separate
issue, and it's also a regression, gcc-4.7 did not miscompile this. The
responsible pass seems to be RTL DSE."
It's actually the same as PR49330. One of the reasons I've not even tried to
do major surgery on GIMPLE is that we can't seem to agree on a course of
action on the RTL blunder called find_base_term and base_alias_check, which
makes any "fixing" on the GIMPLE side pointless.

*** This bug has been marked as a duplicate of bug 49330 ***