In the following test program:

------------------------------
#define NB_DEV 1

extern unsigned int max;

unsigned long left[NB_DEV];
unsigned long right[NB_DEV];

void foo()
{
    unsigned int i;

    for (i = 1; i < max; i++)
        left[i] = right[i-1];
}
------------------------------

compiled with:

$(CXX) -Werror -Wall -O2 -c reprod.cc

g++ gives the following warning/error:

reprod.cc: In function 'void foo()':
reprod.cc:13:13: error: array subscript is above array bounds [-Werror=array-bounds]
  left[i] = right[i-1];
  ~~~~~~^
cc1plus: all warnings being treated as errors
make: *** [Makefile:4: all] Error 1

While there _could_ be an array overflow, g++ cannot know for sure, because the loop boundary 'max' is an external variable. The code is perfectly fine when max == 1: in that case the loop does nothing.

This is a reduced version of real code where the arrays left and right are dimensioned to some maximum value NB_DEV, and 'max' is at most NB_DEV but possibly smaller. We are thus sure there is no array overflow.

Going back to the reproduction code above: if you change NB_DEV to 2 (for example), no warning is emitted, even though there could still be an overflow, e.g. when max == 5. In my opinion, no warning should be emitted here either way, because g++ cannot say for sure that there is a problem.

The same problem occurs when this is compiled as C rather than C++. It appears only with -O2 or -O3, not with -O1. It was reproduced with gcc 6.4.0 (x86_64), gcc 6.3.0 (armeb), gcc 5.4.0 (armeb) and gcc 4.9.4 (armeb).
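A workaround that avoids the warning in code of this shape is to clamp the loop bound to the array size, so the compiler can prove every subscript is in range. This is a hand-written sketch, not part of the original report; to make it link and run on its own, 'max' is defined here instead of being extern.

```c
#define NB_DEV 1

/* extern in the original report; defined here so the sketch links on its own */
unsigned int max;

unsigned long left[NB_DEV];
unsigned long right[NB_DEV];

void foo(void)
{
    /* clamp the loop bound to the array size; the subscripts are then
       provably within [0, NB_DEV), which matches the real code's
       invariant that max <= NB_DEV */
    unsigned int n = max < NB_DEV ? max : NB_DEV;
    unsigned int i;

    for (i = 1; i < n; i++)
        left[i] = right[i - 1];
}
```

The clamp is a no-op at run time given the stated invariant, but it encodes that invariant in a form the optimizer's range analysis can use.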
Confirmed. It doesn't look like the assignment to left[i] affects the determination of the maximum number of iterations of the loop during unrolling. This is also a missed optimization opportunity since the whole loop could be eliminated.
This is the simple case of GCC optimizing the access to a constant:

<bb 2> [50.00%]:
  max.0_11 = max;
  if (max.0_11 > 1)
    goto <bb 3>; [50.00%]
  else
    goto <bb 4>; [50.00%]

<bb 3> [25.00%]:
  _13 = right[0];
  left[1] = _13;

<bb 4> [50.00%]:
  return;

and we warn for the case of max > 1.

Similarly we warn for

int bar ()
{
  return left[2];
}

even if we can't prove that bar() is actually executed.

We could change the warning to have a "may be above array bounds" form for your case, but that wouldn't handle the bar() case.
(In reply to Richard Biener from comment #2)
> We could change the warning to have a "may be above array bounds" form
> for your case but that wouldn't handle the bar() case.

The problem with warnings about potential-but-not-definite issues is that projects that compile with '-Wall -Werror' rely on zero warnings as a quality gate. If some warnings are false positives, that strategy no longer works: the project fails to compile even though the code is perfectly fine.

You'd need a way to tell gcc that this code is fine, or such cases would have to go into a separate warning category that is not included in -Wall or that can be disabled explicitly.
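One existing way to "tell gcc that this code is fine" at a specific site is GCC's diagnostic pragmas, which can suppress a single warning category around a region of code. A sketch applied to the reported loop (whether this actually silences a warning emitted from the middle end under -O2 can depend on the GCC version, so treat it as an option to try, not a guaranteed fix):

```c
#define NB_DEV 1

/* extern in the original report; defined here so the sketch links on its own */
unsigned int max;

unsigned long left[NB_DEV];
unsigned long right[NB_DEV];

void foo(void)
{
    unsigned int i;

/* suppress only -Warray-bounds, only for this loop */
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Warray-bounds"
    for (i = 1; i < max; i++)
        left[i] = right[i - 1];
#pragma GCC diagnostic pop
}
```

The push/pop pair keeps the suppression local, so the warning stays active for the rest of the translation unit.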
A simpler test case is this (which is analogous to what the loop is transformed into):

$ cat x.c && gcc -O2 -S -Wall x.c
unsigned left[1];
unsigned long right[1];

void f (unsigned i)
{
  if (i)
    left[i] = right[i - 1];
}
x.c: In function ‘f’:
x.c:7:11: warning: array subscript [0, 0] is outside array bounds of ‘unsigned int[1]’ [-Warray-bounds]
   left[i] = right[i - 1];
   ~~~~^~~

Here it's even more obvious that the warning is wrong. It seems to me that the whole if statement could either be eliminated or its body replaced by a trap, because the assignment in it is undefined. That would eliminate the loop (and with it also the warning).
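The "replace the body by a trap" idea suggested above would transform f into something like the following hand-written sketch (this is an illustration of the proposed transformation, not what GCC actually emits): with arrays of size 1, any execution of the branch performs an out-of-bounds store, so the store is undefined and could be replaced wholesale.

```c
void f (unsigned i)
{
  /* if i != 0, the original store left[i] = right[i - 1] would index
     past the end of size-1 arrays, which is undefined behavior; the
     whole body can therefore be replaced by a trap */
  if (i)
    __builtin_trap();
}
```

With the body reduced to a trap (or removed entirely), nothing is left for -Warray-bounds to warn about.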