This is the mail archive of the
gcc-bugs@gcc.gnu.org
mailing list for the GCC project.
[Bug sanitizer/68065] Size calculations for VLAs can overflow
- From: "ch3root at openwall dot com" <gcc-bugzilla at gcc dot gnu dot org>
- To: gcc-bugs at gcc dot gnu dot org
- Date: Wed, 11 Nov 2015 03:21:27 +0000
- Subject: [Bug sanitizer/68065] Size calculations for VLAs can overflow
- Auto-submitted: auto-generated
- References: <bug-68065-4 at http dot gcc dot gnu dot org/bugzilla/>
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=68065
--- Comment #23 from Alexander Cherepanov <ch3root at openwall dot com> ---
On 2015-11-11 03:53, msebor at gcc dot gnu.org wrote:
> Another way is that the standard requires
> sizeof(excessively-large-vla-type) to overflow/wrap. That's how the
> implementations I've tested behave, including GCC. This could, of course, be
> said to be a manifestation of undefined behavior rather than a feature. Either
> way, the result is the same and the problem with it, as was pointed out in the
> WG14 discussion, is that it can lead to buffer overflow when the overflowed
> size of the VLA type is used to allocate memory on the heap and the number
> of elements in the VLA is used to write to that memory.
1. Yes, the practical problem is potential buffer overflows (examples
are in the description of this PR and in comment #3).
2. The practical problem is size calculation in general; it's not
limited to the sizeof operation. You don't need sizeof to create an
oversized automatic VLA (there is an example in the description).
3. IMHO, overflow in the sizeof operation is UB per C11 6.5p5, and
wrapping per C11 6.2.5p9 is not applicable (see comment #7).
4. From the POV of the standard, I don't see much difference between
VLAs and ordinary arrays on this question. AFAICT the standard doesn't
place limits on constructed types of any kind, and hence oversized types
are permitted by the standard. See comment #3 (or pr68107) for a
practical example of sizeof overflow with an array of known constant
size that works with the current GCC.
GCC chooses to prohibit oversized types when it can easily catch them,
and it fails compilation when it stumbles upon an oversized array of
known constant size (modulo pr68107). But is this a case of undefined
behavior, implementation-defined behavior, or something else?
5. The same goes for sizes of objects. There is an environmental limit
for "bytes in an object", but it's marked "(in a hosted environment
only)". So there is no such limit in the standard for a freestanding
implementation, right? But I doubt that you are supposed to be able to
create oversized arrays (either static or automatic) even in a
freestanding implementation.
6. It's well known that there can be problems with the amount of
automatic storage due to limited stack size. But the same is true for
static storage. Even in a hosted environment, and even if you stay
within the compiler's limits, there is no guarantee that your program
will run successfully. Try "char a[-1ul/2]; int main() { }". For me, it
compiles fine but says "Killed" when run :-) That is, the "execution
environment" part of the implementation failed.