Note: initially found on gcc 4.3.2, confirmed on 4.6.0 20100924 from svn.

Consider the following program:

<code>
/* test.c */
#include <assert.h>
#include <inttypes.h>
#include <stddef.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    printf("ptrdiff_t max: %ju, size_t max: %ju\n",
           (uintmax_t) PTRDIFF_MAX, (uintmax_t) SIZE_MAX);
    assert(argc > 1);
    size_t size = atoll(argv[1]);
    printf("requested array size: %zu\n", size);
    assert(size > 0);
    uint16_t *array = malloc(size * sizeof(*array));
    assert(array != NULL);
    printf("array one-past-end/start difference: %td\n",
           &array[size] - &array[0]);
}
</code>

$ gcc -std=c99 -pedantic -Wall -Wextra test.c
$ ./a.out 1200000000
ptrdiff_t max: 2147483647, size_t max: 4294967295
requested array size: 1200000000
array one-past-end/start difference: -947483648

The output "-947483648" violates the C99 standard; it should be "1200000000". This program was compiled and run on an IA-32 host with 2.5 GiB of memory. The pointer returned by the successful call to malloc() points to an array of 1200000000 uint16_t's. In the present case, the number 1200000000 is smaller than PTRDIFF_MAX and thus representable by the ptrdiff_t type. Hence, by the C99 standard, 6.5.6p9, the expression &array[size] - &array[0] above is defined to have type ptrdiff_t and value 1200000000.

Note that if one replaced uint16_t with char in the above code and called the program with argument 2400000000 (a number larger than PTRDIFF_MAX), the behaviour would be undefined. I therefore suspect that, internally, gcc first calculates the value of &array[size] - &array[0] as if array had type pointer-to-char, then erroneously interprets the result as a negative 32-bit two's-complement signed integer, and finally divides it by 2 (that is, sizeof(uint16_t)) with a signed integer division.

Best regards, Alexander
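The suspected internal computation can be modelled in a few lines of C. This is only a hypothetical sketch of what the compiler appears to do, not its actual implementation; the function name suspected_diff is invented for illustration:

<code>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical model of the suspected miscompilation: the byte
 * difference is computed as if through char pointers, reinterpreted
 * as a signed 32-bit value, and then divided by the element size
 * with a signed division. */
int32_t suspected_diff(uint32_t nelems, uint32_t elem_size)
{
    uint32_t byte_diff = nelems * elem_size; /* 2400000000, fits in uint32_t */
    int32_t as_signed = (int32_t) byte_diff; /* implementation-defined;
                                                wraps on gcc/IA-32 */
    return as_signed / (int32_t) elem_size;  /* signed division */
}

int main(void)
{
    /* Reproduces the bogus value from the report. */
    printf("%d\n", suspected_diff(1200000000u, 2u)); /* prints -947483648 */
    return 0;
}
</code>

Indeed, 2400000000 reinterpreted as a signed 32-bit integer is -1894967296, and -1894967296 / 2 = -947483648, which matches the observed output exactly.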
While experimenting a bit more, I found that gcc normally does not let me create objects whose size exceeds PTRDIFF_MAX; for objects of size at most PTRDIFF_MAX, the bug cannot be triggered. The only function I found that does create such large objects is malloc(). Presumably, its companions calloc() and realloc() do so as well. In this light, the best fix for this bug seems to be for malloc() and its companions to simply return NULL upon a request for an object whose size exceeds PTRDIFF_MAX.
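The proposed fix amounts to a size check in front of the allocator. A minimal sketch, where xmalloc_checked is a hypothetical wrapper name rather than any real libc API:

<code>
#include <stddef.h>
#include <stdint.h>
#include <stdlib.h>

/* Hypothetical wrapper implementing the proposed fix: refuse any
 * allocation larger than PTRDIFF_MAX, so that the difference of two
 * pointers into the object always fits in ptrdiff_t. */
void *xmalloc_checked(size_t size)
{
    if (size > (size_t) PTRDIFF_MAX)
        return NULL;
    return malloc(size);
}
</code>

With such a wrapper, the original test program's request for 1200000000 uint16_t's (2400000000 bytes, which exceeds PTRDIFF_MAX on IA-32) would fail the assert(array != NULL), turning a silent overflow into a visible allocation failure.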
https://gcc.gnu.org/ml/gcc/2011-08/msg00221.html