This is the mail archive of the gcc-help@gcc.gnu.org mailing list for the GCC project.
Re: Writing past the 2GB file size boundary on 32-bit systems
- From: Nick Maclaren <nmm1 at cus dot cam dot ac dot uk>
- To: gcc-help at gcc dot gnu dot org
- Date: Sun, 06 Jan 2008 11:10:51 +0000
- Subject: Re: Writing past the 2GB file size boundary on 32-bit systems
"D. R. Evans" <doc.evans@gmail.com> wrote:
>
> Is there a clear description anywhere of how to use C++ streams and
> ordinary C FILE* functions so that they don't fail when an attempt to write
> to a file goes past the 2GB boundary?
No.
I can't begin to describe how messed up this area is. I could start
with K&R C and AT&T Unix (which had excuses), ISO C and POSIX (which
didn't) and carry on from there. The one thing that I can say is
that the compiler (i.e. gcc, sensu stricto) has nothing to do with
the matter.
This is entirely a library and operating system issue and, by the
sound of it, you are knackered. You can try creating such a file
using the underlying POSIX calls and see if that works. You may
need to specify O_LARGEFILE in the open call.
Regards,
Nick Maclaren,
University of Cambridge Computing Service,
New Museums Site, Pembroke Street, Cambridge CB2 3QH, England.
Email: nmm1@cam.ac.uk
Tel.: +44 1223 334761 Fax: +44 1223 334679