This is the mail archive of the gcc-help@gcc.gnu.org mailing list for the GCC project.
Re: std::list.sort() blocks on very large lists (>12GB) GCC 4.0.1 Mac OS X 64-bit
- From: Nicholas Sherlock <n dot sherlock at gmail dot com>
- To: gcc-help at gcc dot gnu dot org
- Date: Sat, 18 Jul 2009 21:48:28 +1200
- Subject: Re: std::list.sort() blocks on very large lists (>12GB) GCC 4.0.1 Mac OS X 64-bit
- References: <20090716161254.1zb8i5qo00gg4kw4@webmail.fhcrc.org>
jpitt@fhcrc.org wrote:
> I'm experiencing a weird problem that makes me wonder if there is a
> problem with the std::list.sort member function when used on very
> large lists. I have a very large list (~20GB) made up of a class
> (382 bytes each) of DNA sequence data. When the list is on the order
> of 12GB it sorts fine, but if it gets up into the 17GB range, sort()
> seems to block when I call it. The list has already been built, so I
> don't think it's a malloc issue. I also don't think it's a complexity
> problem, because the 12GB list sorts fine (it takes a minute or so).
> So I'm wondering whether the std sort method fails to properly handle
> the extra memory available in 64-bit mode? Has anyone experienced
> this?
Double-check the comparison function you're using. If it doesn't
impose a strict weak ordering, as the C++ standard requires, sort()
can get into an infinite loop. Here's some info on the matter:
http://blogs.msdn.com/oldnewthing/archive/2009/05/08/9595334.aspx
http://blogs.msdn.com/oldnewthing/archive/2003/10/23/55408.aspx
Cheers,
Nicholas Sherlock