This is the mail archive of the gcc-help@gcc.gnu.org mailing list for the GCC project.



Re: std::list.sort() blocks on very large lists (>12GB) GCC 4.0.1 Mac OS X 64-bit


jpitt@fhcrc.org writes:

> I'm experiencing a weird problem that makes me wonder if there is a
> problem with the std::list::sort member function when used on very
> large lists. I have a very large list (~20 GB) made up of a class
> (382 bytes each) holding DNA sequence data. When the list is on the
> order of 12 GB it sorts fine, but when it gets up into the 17 GB
> range the call to sort() seems to block. The list has already been
> built, so I don't think it's a malloc issue. I also don't think it's
> a complexity problem, because the 12 GB list sorts fine (it takes a
> minute or so). So I'm wondering: does the std sort method not handle
> the extra memory available in 64-bit mode properly? Has anyone
> experienced this?

Without more information, my first guess would be that you are exceeding
the available working set on your machine and thrashing.
std::list::sort is written on the assumption that the entire list fits
in the working set.  Sorting an amount of data significantly larger
than available RAM generally requires different techniques for
efficiency.  For example, manually splice() the list into sublists
small enough to sort efficiently, and then merge() the sorted results
back together.

Ian
