This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.
Re: Good news, bad news on the repository conversion
- From: Janus Weil <janus at gcc dot gnu dot org>
- To: "Eric S. Raymond" <esr at thyrsus dot com>
- Cc: David Edelsohn <dje dot gcc at gmail dot com>, GCC Development <gcc at gcc dot gnu dot org>, fallenpegasus at gmail dot com
- Date: Mon, 9 Jul 2018 18:53:36 +0200
- Subject: Re: Good news, bad news on the repository conversion
- References: <20180709002754.962F43A4AA7@snark.thyrsus.com> <CAKwh3qgKjXYfp8RWLYOOR8h6zbbtw-F+gya1KuDnmLf=54j5eA@mail.gmail.com> <20180709101628.GA19774@thyrsus.com> <CAGWvny=-NdGDwKC2XRQCNJ4YZ7P_rVSnbcZ5eN==9zBpqffVXg@mail.gmail.com> <20180709163542.GA15706@thyrsus.com>
2018-07-09 18:35 GMT+02:00 Eric S. Raymond <esr@thyrsus.com>:
> David Edelsohn <dje.gcc@gmail.com>:
>> > The truth is we're near the bleeding edge of what conventional tools
>> > and hardware can handle gracefully. Most jobs with working sets as
>> > big as this one's do only comparatively dumb operations that can be
>> > parallelized and thrown on a GPU or supercomputer. Most jobs with
>> > the algorithmic complexity of repository surgery have *much* smaller
>> > working sets. The combination of both extrema is hard.
>>
>> If you come to the conclusion that the GCC Community could help with
>> resources, such as the GNU Compile Farm or paying for more RAM, let us
>> know.
>
> 128GB of DDR4 registered RAM would allow me to run conversions with my
> browser up, but be eye-wateringly expensive. Thanks, but I'm not
> going to yell for that help.
I for one would certainly be happy to donate some spare bucks towards
beastie RAM if it helps to get the GCC repo converted to git in a
timely manner, and I'm sure there are other GCC
developers/users/sympathizers who'd be willing to join in. So, where
do we throw those bucks?
Cheers,
Janus