This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.



cleaning up the toolchain



I'm sorry this post is a little bit non-technical -- but it does
pertain to the state of the source.

For many years, it has seemed to me (and I believe to at least some of
the GCC and toolchain hackers) that the GNU development tools are
suffering from slow but unrelenting code-rot.  The code is large and
keeps getting larger.  The comments and external documentation are
weak and get better only with excruciating slowness.  I'm not a
compiler engineer, but I've heard (vague) rumours from some compiler
engineers that it's getting progressively harder for the GNU tools to
keep up with other tool-sets, partly as a result of the code-rot.

Wasn't there, at one time, some talk of doing a large clean-up of GCC,
the binutils, and GDB?

In my experience, large-scale clean-ups of code are hard to do
incrementally, in the course of other work.  I also have the
impression that all of the companies who do most of the work on these
tools are constrained to work only on specific projects for which
there are customer contracts -- there isn't money for anything else.

So my questions and suggestions are these: Is there any consensus
among the hackers that a massive project dedicated simply to cleaning
up the existing code would be a Good Thing?  If there is such a
consensus, can it be reduced to the form of a project plan?  If
such a project plan came into existence, is there likely to be a
set of paying customers who would each chip in a fraction of the
budget?

-t

