This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.
Re: [RFC] type safe trees
- From: Phil Edwards <phil at codesourcery dot com>
- To: Andrew Pinski <pinskia at physics dot uc dot edu>
- Cc: gdr at acm dot org, Mark Mitchell <mark at codesourcery dot com>, Nathan Sidwell <nathan at codesourcery dot com>, gcc <gcc at gcc dot gnu dot org>, zack at codesourcery dot com
- Date: Thu, 24 Jun 2004 21:34:18 -0400
- Subject: Re: [RFC] type safe trees
- References: <35125.::ffff:email@example.com> <200406241641.i5OGfu131772@tin.geop.uc.edu>
- Reply-to: gcc at gcc dot gnu dot org
On Thu, Jun 24, 2004 at 12:41:56PM -0400, Andrew Pinski wrote:
> 100% disagree here. If we are going to say we require a C++ compiler,
> why stop at using no templates, exceptions, etc.? Go all out, use them
> all, and get worse performance than we have now.
You're speaking facetiously, but one of the disagreements that
keeps being raised (here and on IRC) is that if we start using any part
of C++, even if only a "stricter C with classes," we'll inevitably start
using more and more, ending up with a bizarre template metaprogramming
version of reload.
Can we all agree to just flat-out kill that nonsense? It's a causal
(not "casual") form of the slippery-slope argument, which is typically wrong.
It implies that nobody would /notice/ if other facets of the language
suddenly appeared in the compiler sources. It implies that we can't use
any part of C++ without using all of it, which any decent C++ programmer
should find a little insulting, as the language was expressly designed to
support using only parts of it.
It /is/ possible to pick a subset and restrict ourselves to it. If somebody
is scared that a developer will violate that subset, then that somebody
can participate more in patch review. :-) Or at the least, help design
the subset.
> But seriously, we have no need for C++ in GCC at this moment. Static-only
> trees are nice, but you can still cheat and get around them; with runtime
> trees you have no way to cheat and get around them. If something does not
> exist, it turns into an ICE and you will see the problem right away.
Which is a horrible, horrible way of catching errors, compared to a
static compile-time check.
I would therefore like to posit that computing's central challenge, viz. "How
not to make a mess of it," has /not/ been met.
- Edsger Dijkstra, 1930-2002