Re: patch rereview requested for PRs 6860, 10467 and 11741


[I added the gcc list since this has now gotten far beyond the issue of a
specific patch.]

    > Again, I'd argue the other way around: it's precisely because of manpower
    > limitations that we have to be more careful than usual to ensure the high
    > quality and hence maintainability of GCC.

    Even if GCC is wonderfully maintainable, a user will perceive that the
    quality of the product is poor if it is unusable.  Quality and
    maintainability are separate issues.  With an infinite amount of
    resources, a poorly maintainable design may still be perceived as
    having high quality if it meets user expectations.

Right.  But that's precisely the issue: a poorly maintainable design requires
much more effort to modify than one that has good software engineering
attributes, such as clear structure and comments.

    Quality is achieved partly through testing.  If we can't do testing
    because the product is broken, I can't see how we can ensure a high
    quality product.

Sure, but I don't see your point here.  Nobody (certainly not me) is arguing
in favor of buggy code or of not fixing bugs.

What I'm saying is that a reviewer of a patch is entitled to assume that the
submitter has met the guidelines for a patch and has properly tested it, to
ensure both that it accomplishes its goal (be it a new feature, a new
optimization, a bugfix, a code cleanup, or whatever) and that it causes no
regressions.  That means that reviewers are free to concentrate on quality
(software engineering) issues.  I don't understand what is controversial
about that claim.

    We always had defined deliverables, a limited budget and a schedule
    that had to be met.

The GCC project certainly has a limited budget, but I don't agree that it has
defined deliverables or a schedule as anything more than goals.

    Just because GCC is developed by volunteers, don't assume an infinite-
    time zero-cost model for its development.  

Indeed not.  That's why quality is so important.

    Remember the gcc/egcs split.  GCC was supposed to be the high-quality
    stable release, and egcs the experimental development release.  Which
    model worked better?

I don't think it's fair to characterize egcs as "low-quality", as you seem to
be doing.  There were many differences between the two development
methodologies apart from quality and level of
experimentation.  The current GCC development model is significantly
different from either of those.  I don't think it's at all meaningful to
speak of one model as working "better" than the other since history has shown
that *neither* was workable in the long term.

    This is all about risk management and setting achievable goals within
    the time frame expected for any given release.

I'm still not sure what point you're making: are you suggesting that the
pressure of getting a feature in before a "release date" should justify
installing the feature even if the implementation does not meet the required
quality?  If that's not what you mean, can you be more specific?

