This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.



Re: Designs for better debug info in GCC


On Nov 12, 2007, Mark Mitchell <mark@codesourcery.com> wrote:

> Alexandre Oliva wrote:
>> On Nov 12, 2007, Mark Mitchell <mark@codesourcery.com> wrote:
>> 
>>> Clearly, for some users, incorrect debugging information on optimized
>>> code is not a terribly big deal.  It's certainly less important to many
>>> users than that the program get the right answer.  On the other hand,
>>> there are no doubt users where, whether for debugging, certification, or
>>> whatever, it's vitally important that the debugging information meet
>>> some standard of accuracy.
>> 
>> How is this different from a port of the compiler for a CPU that few
>> people care about?  That many users couldn't care less whether the
>> compiler output on that port works at all doesn't make it any less of
>> a correctness issue.

> You're again trying to make this a binary-value question.  Why?

Because in my mind, once we agree there is a bug, a fix for it is
easier to swallow even if it makes the compiler spend more resources,
whereas a mere quality-of-implementation issue is held to quite
different standards.

> Lots of things are "a correctness issue".  But, some categories tend to
> be worse than others.  There is certainly a qualitative difference in
> the severity of a defect that results in the compiler generating code
> that computes the wrong answer and a defect that results in the compiler
> generating wrong debugging information for optimized code.

That depends a lot on whether your application uses the incorrect
compiler output or not.

If the compiler produces incorrect code, but your application doesn't
ever exercise that error, would you argue for leaving the bug unfixed?

These days, applications are built that depend on the correctness of
the compiler output in certain sections that historically weren't all
that functionally essential, namely, the meta-information sections
that we got used to calling debug information.

I.e., these days, applications exercise the "code paths" that formerly
weren't exercised.  This exposes bugs in the compiler.  Worse: bugs
that we have no infrastructure to test, and that we don't even agree
are actual bugs, because the standards that specify the "ISA and ABI"
in which such code ought to be output are apparently regarded as
irrelevant by some.

They regard them as irrelevant only because their perception is shaped
by a single use of such information, one that involves a great deal of
human interaction, and humans are able to tolerate and adapt to error
conditions.

But as more and more uses of such information are actual production
systems rather than humans behind debuggers, such errors can no longer
be tolerated, because when the debug output is wrong, the system
breaks.  It's that simple.  It's really no different from any other
compiler bug.

> Let's put it this way: if a user has to choose whether the compiler will
> (a) generate code that runs correctly for their application, or (b)
> generate debugging information that's accurate, which one will they choose?

(a), for sure.  But bear in mind that, when the application's correct
execution depends on the correctness of debugging information, then
(a) implies (b).

> But what's the point of this argument?  It sounds like you're trying to
> argue that debug info for optimized code is a correctness issue, and
> therefore we should work as hard on it as we would on code-generation
> bugs.

I'm working hard on it.  I'm not asking others to join me.  I'm just
asking people to understand how serious a problem it is, and that,
even though fixing these bugs may have a cost, it's bugs we're talking
about: incorrect compiler output that causes applications to break,
not a mere inconvenience for debuggers.

> I'd like better debugging for optimized code, but I'm certainly more
> concerned that (a) we generate correct, fast code when optimizing,
> and (b) we generate good debugging information when not optimizing.

This just goes to show that you're not concerned with the kind of
application that *depends* on correct debug information to function.
And it's not debuggers I'm talking about here.

That's a reasonable point of view.  Maybe the GCC community can decide
that the debug information it produces is just for (poor) consumption
by debuggers, and that we have no interest in *complying* with the
debug information standards that document the debug information that
other applications depend on.  And I mean *complying* with the
standards, rather than merely outputting whatever seems easy and
approximately close to what the standard mandates.

I just hope the GCC community doesn't make this decision, and that it
accepts fixes for these bugs even when they impose some overhead,
especially when such overhead can easily be avoided with command-line
options, or is even disabled by default (since debug info is not
emitted by default, after all).

-- 
Alexandre Oliva         http://www.lsd.ic.unicamp.br/~oliva/
FSF Latin America Board Member         http://www.fsfla.org/
Red Hat Compiler Engineer   aoliva@{redhat.com, gcc.gnu.org}
Free Software Evangelist  oliva@{lsd.ic.unicamp.br, gnu.org}

