Merge cpplib and front end hashtables, part 1

Neil Booth neil@daikokuya.demon.co.uk
Wed May 16 15:57:00 GMT 2001


Daniel Berlin wrote:-

> 1. I'll just move it into a separate structure, containing the objfile
>    they came from (i.e. whose debug info we got them from), and the decl
>    line.  Then, we just process them all until we hit the place we are
>    currently in.

That seems to me to be the only way, with a possible shortcut of
keeping in memory those defined macros that are never undefined.  But
if you're going to use cpp_destroy (), that defeats that idea.

The reason memory isn't freed on #undef is that macros are all stored
in a kind of obstack - it's fast, and for a single compilation it
saves space.  But freeing something in the middle of an obstack
doesn't really work.  It would be possible to add a function that
tells cpplib whether to store future definitions in the obstack as
usual (for the permanent, unchanging ones) or to malloc them
individually (for the ones you want freed).  Then that memory issue
would conveniently go away.
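
Here's a standalone sketch with GNU obstacks - not cpplib's actual
code - showing why freeing in the middle is off the table:

  /* An obstack only frees back to a mark, taking everything
     allocated after that mark with it.  */
  #include <obstack.h>
  #include <stdlib.h>

  #define obstack_chunk_alloc malloc
  #define obstack_chunk_free  free

  int
  main (void)
  {
    struct obstack ob;
    obstack_init (&ob);

    char *foo = obstack_copy0 (&ob, "FOO=1", 5);  /* an older macro */
    obstack_copy0 (&ob, "BAR=2", 5);              /* a newer macro */

    /* There is no way to release FOO alone; this call releases FOO
       *and* BAR (and anything else allocated after FOO).  */
    obstack_free (&ob, foo);
    return 0;
  }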

Ah, but then you still have to worry about not defining macros that
haven't been defined yet at the current source location.  What a
nasty problem.

> 2.  Make the output macro definitions fully expanded in the debug info.
>     Are they? I never bothered to look.  What I mean by fully expanded
>     is that they don't depend on other macros.  If so, then we only need
>     the define when we hit a macro to expand, right?
>     This is only one lookup, and can be done in the parser.

The concept of a generalised "fully expanded macro" is meaningless.
Since macros can be defined and undefined anywhere, the expansion of
any given macro can vary with source line.
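
For example, the "full expansion" of SIZE below differs depending on
which line you ask at:

  #define SIZE 10
  int a[SIZE];        /* SIZE expands to 10 here */
  #undef SIZE
  #define SIZE 20
  int b[SIZE];        /* the same name expands to 20 here */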

Also, they can take arguments, which would seem to defeat this idea
too.  Don't you love the C macro preprocessor?  It must have set back
C programmers' tools by at least a decade.
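
To be concrete about the argument problem: a function-like macro has
no expansion at all until it's invoked, so there is nothing "fully
expanded" to record for it:

  #define MAX(a, b) ((a) > (b) ? (a) : (b))
  int x = MAX (p, q);   /* the expansion depends on the arguments */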

> Crud.
> I also noticed that calling cpp_destroy causes a crash, and that
> nothing really uses it.

Really?  Yes, I was going to turn that back on in stand-alone CPP, to
stop bit rot - but it looks like the rot has already set in.  <g>
The only reason nothing uses it is that all current invokers of
cpplib are processes that end after translating a file, and we didn't
want to carefully tear down internal data structures only to have the
O/S unmap all the process's pages anyway.

It worked about 3 months ago, so whatever is wrong should be a simple
fix.

> If I don't, does memory actually get freed at the right times?

Cpplib frees some odd bits when it's finished with them, but it
naturally has to keep a lot of stuff in memory, like the expansions of
each macro; such things are not freed unless and until you call
cpp_destroy ().  If that function is working, it should free
everything without exception - we want cpplib to be stably re-entrant,
which I think it is, apart from what you've reported with cpp_destroy
().
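
Roughly, the intended lifecycle is the one sketched below.  (A
sketch only - the exact creation interface has shifted between
cpplib revisions, so treat the signatures as approximate.)

  #include "cpplib.h"

  static void
  translate_one_file (void)
  {
    cpp_reader *pfile = cpp_create_reader (CLK_GNUC89);

    /* ... cpp_define (pfile, "DEBUG=1"); lex and translate ... */

    /* If cpp_destroy () is working, this frees everything cpplib
       allocated, leaving the process free to create a fresh
       reader.  */
    cpp_destroy (pfile);
  }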

> Well, we have enough macros that if I have to do constant insertion
> and removal, then it'll matter a whole lot.

Yes.

> This is because all macros get output, not just used ones, which is
> IMHO, a good thing anyway, since you may want debugging macros.

Yes.

Scrap this.  I've got a better idea.  You give cpplib some text you
want macro-expanded.  You don't give cpplib any macro definitions up
front; instead you give it a callback through which it enquires
whether a macro is defined at that location in the source file.  GDB
does its stuff to figure that out, which I imagine is pretty easy and
fast.

If it is, your callback routine calls cpp_define () with the macro's
definition and returns to cpplib, which carries on trying to expand
the expression.
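
The shape of it might be something like this.  (The hook itself and
the GDB-side lookup are hypothetical, invented here to illustrate
the idea; cpp_define () is the only real cpplib entry point used.)

  /* Hypothetical callback: cpplib would invoke this on meeting an
     identifier it has no definition for.  A nonzero return means
     "I've defined it now; retry the lookup".  */
  static int
  gdb_maybe_define_macro (cpp_reader *pfile, const char *name,
                          void *data)
  {
    /* Hypothetical GDB state: the file and line being debugged.  */
    struct macro_scope *scope = data;

    /* gdb_lookup_macro () is hypothetical: find the definition of
       NAME in effect at SCOPE, from the debug info.  */
    const char *def = gdb_lookup_macro (scope, name);
    if (def == NULL)
      return 0;                /* not defined at this location */

    cpp_define (pfile, def);   /* e.g. "FOO=bar" or "MIN(a,b)=..." */
    return 1;
  }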

This way we can do lazy insertion of macros, be very efficient about
it, and free them all once the one expansion is done.  Yeah!
However, this callback check would sit in cpplib's fast path, I
think; we'd need to be careful not to penalize GCC to a noticeable
extent just to benefit GDB.

Neil.


