This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.

Re: [pph] Save pending and specialized templates (issue4814054)


On 7/26/11, Gabriel Charette <gchare@google.com> wrote:
>> +/* Load a tinst_level list.  */
>> +
>> +static struct tinst_level *
>> +pph_in_tinst_level (pph_stream *stream)
>> +{
>> +  struct tinst_level *last = NULL;
>> +  unsigned count = pph_in_uint (stream);
>> +  /* FIXME pph: This leaves the list in reverse order.  Issue?  */
>> +  for (; count > 0; --count)
>> +    {
>> +      struct tinst_level *cur = ggc_alloc_tinst_level ();
>> +      cur->next = last;
>
> -      cur->next = last;
> +     if(last)
> +       last->next = cur;
>
>> +      cur->decl = pph_in_tree (stream);
>> +      cur->locus = pph_in_location (stream);
>> +      cur->errors = pph_in_uint (stream);
>> +      cur->in_system_header_p = pph_in_uint (stream);
>> +      last = cur;
>> +    }
>
> if (last)
>   last->next = NULL;
>
>> +  return last;
>> +}
>> +
>
> If you do the changes above, the list won't be in reverse order.

That code loses the pointer to the head of the list.  However, I take
your point and will make the FIXME comment unnecessary by building the
list in forward order.
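Something like this minimal sketch is what I have in mind, keeping both
a head and a tail pointer (a sketch, not the final patch):

/* Load a tinst_level list in forward order.  */

static struct tinst_level *
pph_in_tinst_level (pph_stream *stream)
{
  struct tinst_level *first = NULL, *last = NULL;
  unsigned count = pph_in_uint (stream);
  for (; count > 0; --count)
    {
      struct tinst_level *cur = ggc_alloc_tinst_level ();
      cur->decl = pph_in_tree (stream);
      cur->locus = pph_in_location (stream);
      cur->errors = pph_in_uint (stream);
      cur->in_system_header_p = pph_in_uint (stream);
      cur->next = NULL;
      /* Append at the tail so the list comes out in stream order.  */
      if (last)
        last->next = cur;
      else
        first = cur;
      last = cur;
    }
  return first;
}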

> Also, if you do it as I did in pph_in_cxx_binding, using the cache
> markers, you don't need to output the count; you can simply use the
> cache mechanism, which knows how to output/input NULL (I used that in
> pph_in/out_cxx_binding).

I think I like the count better in this instance.

>> +/* Load and merge a spec_entry TABLE from STREAM.  */
>> +
>> +static void
>> +pph_in_spec_entry_htab (pph_stream *stream, htab_t *table)
>> +{
>> +  spec_entry **slot = NULL;
>
> This variable is shadowed by....
>
>> +  unsigned count = pph_in_uint (stream);
>> +  if (flag_pph_debug >= 2)
>> +    fprintf (stderr, "loading %d spec_entries\n", count );
>> +  for (; count > 0; --count)
>> +    {
>> +      hashval_t hash;
>> +      spec_entry **slot;
>
> ...this one??
>
>> +      struct spec_entry *se = ggc_alloc_spec_entry ();
>> +      se->tmpl = pph_in_tree (stream);
>> +      se->args = pph_in_tree (stream);
>> +      se->spec = pph_in_tree (stream);
>> +      hash = hash_specialization (se);
>> +      slot = htab_find_slot_with_hash (*table, se, hash, INSERT);
>> +      *slot = se;
>> +    }
>> +}

I must have removed that earlier.
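In any case, the outer declaration should go, leaving only the
loop-local SLOT; roughly (a sketch, not the committed change):

static void
pph_in_spec_entry_htab (pph_stream *stream, htab_t *table)
{
  unsigned count = pph_in_uint (stream);
  if (flag_pph_debug >= 2)
    fprintf (stderr, "loading %u spec_entries\n", count);
  for (; count > 0; --count)
    {
      hashval_t hash;
      spec_entry **slot;  /* Now the only declaration of SLOT.  */
      struct spec_entry *se = ggc_alloc_spec_entry ();
      se->tmpl = pph_in_tree (stream);
      se->args = pph_in_tree (stream);
      se->spec = pph_in_tree (stream);
      hash = hash_specialization (se);
      slot = (spec_entry **)
        htab_find_slot_with_hash (*table, se, hash, INSERT);
      *slot = se;
    }
}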

>> +/* Read and return a location_t from STREAM.  */
>> +static inline location_t
>> +pph_in_location (pph_stream *stream)
>> +{
>> +  location_t loc = lto_input_location (stream->ib, stream->data_in);
>> +  if (flag_pph_tracer >= 4)
>> +    pph_trace_location (stream, loc);
>> +  return loc;
>> +}
>> +
>
> I was actually going to create pph_in/out_location.  I will need
> to make them a streamer hook so that all calls to
> lto_input/output_location are actually redirected to them whenever
> LTO wants to stream a source_location.  Of course, it would no
> longer call lto_input/output_location; rather, it would stream the
> source_location directly and apply the calculated offset (discussed
> in "Linemap and pph") on the way in.  It's great that you
> implemented a tracer for this, as it will make it easy for me to
> debug my implementation, thanks!
>
> Is there anything your implementation depends on that I should be
> careful about?  From what I can see, it doesn't look like it (as long
> as the source_locations you get back from pph_in_location are correct).

No particular dependences.  The templates needed locations, but those
were not in trees.  They aren't doing anything special with them, as
far as I saw.
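
For what it's worth, here is a rough sketch of what that direct
pph_in_location could look like, assuming a hypothetical pph_loc_offset
computed when the line table is merged (per the "Linemap and pph"
discussion); the real hook mechanism may well differ:

static inline location_t
pph_in_location (pph_stream *stream)
{
  /* Read the raw source_location written by pph_out_location and shift
     it by the line-table offset.  PPH_LOC_OFFSET is a placeholder for
     whatever the linemap merging ends up providing.  */
  location_t loc = (location_t) pph_in_uint (stream) + pph_loc_offset;
  if (flag_pph_tracer >= 4)
    pph_trace_location (stream, loc);
  return loc;
}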

>> +extern void lto_output_location (struct output_block *ob, location_t loc);
>
> lto_input/output will again no longer need to be extern when I make
> my patch for pph to handle those differently.  I will make sure
> (remind me if I don't) to remove this change.

Okay.

-- 
Lawrence Crowl

