This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.



Re: optimization problem in flow



  In message <200002232004.OAA20040@d0sgibnl1.fnal.gov> you write:
  > 
  > hi -
  > 
  > I just ran into a problem with the gcc optimizer producing incorrect
  > code.  This is with the cvs version of gcc (2.96 20000222 (experimental)),
  > on an i686-pc-linux-gnu platform (RH 6.1).
  > 
  > The following input shows the problem when compiled with the
  > C++ compiler at -O2 (it is distilled from code in libstdc++ v3).
  > [NB: in this version of the compiler, there was another, trivial,
  > problem in init_output_buffer that caused an ICE.  But that's already
  > been fixed in cvs.]
[ ... ]

  > So the store gets erroneously deleted in the second flow pass.
  > I tried stepping through the code in flow.c to see what was going
  > on.  What I found was that when propagate_block() calls mark_used_regs()
  > for the `minus' insn, the stack slot -16(ebp) is not removed
  > from mem_set_list.  To test whether the slot should be removed,
  > mark_used_regs() calls anti_dependence() with two identical
  > arguments (the mem expression from the minus rtx).  The reason,
  > in turn, that anti_dependence() returns false is that the rtx has
  > the unchanging flag set.
Thanks for the great bug report.  My first question is why RTX_UNCHANGING_P
is set for the MEM in the first place.  That seems clearly wrong to me and
is probably the real bug.
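
To restate the mechanism from your trace, here is a minimal, self-contained
model of the interaction (the names and types below are illustrative
stand-ins, not the real flow.c/alias.c code):

#include <stdio.h>

/* Stand-in for a MEM rtx: an address plus the unchanging bit.  */
struct mem
{
  long addr;        /* e.g. the stack slot at -16(ebp) */
  int unchanging;   /* models RTX_UNCHANGING_P */
};

/* Stand-in for mem_set_list: MEMs stored by insns already scanned
   (flow walks the block backwards, so these are the *later* stores).  */
struct mem_set_entry
{
  struct mem *m;
  struct mem_set_entry *next;
};

/* Models anti_dependence: can a write to X conflict with a read of MEM?
   The short-circuit on the unchanging bit is the point at issue: it
   makes two references to the very same slot look independent.  */
static int
anti_dependence (struct mem *mem, struct mem *x)
{
  if (mem->unchanging)
    return 0;                    /* "a constant is never overwritten" */
  return mem->addr == x->addr;   /* crude overlap test */
}

/* Models the pruning done for a read of X: any recorded store the read
   might see must leave mem_set_list, otherwise flow later deletes the
   earlier store as dead.  */
static struct mem_set_entry *
prune_mem_set_list (struct mem_set_entry *list, struct mem *x)
{
  struct mem_set_entry **link = &list;

  while (*link)
    {
      if (anti_dependence ((*link)->m, x))
        *link = (*link)->next;   /* conflict: splice the entry out */
      else
        link = &(*link)->next;
    }
  return list;
}

int
main (void)
{
  struct mem slot = { -16, 1 };                 /* wrongly marked unchanging */
  struct mem_set_entry entry = { &slot, NULL }; /* from insn 29's store */
  struct mem_set_entry *list = &entry;

  /* The `minus' insn reads the very same slot ...  */
  list = prune_mem_set_list (list, &slot);

  /* ... yet the slot is still on mem_set_list, so the store at insn 25
     still looks dead and gets deleted.  */
  printf ("slot still on mem_set_list: %s\n", list ? "yes" : "no");
  return 0;
}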

When RTX_UNCHANGING_P is set we have a logical constant -- in the simplest
terms, we may initialize the value once, but never change it after that
point.
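
The legitimate case looks something like this (a hedged sketch -- exactly
when the front end sets the bit is the detail that needs checking):

/* RTX_UNCHANGING_P is meant for memory like this: written once at
   initialization, read-only ever after.  */
const int tbl[2] = { 1, 2 };

int
lookup (int i)
{
  /* A load from `tbl' may be marked unchanging: no store in a valid
     program can conflict with it, so treating the read as independent
     of all stores is safe.  */
  return tbl[i];
}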

In your RTL the contents of the MEM are changed twice (insn 25 and insn 29),
which indicates to me that RTX_UNCHANGING_P should not be set.

I think you should track down why RTX_UNCHANGING_P is set.
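
One way to do that (the file and line number below are placeholders -- use
whatever a grep of your tree actually turns up, and the variable holding the
MEM at each site is an assumption):

$ grep -n 'RTX_UNCHANGING_P (.*) = 1' gcc/*.c
...                              # a handful of candidate assignment sites

$ gdb cc1plus
(gdb) break expr.c:NNNN          # one candidate site from the grep
(gdb) run bug.ii -O2 -quiet
(gdb) call debug_rtx (x)         # is this our -16(ebp) stack slot?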

jeff

