This is the mail archive of the mailing list for the GCC project.


Re: [PATCH, i386]: Do not emit "cld" instructions

> Based on all comments and suggestions, I'd propose that we keep all
> cld functionality, but emit empty asm instead of "cld". If gcc ever
> emits std instruction, we could re-enable cld and perhaps emit it via
> optimize mode switching.

I would think in favour of simply dropping the code - if it ever becomes
important we can always resurrect your mode switching pass.  I would not
expect inlining memmove via std to become important, just because the
hardware probably won't be very well optimized for this and there are
better ways to inline it.
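For context, an std-based memmove expansion would look roughly like the
sketch below (hypothetical illustration, x86-64 GCC inline asm only;
`copy_backward` is not a GCC-internal name): the direction flag is set so
`rep movsb` copies descending, which handles overlapping forward shifts,
and the ABI requires DF to be clear again afterwards, hence the trailing
cld.

```c
#include <assert.h>
#include <string.h>

/* Copy n bytes backwards: dst and src point at the LAST byte of each
   region.  std makes rep movsb decrement rdi/rsi; cld restores the
   ABI-mandated DF=0 before returning. */
static void copy_backward(char *dst, const char *src, unsigned long n)
{
    __asm__ volatile ("std\n\t"
                      "rep movsb\n\t"
                      "cld"
                      : "+D" (dst), "+S" (src), "+c" (n)
                      : : "memory", "cc");
}
```

For example, shifting "abcdef" right by two inside one buffer with
`copy_backward(buf + 7, buf + 5, 6)` works precisely because the copy
runs backwards; a forward `rep movsb` would clobber unread source bytes.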

Or alternatively, just disable the code emitting the CLD.  I think that is
better than outputting empty ASM and confusing the scheduler.

But I am happy with either way (disabling codegen/emitting empty asm).
> Unfortunately, for some reason clds were not eliminated from loops...

Yes, the CFG branch was once capable of PREing this, but it never got
merged.
> >> > Finally, according to pentium optimization guide by Agner Fog, std and
> >> > cld have astonishing latency of 48 and 52 clks (I still hope for the
> >> > possibility that there is some kind of error).
> >
> >This is pretty weird.  Does this apply to original Pentium?
> According to the guide, it applies to pentium4.

This is pretty high.  Would it be possible for you to rerun the
test_stringops script on a P4 machine after removing the CLD?  If it
really is 48 cycles, it should show a difference in the preferred memcpy
algorithm.

Thanks for looking into this!
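Short of rerunning test_stringops, the quoted latency figures could be
spot-checked with a rough rdtsc micro-benchmark along these lines (a
hypothetical sketch, x86-64 GCC only; `avg_std_cld_cycles` is made up
here, and since rdtsc is not serializing the result is only indicative):

```c
#include <assert.h>
#include <x86intrin.h>

/* Average TSC ticks for one std/cld pair over many iterations.  If the
   48/52-cycle figures from Agner Fog's guide hold on a given machine,
   the average should come out somewhere near their sum plus loop
   overhead. */
static unsigned long long avg_std_cld_cycles(void)
{
    enum { ITERS = 100000 };
    unsigned long long start = __rdtsc();
    for (int i = 0; i < ITERS; i++)
        __asm__ volatile ("std\n\tcld" ::: "cc");
    return (__rdtsc() - start) / ITERS;
}
```

The pair is measured together so DF is guaranteed clear again when the
loop exits.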
> 2006-12-06  Uros Bizjak  <>
> 	* config/i386/ (cld): Emit empty asm string.
> Patch was bootstrapped on i686-pc-linux-gnu, regression tested for c,
> c++ and fortran.
> OK for mainline?
> Uros.
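For readers unfamiliar with the i386 machine description, the change
being proposed is of roughly this shape (a hypothetical sketch only - the
actual pattern, register name, and attributes in i386.md may differ):

```lisp
;; Hypothetical sketch of the cld pattern with an empty asm template.
(define_insn "cld"
  [(set (reg:SI DIRFLAG_REG) (const_int 0))]
  ""
  ""                            ; formerly "cld"; now emit nothing
  [(set_attr "length" "0")])
```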
