patch to remove temporary clobbers in LRA


While working on a problem where var-tracking started to spend more time after LRA was switched on, I found that LRA creates a significant number of clobbers for x86 code (I saw 20% of all insns). These clobbers are created for correct live-range analysis in LRA in cases when matching pseudos of *different modes* should end up in the same hard register.
As these clobbers are no longer needed once LRA has finished, the following patch removes them.
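For illustration only, here is a minimal sketch (not the code from the attached patch) of how such a clobber could be emitted and marked so it can be recognized and deleted later.  Only the macro name LRA_TEMP_CLOBBER_P comes from the ChangeLog below; its definition here, reusing the rtx `unchanging' bit mentioned in the rtl.h entry, is my assumption, and emit_temp_clobber is a hypothetical helper:

/* Sketch, not the patch itself: LRA_TEMP_CLOBBER_P marks a CLOBBER rtx
   as one that exists only to extend a live range during LRA.  The use
   of the `unchanging' bit is an assumption based on the ChangeLog.  */
#define LRA_TEMP_CLOBBER_P(x) \
  (RTL_FLAG_CHECK1 ("LRA_TEMP_CLOBBER_P", (x), CLOBBER)->unchanging)

/* Hypothetical helper: emit a clobber of REG and mark it as temporary
   so a later pass can delete it.  */
static void
emit_temp_clobber (rtx reg)
{
  rtx pat = gen_rtx_CLOBBER (VOIDmode, reg);

  LRA_TEMP_CLOBBER_P (pat) = 1;
  emit_insn (pat);
}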


Unfortunately, it does not solve the problem I mentioned.  With the patch, LRA generates approximately the same number of insns and debug_insns as reload.  Moreover, the two var-tracking functions where most of the time is spent (canonicalize_values_star and set_slot_part) have approximately the same coverage; in other words, the execution counts for each line are the same (differences < 1%).  Still, var-tracking after LRA spends about 20% more time than after reload.  I have no idea what more to investigate, but I'll continue my work on the problem next week.

The patch was successfully bootstrapped on x86/x86-64.

Committed as rev. 192897.

2012-10-28 Vladimir Makarov <vmakarov@redhat.com>

    * rtl.h (struct rtx_def): Add a comment for member unchanging.
    * lra-int.h (LRA_TEMP_CLOBBER_P): New macro.
    (lra_hard_reg_substitution): Rename to lra_final_code_change.
    * lra-constraints.c (match_reload): Mark temporary clobbers.
    * lra-spills.c (lra_hard_reg_substitution): Rename to
    lra_final_code_change.  Remove temporary clobbers.
    * lra.c (lra): Rename to lra_final_code_change.
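To make the lra-spills.c entry above concrete, here is a minimal sketch of the removal side, assuming a straightforward walk over the insn stream.  The real lra_final_code_change also performs the hard register substitution and updates LRA's internal data structures, which this sketch omits; remove_temp_clobbers is a hypothetical name:

/* Sketch only: delete the clobbers marked with LRA_TEMP_CLOBBER_P
   (see above) once LRA no longer needs them for live range analysis.  */
static void
remove_temp_clobbers (void)
{
  rtx insn, next;

  for (insn = get_insns (); insn != NULL_RTX; insn = next)
    {
      next = NEXT_INSN (insn);
      if (NONDEBUG_INSN_P (insn)
          && GET_CODE (PATTERN (insn)) == CLOBBER
          && LRA_TEMP_CLOBBER_P (PATTERN (insn)))
        delete_insn (insn);
    }
}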

Attachment: clobbers.patch
Description: Text document

