Re: [patch RFA] Handle blockage insn in mode-switching.c/create_pre_exit
Eric Botcazou <ebotcazou@libertysurf.fr> wrote:
> Exactly. While you're at it, could you further refactor the code?
I've attached a suggested patch. Thanks for your suggestions.
OK if it bootstraps and passes the regtest on x86 without problems?
Regards,
kaz
--
PR target/32664
* mode-switching.c (create_pre_exit): Skip barrier insns.
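
For context, an illustration of mine (not part of the patch and not the PR 32664 test case): the insns the new switch has to look through, while scanning backwards for the return-value copy, are barrier-style asms and target blockage insns. A minimal user-level sketch follows; the function name is made up, and how such a barrier actually shows up in RTL (ASM_INPUT for a basic asm, a bare ASM_OPERANDS for an extended asm with no outputs or clobbers, typically an UNSPEC_VOLATILE for a target's epilogue blockage) depends on the asm's form and on the target.

/* Illustration only: an empty volatile asm acting as a barrier just
   before the return value is set up.  An extended asm with no outputs
   and no clobbers is represented as a bare ASM_OPERANDS pattern whose
   volatile flag is tested with MEM_VOLATILE_P; a basic asm ("") is an
   ASM_INPUT; a target blockage insn is typically an UNSPEC_VOLATILE.
   The new switch skips all of these instead of treating them as
   candidates for the return-value copy.  */
int
barrier_before_return (int x)
{
  __asm__ __volatile__ ("" : : "r" (x));  /* barrier-style insn */
  return x;  /* return-value copy that create_pre_exit scans for */
}
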
--- ORIG/trunk/gcc/mode-switching.c 2007-06-12 09:34:43.000000000 +0900
+++ LOCAL/trunk/gcc/mode-switching.c 2007-07-09 17:21:13.000000000 +0900
@@ -246,21 +246,37 @@ create_pre_exit (int n_entities, int *en
             if (INSN_P (return_copy))
               {
-                if (GET_CODE (PATTERN (return_copy)) == USE
-                    && GET_CODE (XEXP (PATTERN (return_copy), 0)) == REG
-                    && (FUNCTION_VALUE_REGNO_P
-                        (REGNO (XEXP (PATTERN (return_copy), 0)))))
-                  {
-                    maybe_builtin_apply = 1;
-                    last_insn = return_copy;
-                    continue;
-                  }
-                if (GET_CODE (PATTERN (return_copy)) == ASM_INPUT
-                    && strcmp (XSTR (PATTERN (return_copy), 0), "") == 0)
+                return_copy_pat = PATTERN (return_copy);
+                switch (GET_CODE (return_copy_pat))
                   {
+                  case USE:
+                    /* Skip __builtin_apply pattern.  */
+                    if (GET_CODE (XEXP (return_copy_pat, 0)) == REG
+                        && (FUNCTION_VALUE_REGNO_P
+                            (REGNO (XEXP (return_copy_pat, 0)))))
+                      {
+                        maybe_builtin_apply = 1;
+                        last_insn = return_copy;
+                        continue;
+                      }
+                    break;
+
+                  case ASM_OPERANDS:
+                    /* Skip barrier insns.  */
+                    if (!MEM_VOLATILE_P (return_copy_pat))
+                      break;
+
+                    /* Fall through.  */
+
+                  case ASM_INPUT:
+                  case UNSPEC_VOLATILE:
                     last_insn = return_copy;
                     continue;
+
+                  default:
+                    break;
                   }
+
                 /* If the return register is not (in its entirety)
                    likely spilled, the return copy might be
                    partially or completely optimized away.  */