This is the mail archive of the gcc-patches@gcc.gnu.org mailing list for the GCC project.
[PATCH] Avoid *ORDERED_EXPRs in the IL if !HONOR_NANS (PR target/92449)
- From: Jakub Jelinek <jakub at redhat dot com>
- To: Richard Biener <rguenther at suse dot de>
- Cc: gcc-patches at gcc dot gnu dot org
- Date: Tue, 12 Nov 2019 01:06:55 +0100
- Subject: [PATCH] Avoid *ORDERED_EXPRs in the IL if !HONOR_NANS (PR target/92449)
- Reply-to: Jakub Jelinek <jakub at redhat dot com>
Hi!
The testcase from the PR ICEs on rs6000.  For __builtin_isunordered etc.
the folding makes sure there is no *ORDERED_EXPR etc. in the IL if
!HONOR_NANS, but the complex multiplication expansion can still emit one
when -Ofast is mixed with -fno-cx-limited-range.
The following patch makes sure we don't emit it even in that case.
Bootstrapped/regtested on x86_64-linux and i686-linux, ok for trunk?
I'll defer the addition of the testcase and rs6000 changes to Segher.
2019-11-11 Jakub Jelinek <jakub@redhat.com>
PR target/92449
* tree-complex.c (expand_complex_multiplication): If !HONOR_NANS,
don't emit UNORDERED_EXPR guarded libcall. Formatting fixes.
--- gcc/tree-complex.c.jj 2019-01-10 11:43:02.252577041 +0100
+++ gcc/tree-complex.c 2019-11-11 20:26:28.585315282 +0100
@@ -1144,6 +1144,16 @@ expand_complex_multiplication (gimple_st
 	  return;
 	}
+      if (!HONOR_NANS (inner_type))
+	{
+	  /* If we are not worrying about NaNs expand to
+	     (ar*br - ai*bi) + i(ar*bi + br*ai) directly.  */
+	  expand_complex_multiplication_components (gsi, inner_type,
+						    ar, ai, br, bi,
+						    &rr, &ri);
+	  break;
+	}
+
       /* Else, expand x = a * b into
 	 x = (ar*br - ai*bi) + i(ar*bi + br*ai);
 	 if (isunordered (__real__ x, __imag__ x))
@@ -1151,7 +1161,7 @@ expand_complex_multiplication (gimple_st
       tree tmpr, tmpi;
       expand_complex_multiplication_components (gsi, inner_type, ar, ai,
-                                                br, bi, &tmpr, &tmpi);
+						br, bi, &tmpr, &tmpi);
       gimple *check
 	= gimple_build_cond (UNORDERED_EXPR, tmpr, tmpi,
@@ -1167,13 +1177,12 @@ expand_complex_multiplication (gimple_st
 	= insert_cond_bb (gsi_bb (*gsi), gsi_stmt (*gsi), check,
 			  profile_probability::very_unlikely ());
-
       gimple_stmt_iterator cond_bb_gsi = gsi_last_bb (cond_bb);
       gsi_insert_after (&cond_bb_gsi, gimple_build_nop (), GSI_NEW_STMT);
       tree libcall_res
 	= expand_complex_libcall (&cond_bb_gsi, type, ar, ai, br,
-                                  bi, MULT_EXPR, false);
+				  bi, MULT_EXPR, false);
       tree cond_real = gimplify_build1 (&cond_bb_gsi, REALPART_EXPR,
 					inner_type, libcall_res);
       tree cond_imag = gimplify_build1 (&cond_bb_gsi, IMAGPART_EXPR,
@@ -1190,20 +1199,18 @@ expand_complex_multiplication (gimple_st
       edge orig_to_join = find_edge (orig_bb, join_bb);
       gphi *real_phi = create_phi_node (rr, gsi_bb (*gsi));
-      add_phi_arg (real_phi, cond_real, cond_to_join,
-                   UNKNOWN_LOCATION);
+      add_phi_arg (real_phi, cond_real, cond_to_join, UNKNOWN_LOCATION);
       add_phi_arg (real_phi, tmpr, orig_to_join, UNKNOWN_LOCATION);
       gphi *imag_phi = create_phi_node (ri, gsi_bb (*gsi));
-      add_phi_arg (imag_phi, cond_imag, cond_to_join,
-                   UNKNOWN_LOCATION);
+      add_phi_arg (imag_phi, cond_imag, cond_to_join, UNKNOWN_LOCATION);
       add_phi_arg (imag_phi, tmpi, orig_to_join, UNKNOWN_LOCATION);
     }
   else
     /* If we are not worrying about NaNs expand to
        (ar*br - ai*bi) + i(ar*bi + br*ai) directly.  */
     expand_complex_multiplication_components (gsi, inner_type, ar, ai,
-                                              br, bi, &rr, &ri);
+					      br, bi, &rr, &ri);
       break;
     default:
Jakub
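For readers following along, the expansion this patch short-circuits can be
modeled in plain C roughly as below.  This is a sketch of the logic, not the
GIMPLE the pass builds; __muldc3 is the libgcc routine for NaN-careful
_Complex double multiplication that the expand_complex_libcall path reaches.
With !HONOR_NANS the patch makes the pass go straight to the fast path with
no UNORDERED_EXPR guard.

```c
#include <complex.h>
#include <math.h>

/* libgcc's careful complex multiply; GCC links libgcc by default.  */
extern _Complex double __muldc3 (double, double, double, double);

/* Rough C model of what expand_complex_multiplication emits when NaNs
   are honored: take the naive product first, and only if it produced
   unordered (NaN) parts branch to the recovery libcall.  */
_Complex double
expanded_mul (_Complex double a, _Complex double b)
{
  double ar = creal (a), ai = cimag (a);
  double br = creal (b), bi = cimag (b);
  double rr = ar * br - ai * bi;
  double ri = ar * bi + br * ai;
  if (isunordered (rr, ri))	/* the UNORDERED_EXPR guard */
    return __muldc3 (ar, ai, br, bi);
  return CMPLX (rr, ri);
}
```

The finite case never takes the libcall; operands involving infinities or NaNs
fall into the recovery branch, which is why the guard must not be emitted at
all once !HONOR_NANS says such inputs cannot occur.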