This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.


[Bug c/84873] [6/7/8 Regression] ICE: verify_ssa failed (error: definition in block 3 does not dominate use in block 4)


https://gcc.gnu.org/bugzilla/show_bug.cgi?id=84873

--- Comment #7 from rguenther at suse dot de <rguenther at suse dot de> ---
On Thu, 15 Mar 2018, jakub at gcc dot gnu.org wrote:

> https://gcc.gnu.org/bugzilla/show_bug.cgi?id=84873
> 
> --- Comment #6 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
> Why the
> +           if (TREE_CODE (*op1_p) == INTEGER_CST)
> +             *op1_p = fold_convert (unsigned_type_node, *op1_p);
> +           else
> +             *op1_p = build1 (NOP_EXPR, unsigned_type_node, *op1_p);
> ?  Just change the convert to fold_convert...  I hope the FE ensures that the
> shift count has a sane type (some integral one).

Because it's fold_binary_op_with_conditional_arg that introduces
the tree sharing.  So if, say, *op1_p was (long) (x == 0) + (long) y,
then fold_convert would fold away the conversion and re-fold the
PLUS, which may then reach fold_binary_op_with_conditional_arg again.
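For illustration, a reduced source-level shape of such a shift count
(hypothetical; not the PR's actual testcase) would be something like:

  /* The shift count is a wider expression wrapping a comparison;
     stripping the outer conversion with fold_convert and re-folding
     the PLUS can run fold_binary_op_with_conditional_arg again.  */
  extern int x;
  extern long y;

  unsigned int
  f (unsigned int v)
  {
    return v << ((long) (x == 0) + (long) y);
  }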

I might have used

  tree tem = fold_unary_to_constant (...);
  if (!tem)
    tem = build1 (...);
  *op1_p = tem;

but fold_unary_to_constant uses TREE_CONSTANT as well ...
(though it looks like a COND_EXPR will never be TREE_CONSTANT, at least).
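Spelled out with the arguments the patch hunk above uses (NOP_EXPR to
unsigned_type_node; the exact call is my assumption, the elided
arguments above left aside), that alternative would look like:

  /* Sketch only: fold_unary_to_constant returns NULL_TREE when the
     operand doesn't fold to a constant, so fall back to building a
     fresh NOP_EXPR instead of re-folding the whole expression.  */
  tree tem = fold_unary_to_constant (NOP_EXPR, unsigned_type_node, *op1_p);
  if (!tem)
    tem = build1 (NOP_EXPR, unsigned_type_node, *op1_p);
  *op1_p = tem;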
