This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.
[Bug tree-optimization/78327] Improve VRP for ranges of signed integers in the range [-TYPE_MAX + N, N]
- From: "msebor at gcc dot gnu.org" <gcc-bugzilla at gcc dot gnu dot org>
- To: gcc-bugs at gcc dot gnu dot org
- Date: Sat, 12 Nov 2016 02:01:07 +0000
- Subject: [Bug tree-optimization/78327] Improve VRP for ranges of signed integers in the range [-TYPE_MAX + N, N]
- Auto-submitted: auto-generated
- References: <bug-78327-4@http.gcc.gnu.org/bugzilla/>
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78327
Martin Sebor <msebor at gcc dot gnu.org> changed:
           What            |Removed                    |Added
----------------------------------------------------------------------------
        Summary            |Improve VRP for ranges of  |Improve VRP for ranges of
                           |signed char                |signed integers in the
                           |                           |range [-TYPE_MAX + N, N]
--- Comment #3 from Martin Sebor <msebor at gcc dot gnu.org> ---
The problem described here isn't specific to signed char but affects all signed
types. Let's keep that in the subject so as not to suggest otherwise. I think
mentioning the [-TYPE_MAX + N, N] range is also useful because the problem
doesn't seem to be triggered by values in other ranges.
The relationship to anti-ranges appears only in the subject of the email thread I
referenced in comment #0. Nothing in this bug or in that thread should imply
that anti-ranges are the root cause of this problem.