This is the mail archive of the
gcc-patches@gcc.gnu.org
mailing list for the GCC project.
Re: Your sizetype changes ...
- To: mark at codesourcery dot com
- Subject: Re: Your sizetype changes ...
- From: kenner at vlsi1 dot ultra dot nyu dot edu (Richard Kenner)
- Date: Wed, 1 Mar 00 12:56:54 EST
- Cc: gcc-patches at gcc dot gnu dot org
Didn't I add code to the size comparison code to handle that? Or did I
just do it in one direction and not the other?
The assertions are failing *because* things are comparing equal, not
because they're not.
I'm sorry, I don't follow.
No, you can't. This C++ code:

    unsigned int x;
    size_t s;
    x = UINT_MAX;
    s = (x * 40) / 20;

is exactly equivalent to this C++ code, in all respects:

    unsigned int x;
    unsigned int s;
    x = UINT_MAX;
    s = (x * 40) / 20;

assuming that `size_t' is `unsigned int'. There are no legal
additional optimizations -- whatever you can conclude about overflow
in the second case, you can also conclude in the first.
Right. I think I clarified this in my last message: the issue is size
computations *generated by the compiler*, not user operations on
language-level sizetypes.
They are considered equivalent. So, when we ask to convert an
expression with type `unsigned int' to `size_t', we don't do anything
-- we just leave the expression alone. So, then, when we pass it to
size_binop, it aborts.
Ah, I see. So the test case you sent me blows up?
No, I was not concerned about misuses of sizetypes. I was concerned
about misengineering of GCC. I was concerned about adding fields to
all structures such that a) those fields were related by an algebraic
identity but b) could be updated independently, leading to an
inconsistent state. That's got very little to do with whether or not
size_binop gets arguments of type size_t, or some equivalent integer
type.
We went through that: sizes are not *updated*, they are *set once*.
It makes no sense to change the size of a type or decl. But it's a
requirement in GCC that tree nodes be type-correct. You can often get
away with type-incorrect nodes, but enforcing type consistency makes
the compiler more reliable, and this is a step in that direction.