This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.



[Bug c++/45265] GCC has an intermittent bug when computing the address of function parameters



------- Comment #29 from rogerio at rilhas dot com  2010-08-12 18:24 -------
(In reply to comment #27)
> Oh, this is fun.  Enjoyable, really! ;-)

Again I couldn't resist! Every time I'm ready to go away you say something
shocking that I simply can't resist. It's time for me to admit I have a problem!
:-)

> So, you admit that MSVC does in fact "miscompile" your perfectly fine cdecl
> code, if you request optimization from it?

Yes.

>  How bad is that of them?

Not perfect, but still better than GCC, because at least I can get it to work.

> Terrible!

... noooo... that's just your jealousy talking because you can't even begin to
understand how to make a temporary variable an l-value... how can someone defend
a compiler that never conforms (GCC) over one that conforms as long as I don't
request optimization? That's just bad logic. It's not my intention to insult
you, but the observation itself lacks any underlying logic.

>  I would consider creating a bug report with them, because if they
> miscompile your code with optimizations it must surely be their bug.

No, optimizations take away room for assumptions. That's why GCC can optimize
for(i=0; i<strlen(sp); i++). What??? GCC didn't call strlen() every time? How
stupid! No, you are just lacking logic. Drink something with vitamins and get
out more, it will do you good.
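
To make that concrete, here is a minimal sketch (the function names are mine,
purely for illustration): when the loop body provably does not modify the
string, the compiler is allowed to evaluate strlen() once instead of on every
iteration.

#include <string.h>
#include <stddef.h>

/* The loop as written in the source. */
size_t count_spaces(const char *sp)
{
    size_t i, n = 0;
    for (i = 0; i < strlen(sp); i++)
        if (sp[i] == ' ')
            n++;
    return n;
}

/* Roughly the form the optimizer may turn it into. */
size_t count_spaces_hoisted(const char *sp)
{
    size_t i, n = 0;
    size_t len = strlen(sp);    /* length computed only once */
    for (i = 0; i < len; i++)
        if (sp[i] == ' ')
            n++;
    return n;
}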

>  After
> all optimization is a process of transforming a valid program into another
> program that behaves exactly the same, hence if they optimize your valid
> program into a crasher, what else could it be than a bug in their compiler?

Read my strlen() example. That shows you are wrong. You "invented" that
definition; you can't really back it up, otherwise strlen() would have to be
called every time.

> I mean, really.  They are supposed to provide a commercial grade compiler.
> How can it be that they force you to deactivate optimization options
> (and hence live with slow runtime) just so that your valid cdecl program
> doesn't crash?

Yup, money can only buy so much. No money can buy you a little bit less.

> One side remark about your p2-p1 claim:
> > char* p1=random_address();
> > char* p2=another_random_address();
> > 
> > Any compiler that does not predictably compute p2-p1 is a piece of crap. You
> > can twist C99 all you want, but whenever p2-p1 is left to some undefined
> > criteria of the compiler then it is just an absolute piece of crap. Period.
> You obviously never used segmented platforms (old DOS was such a thing,
> but there are others more recent, e.g. Cell with PowerPC is similar in this
> respect).

Yes, I did work with those platforms. Remember the "far" qualifiers? Remember
what they were for? They were invented to make you, once again, write something
wrong.

However, as parameters to functions, they were always in the same segment, so
the subtraction was always valid. C99 cannot back this up, though; it was just
the way things were made back then. Maybe GCC inherited too much from those
days.
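
To put the p2-p1 construct in concrete terms, a minimal sketch (the buffer and
function name are mine, purely illustrative); C99 6.5.6 spells out the case
where both pointers point into the same array object:

#include <stddef.h>

static char buffer[128];

/* Both pointers point into the same array object, so this
   subtraction is the case C99 6.5.6 explicitly defines. */
ptrdiff_t same_object_diff(void)
{
    char *p1 = &buffer[10];
    char *p2 = &buffer[90];
    return p2 - p1;    /* 80 */
}

/* The disputed case is when p1 and p2 come from unrelated
   objects, as in the random_address() example quoted above. */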


>  On those it was valid only to subtract pointers from each other
> when they pointed into the same segment.

Not really, you could always subtract. However, far pointers gave predictable
addresses, just like C99 says pointer arithmetic should. Go and read C99
about the "far" qualifier so that you can see why it was not smart of you to
talk about DOS.

Still, on every segmented platform, the subtraction of the addresses of
parameters is always valid, as parameters will all be placed on the same stack.
And if some parameters had "far" qualifiers and others did not, then the
compilers would warn you about it so that you could requalify them.

Don't talk about what you don't know; you clearly know much less about the old
days than I do. Stick to C99.

>  Because the pointer difference type
> was a 16 bit type, whereas the pointers could address 1MB of memory (hence 
> effectively 20 bit).  If you do the math you'll see that it's impossible
> to map all 2^20 possible differences between pointers (unsigned two's-complement
> 20-bit arithmetic, otherwise 2^21 differences) into just 16 bit.  So yes,
> on those platforms it really was impossible to subtract two arbitrary
> pointers.

No. Pointers of the same type, with the same qualifiers, were always
subtractable. Don't invent; it will just make you wrong. The addresses of
parameters on the stack would always be near (16-bit), so subtraction would
always be well-defined.
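
For concreteness, the figures being discussed work out as follows (a sketch
with the numbers hard-coded, not probed from the current platform):

#include <stdio.h>

int main(void)
{
    /* Real-mode 8086 figures from the quote above, hard-coded. */
    long address_space = 1L << 20;   /* 20-bit physical addresses: 1 MiB */
    long diff_min = -32768L;         /* range of a 16-bit signed         */
    long diff_max =  32767L;         /* pointer-difference type          */

    printf("addressable bytes       : %ld\n", address_space);
    printf("representable difference: %ld .. %ld\n", diff_min, diff_max);

    /* A difference of, say, 100000 bytes cannot be represented in
       16 bits, which is the mismatch the quote is pointing at. */
    return 0;
}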

> C (the language) reflects such constraints.

Not really, no. Or can you back up your claims with an old standard applicable
30 or 40 years ago? I'm sure you can't. I don't even know if cdecl was well
defined back then, do you?

> With complete trust in your incapability to grok these concepts, but hats
> off to a capable troll,
> Michael.

It is amazing how fast you take yourself out and crash head-on into a wall.
And are proud of it. But you are right, this is fun. Keep on sending your
errors, inventions, inconsistencies, and mistakes, and I'll keep on correcting
them. Then you deflect and try to pretend that you said smart things and send
me a bunch of errors again. We could make a really fun game out of it.


-- 


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=45265

