difference in calculation result when using gcc vs Visual Studio and optimisation flag

Mason slash.tmp@free.fr
Thu Apr 26 10:53:00 GMT 2018


On 26/04/2018 11:26, Jack Andrews wrote:

> This reduces it to a one-platform problem.  gcc -O different to gcc
> Maybe I'm being idealistic, but why should optimization change results?

Because, for example, on x86 platforms 'gcc -O0' may keep intermediate
results in the 80-bit x87 stack registers, while 'gcc -O2' may use the
64-bit SSE registers, so the same expression is rounded differently.
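
To make that concrete, here is a small example of mine (not from the
original thread) where the choice of register file visibly changes the
answer; which result you get depends on the target and flags:

    /* Sketch: with x87 code generation the intermediate x * 10.0 may be
       held in an 80-bit register, survive the overflow, and y prints as
       1e+308; with 64-bit SSE arithmetic the intermediate overflows and
       y prints as inf.  Try e.g. -m32 -O2 versus -m64 -O2. */
    #include <stdio.h>

    int main(void)
    {
        volatile double x = 1.0e308;   /* volatile: keep the compiler from
                                          folding the whole expression */
        double y = x * 10.0 / 10.0;
        printf("y = %g\n", y);
        return 0;
    }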

It is better to give up the notion that floating-point computations
are exact, and to accept that small rounding errors do change the
results on different implementations (and, as pointed out, even on
the same implementation with different options).
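
For instance, rounding already makes plain addition non-associative,
which is one reason transformations that reorder operations (e.g.
under -ffast-math) change results.  A minimal sketch of mine, not
from the original thread:

    /* The two sums are mathematically equal, but each addition rounds,
       so the results differ in the last bits; any reordering of the
       additions therefore changes the printed value. */
    #include <stdio.h>

    int main(void)
    {
        double a = (0.1 + 0.2) + 0.3;
        double b = 0.1 + (0.2 + 0.3);
        printf("a = %.17g\nb = %.17g\n", a, b);
        return 0;
    }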

It is also worth pointing out that sometimes these small errors
accumulate into huge errors. Floating point is tricky.

cf. https://en.wikipedia.org/wiki/Loss_of_significance
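
A tiny illustration (again just a sketch): subtracting two nearly
equal values wipes out almost all significant digits, while an
algebraically equivalent formulation keeps them:

    /* Catastrophic cancellation: for small x, cos(x) is so close to 1
       that 1.0 - cos(x) loses essentially every significant digit,
       while the equivalent 2*sin(x/2)^2 is accurate.  Link with -lm. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double x = 1.0e-8;
        double naive  = 1.0 - cos(x);
        double stable = 2.0 * sin(x / 2.0) * sin(x / 2.0);
        printf("naive  = %.17g\nstable = %.17g\n", naive, stable);
        return 0;
    }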

Regards.


