This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.
Re: What is acceptable for -ffast-math? (Was: associative law in combine)
- To: Linus Torvalds <torvalds at transmeta dot com>
- Subject: Re: What is acceptable for -ffast-math? (Was: associative law in combine)
- From: Gabriel Dos Reis <gdr at codesourcery dot com>
- Date: 30 Jul 2001 20:53:37 +0200
- Cc: Gabriel Dos Reis <gdr at codesourcery dot com>, Fergus Henderson <fjh at cs dot mu dot oz dot au>, <moshier at moshier dot ne dot mediaone dot net>, <tprince at computer dot org>, <gcc at gcc dot gnu dot org>
- Organization: CodeSourcery, LLC
- References: <Pine.LNX.email@example.com>
Linus Torvalds <firstname.lastname@example.org> writes:
| On 30 Jul 2001, Gabriel Dos Reis wrote:
| > Fergus Henderson <email@example.com> writes:
| > |
| > | I think it would be nice if GCC was standards compliant by default,
| > | but that is certainly not the status quo.
| > Agreed. But that should be the ideal we should strive for.
| > And also, I don't think that is an argument for not computing
| > correctly, even if we're asked to compute fast.
| Note that "a/b/c" _is_ the same as "a/(b*c)" when seen as a mathematical
| expression.
Abstract Algebra is *different* from Computing. That is lesson 0
from real-world numerical computations.
| Who are you to say that the user wouldn't like the code to be run faster?
Who are you to say that it is more important to compute wrong and fast
than to compute correctly?
If correctness is not an issue then the fastest way is to avoid the
computation entirely or return 0.0 unconditionally.
| Note that with the current -ffast-math on x86, for example, gcc will
| already inline things like "sin()" etc using the x87 instruction directly.
I don't buy the argument "that is broken, therefore we should break
this as well."
If GCC behaves incorrectly in some cases, the appropriate action to
take is to make it behave correctly, -not- to break other parts.
| What planet are you people from?
A planet where we do numerical computation for living.
As von Neumann once said:
"Numerical computation is too serious to be left to the computer."