This is the mail archive of the gcc@gcc.gnu.org mailing list for the GCC project.
Re: denormals/subnormals are heading for extinction
- To: <dewar at gnat dot com>
- Subject: Re: denormals/subnormals are heading for extinction
- From: Scott A Crosby <crosby at qwes dot math dot cmu dot edu>
- Date: Thu, 16 Aug 2001 16:22:25 -0400 (EDT)
- cc: <gcc at gcc dot gnu dot org>, <trt at cs dot duke dot edu>
On Thu, 16 Aug 2001 email@example.com wrote:
> That's excessive rhetoric. Indeed abysmal, like very is one of those words
> that people use when they do not have relevant quantitative data. I have
I would like to add that even if denormals are usually encountered
infrequently enough that any inefficiencies in handling them are minor,
there is still a big cost: implementation complexity. Transistors have to
be added, and wasted, to deal with them in the first place.
So at that level, denormals are a symptom of featuritis.
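(For readers unfamiliar with the behavior being debated: the sketch below, not
part of the original mail, shows gradual underflow on an IEEE 754 double, using
Python's `sys.float_info` for the platform's smallest normal value.)

```python
import sys

# Smallest positive *normal* double on an IEEE 754 platform (2**-1022).
min_normal = sys.float_info.min

# Without denormal support, dividing min_normal by 2 would flush to zero.
# With gradual underflow, the result is a representable subnormal value:
half = min_normal / 2
print(half > 0.0)          # True: the value survives as a denormal
print(half < min_normal)   # True: it lies below the normal range
```

Handling values in this subnormal range is exactly where hardware either spends
the extra transistors or traps to a (slow) software path.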
> not seen any application on the ia32 where the overall performance is
> significantly affected by the denormal implementation, and would be
> very interested to see counter examples. Again, I am thinking about
> the entire overall performance of a complete application (nothing
> else is relevant).
... and thus, should be judged like any proposed extension to GCC: are the
benefits of the feature outweighed by the constant implementation
complexity and maintenance the feature requires?
If the gains and costs of denormals are negligible, leave them out.
They're a headache you don't want to design for.