This is the mail archive of the gcc-help@gcc.gnu.org mailing list for the GCC project.



Re: Possible performance issue with gfortran? denormalized numbers


On 02/01/2016 12:55 PM, Jose Miguel Reynolds Barredo wrote:

Hi everyone,

I was developing a tridiagonal block solver and found a performance
issue that intrigues me: operations on numbers in the denormalized
range are around ten times slower than on regular (normalized) numbers
(https://en.wikipedia.org/wiki/Denormal_number). As an example, I
built a very simple code:
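
The program itself is not reproduced in this excerpt of the archive; a minimal
sketch of this kind of timing test (an illustration only, not the poster's
actual code) might look like:

  program denorm_bench
    implicit none
    integer, parameter :: n = 100000000
    real(8) :: x, t0, t1
    integer :: i

    ! Pass 1: dependent multiplications that stay in the normal range.
    x = tiny(1.0d0) * 100.0d0
    call cpu_time(t0)
    do i = 1, n
       x = x * 1.000000001d0
    end do
    call cpu_time(t1)
    print *, 'normal:   ', t1 - t0, ' s   x =', x

    ! Pass 2: exactly the same work, but starting below tiny(1.0d0),
    ! so every operand and result is denormal (subnormal).
    x = tiny(1.0d0) / 100.0d0
    call cpu_time(t0)
    do i = 1, n
       x = x * 1.000000001d0
    end do
    call cpu_time(t1)
    print *, 'denormal: ', t1 - t0, ' s   x =', x
  end program denorm_bench

One caveat when reproducing this: options such as -ffast-math typically enable
flush-to-zero on x86, which replaces the denormals with zeros and hides the
slowdown.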

The reason operations with denormal numbers are ten times slower has, unfortunately (as otherwise we could do something about it), nothing to do with the compiler or the run-time library of *any* language.

Denormal number operations are handled by the operating system, because it is too costly to allocate silicon to handle them on the CPU. So when the CPU detects a denormal number, it traps. This trap is caught by the OS, which dispatches the computation to a routine written for that purpose. Both the trap and the software handling of the operation on a denormal are costly, as you observed.
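
For reference, a denormal here means a nonzero value smaller in magnitude than
tiny() for its kind. A quick way to see the threshold in Fortran (a minimal
illustration, not from the original thread):

  program denorm_threshold
    implicit none
    real(8) :: x
    ! tiny() is the smallest *normalized* value of the kind;
    ! any nonzero value below it is denormal (subnormal).
    x = tiny(1.0d0) / 100.0d0
    print *, 'tiny(1.0d0) =', tiny(1.0d0)   ! about 2.2250738585072014E-308
    print *, 'x           =', x             ! a denormal, about 2.2E-310
    print *, 'denormal?   ', x /= 0.0d0 .and. abs(x) < tiny(1.0d0)
  end program denorm_threshold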

There is nothing the compiler writers (*any* compiler writers, not just GCC's) can do about this.

Kind regards,

--
Toon Moene - e-mail: toon@moene.org - phone: +31 346 214290
Saturnushof 14, 3738 XG  Maartensdijk, The Netherlands
At home: http://moene.org/~toon/; weather: http://moene.org/~hirlam/
Progress of GNU Fortran: http://gcc.gnu.org/wiki/GFortran#news

