Formatted Read Accuracy
Wed Oct 9 16:02:00 GMT 2002
Thanks for all the inputs!
I have confirmed with a debugger (and WRITE statements) that "line" indeed
contains 67.9936, and after the read the value is 67.9935989 with G77 and
67.9936 with F77. Another example with G77 -- the input 53.5139 becomes
53.5139008, a larger value! Both compilers are using single precision.
The numbers are read correctly with the Sun F77 compiler on the same source
code. These are only two examples of the hundreds of numbers that are read;
G77 introduces "changes" in the fifth through seventh decimal places on all of
them when the F9.4 specifier is used. It appears that the F77 compiler rounds
off the numbers and places zeroes in any digits beyond those specified by the
format statement. I have tried several formats and the result is the same.
I have resolved several differences between the two compilers using the source
code for a very large simulation, and this is the only remaining difference in
behavior. I don't understand why the numbers should differ with the G77
compiler. I would appreciate any pointers to where in the G77 compiler source
code this occurs, and to which compiler is conforming to the standard.