This is the mail archive of the gcc-bugs@gcc.gnu.org mailing list for the GCC project.


optimization/10482: Optimized and debug binaries of the same application give different results.


>Number:         10482
>Category:       optimization
>Synopsis:       Optimized and debug binaries of the same application give different results.
>Confidential:   no
>Severity:       serious
>Priority:       medium
>Responsible:    unassigned
>State:          open
>Class:          sw-bug
>Submitter-Id:   net
>Arrival-Date:   Thu Apr 24 16:06:01 UTC 2003
>Closed-Date:
>Last-Modified:
>Originator:     Alberto dot Ribon at cern dot ch
>Release:        gcc 3.2.2 and earlier versions (2.95.2)
>Organization:
>Environment:
Linux Red Hat 7.3
>Description:
After building the same C++ application twice,
once with the debug option (-g)
and once with optimization (the -O option),
and running both under the same conditions, the numerical
results differ, although they should be exactly the same.
We verified, and are confident, that there are no cases of
uninitialized variables or numerical instabilities (NaN,
division by zero, etc.).
We also verified that the same exercise on a Sun Solaris
system with the Forte CC 5.4 compiler (and earlier versions)
produces identical output in the two cases.

We would like to know whether this is a known behavior of the
gcc compiler, and what could cause it among the optimizations
performed at the default level -O.
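
A plausible cause on IA-32 (our assumption; not verified against
this application) is the x87 FPU's 80-bit extended precision:
with -O, intermediate values may be kept in registers at extended
precision, while without optimization each statement's result is
spilled to a 64-bit memory slot and rounded. A minimal sketch of
the effect (hypothetical example; names and values are
illustrative only):

    #include <cstdio>

    int main() {
        // volatile keeps the compiler from folding the constants away
        volatile double x = 1.0e16;
        volatile double y = 2.9;

        double t = x + y; // at -O0, t is typically stored to a 64-bit slot;
                          // at -O, it may stay in an 80-bit x87 register
        double z = t - x; // z then comes out as 2 (double rounding)
                          // or as roughly 2.9 (extended precision)

        std::printf("z = %.17g\n", z);
        return 0;
    }

Compiling this once with -g and once with -O on an IA-32 machine
may already show different output. The documented -ffloat-store
option, which forces floating-point values to be written to
memory, is the usual workaround for this class of discrepancy,
at some cost in speed.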
>How-To-Repeat:
The problem appears in a rather complicated simulation 
application for which we don't have an easy test case to
provide.
>Fix:

>Release-Note:
>Audit-Trail:
>Unformatted:

