optimization/10772: Result differences between debug and optimize libs on linux; and with sol/nt in optimize mode

omarbeo@hotmail.com
Tue May 13 20:04:00 GMT 2003


>Number:         10772
>Category:       optimization
>Synopsis:       Result differences between debug and optimize libs on linux; and with sol/nt in optimize mode
>Confidential:   no
>Severity:       serious
>Priority:       medium
>Responsible:    unassigned
>State:          open
>Class:          sw-bug
>Submitter-Id:   net
>Arrival-Date:   Tue May 13 18:26:00 UTC 2003
>Closed-Date:
>Last-Modified:
>Originator:     omarbeo@hotmail.com
>Release:        GCC 3.2
>Organization:
>Environment:

>Description:
Hi,
I'm seeing differences between nt/sol and linux when I run my application; I'm not sure if this is a bug or not.
The interesting thing is that the linux-64 optimize and linux-32 debug results match the nt/sol results.

Here is the code:
double get_p() { ... }  // multiple operations: multiplication, division, ...
int get_g() { ... }
double a;
int res;

a = ((double) get_p() * get_g());
res = (int) a;


I run this test on the nt, unix (sol), and linux platforms.
nt and unix (sol) match, while the linux results don't (off by 1 unit).
When I run this test using the linux debug libs, the results match those of nt/sol!

So I'm not sure why this is happening.

This could be caused by us not rounding the numbers; but then why do the linux results differ based on how we build the libs (optimize vs. debug)?

Thanks,
salim
>How-To-Repeat:
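A minimal sketch of the pattern (the 1.0/3.0 constant and the names here are illustrative, not our real code). My assumption is that on ia32 the x87 computes at 80-bit extended precision, so whether the compiler spills an intermediate to memory, rounding it to a 64-bit double, can flip the truncated integer by 1:

#include <stdio.h>

/* volatile stops constant folding and forces the quotient to be
   rounded to a 64-bit double in memory */
volatile double x;

int main(void)
{
    double a;
    int res;

    x = 1.0 / 3.0;      /* x = nearest double to 1/3 (just below it) */
    a = x * 3.0;        /* exact product is 1 - 2^-54                */
    res = (int) a;      /* the C cast truncates toward zero          */

    /* If a is kept in an x87 register (optimized ia32 build), it
       holds 1 - 2^-54 exactly and res == 0.  If a is spilled to
       memory (debug build), it is rounded to the nearest double,
       which is exactly 1.0, and res == 1. */
    printf("a = %.17g  res = %d\n", a, res);
    return 0;
}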

>Fix:
I'm attaching the assembly code generated on linux in debug and optimize mode.
I found a workaround which makes linux match the two other platforms, as follows:

static bool turn_off = false;
...
a = ((double) get_p() * get_g());
if (turn_off) printf("", a);  /* never taken, but seems to force a out to memory */
res = (int) a;
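
A less intrusive variant of the same workaround (a sketch; the volatile qualifier is my addition, and my assumption is that it forces the compiler to store a to memory, rounding it to a 64-bit double, which is presumably all the printf call achieves):

volatile double a;
int res;

a = ((double) get_p() * get_g());  /* the volatile store rounds a to double */
res = (int) a;

Compiling with -ffloat-store should have a similar effect without source changes, since it tells gcc not to keep floating-point variables in registers.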
>Release-Note:
>Audit-Trail:
>Unformatted:


