


atof()


I have a problem with atof(), and I don't understand why. When I use atof() to get the value from a string and store it in my double variable, the value is not the same.

I made a little example:

#include <stdio.h>

int main()
{
	double teste;
	char   strteste[10];
	
	strcpy( strteste, "120.1" );
	
	teste = atof( strteste );
	
	printf( "\n\tResult is = %.2f\n\n", teste );
	
	return 0;
}

and the result is this

    Result is = 1078984704.00

I don't understand this. I tried it on several PCs, on SuSE, on Slackware, and on Red Hat (urgh!), and the error occurred on all of them.

I don't know if I need to set a specific #define or a specific environment variable.
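
Or maybe I am missing a header? I only include <stdio.h>, so atof() and strcpy() have no prototypes in scope. This is a variant I was thinking of trying (just a guess on my part that the missing <stdlib.h> prototype is the problem):

#include <stdio.h>
#include <stdlib.h>	/* declares atof() */
#include <string.h>	/* declares strcpy() */

int main(void)
{
	double teste;
	char   strteste[10];

	/* copy the text, then convert it to a double */
	strcpy( strteste, "120.1" );

	teste = atof( strteste );

	/* without a prototype the compiler assumes atof() returns int,
	   which might explain the strange value */
	printf( "\n\tResult is = %.2f\n\n", teste );

	return 0;
}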

Please, can anybody help me?

I compiled with...
   gcc -i teste.c -o teste
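
Or should I be compiling with warnings enabled, something like

   gcc -Wall teste.c -o teste

Maybe that would point out if something is wrong with the call?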


Thanks a lot in advance...




=====
"When you know Slackware, you know Linux... when you know Red Hat, all you know is Red hat"

- Anyone seen smoking will be assumed to be on fire and will be summarily put out.
- Power doesn't corrupt people, people corrupt power.

- r_linux@yahoo.com -- http://slackware.linuxbr.org
- UIN: 42853394 - irc.brasnet.org(#slackware)


