This is the mail archive of the mailing list for the GCC project.

Index Nav: [Date Index] [Subject Index] [Author Index] [Thread Index]
Message Nav: [Date Prev] [Date Next] [Thread Prev] [Thread Next]
Other format: [Raw text]

Re: atof()

#include <stdlib.h> needs to be added.  The default
return type for an undeclared function is int, which
will cause oddities like that.  Always compile
with -Wall when debugging; it will report such
errors.


Reinaldo Nolasco Sanches wrote:
I have a problem with atof()... and I don't understand why.
When I use atof() to get a value from a string... and store it in my double variable, the value is not the expected one.

I made a little example...

#include <stdio.h>

int main()
{
	double teste;
	char   strteste[10];

	strcpy( strteste, "120.1" );
	teste = atof( strteste );
	printf( "\n\tResult is = %.2f\n\n", teste );
	return 0;
}

and the result is this

    Result is = 1078984704.00

and I don't understand this... I tried this on several PCs... on SuSE, on Slackware and on
RedHat(urgg!), and the error occurred on all of them...

I don't know if I need to set a specific #define... or a specific environment variable...

Please... can anybody help me ???

I compiled with...
   gcc -i teste.c -o teste

Thanks a lot in advance...


