June 8th, 2013, 04:40 AM
Join Date: Jun 2013
Newb question with C programming
So I don't understand why my program doesn't show that
1000000 x 1000000 = 1000000000000
Instead it shows a different result:
signed long long int m, n=1000000;
m = 1000000;
printf("m*n=%d \n", m*n);
The result it shows me isn't 1000000000000, but -727379968
Using "double" instead of "signed long long int" doesn't help either; it gives a slightly different result than the "signed long long int" version, but still not the correct one.
What's the problem here?