Casting in C
I am following a web lecture on "Programming Paradigms" from Stanford.
For the following code, the lecturer said:
"A double of 3.14 (or pi) is declared, then cast as a character ch. &d is a pointer to the first byte address of d; (char*) says 'Oh, it's not a double, it's a character'. The final * dereferences that value, which means it copies the value to ch, not the address."
char ch=*(char*) &d;
The output of the program is a downward triangle, not 3.14.
Can anybody kindly tell me why?
What is the "ASCII" code of a downward triangle? Assuming that you are running this under Windows from the command line (in a Command Prompt console window, or even under MS-DOS), my IBM extended character set table (code page 437) says that the code of the "down triangle" character is 0x1F, which is 31 decimal.
My d2hex utility (which uses casting to display the hex bytes of a double value) tells me that the bit pattern of 3.14, in hex, is 40 09 1E B8 51 EB 85 1F.
I am still assuming that you are running Windows on an Intel machine. Intel machines are little-endian, meaning that they store the least-significant byte first, so the bytes of that value of 3.14 would be laid out in memory in this order: 1F 85 EB 51 B8 1E 09 40.
You are printing out the first byte at that memory location, cast as a char. The value there is 0x1F, which is the code for the "down triangle" character in the IBM character set.
July 10th, 2013, 05:08 PM
Thanks a lot. It's clear now.