Originally Posted By: EpsiloN
And, let me see if I got this right (learned it enough?):
Code:
var v = *((var*)(&((int)1)));  // all the bits are 0 but the last


You are casting a var literal into an int, taking the address of that int, casting that int address to a var address, dereferencing it, and assigning the result to an actual var (into v), right? I think I got something wrong...

Let's go by parts.
The literal is already an integer; I added the cast to avoid confusion. Integer literals are treated as ints, and decimal literals as var. You would need to append '.0' to an integer to force the compiler to treat it as a var.
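For example, a minimal sketch of the literal rules (the exact assignment-conversion behavior is my assumption about how Lite-C handles it):
Code:
int i = 1;     // '1' is an integer literal, so i is an int
var a = 1;     // still an int literal; the value is converted to var on assignment
var b = 1.0;   // the '.0' makes it a decimal literal, treated as var from the start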

I should have written:
Code:
int i = 1;               // a real int variable, so its address can be taken (unlike &((int)1))
var v = *((var*)(&i));   // reinterpret the int's bits as a var, with no conversion
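Note that the pointer cast reinterprets the bit pattern instead of converting the value. A minimal sketch of the difference (assuming var is Lite-C's 22.10 fixed-point type, where a raw bit pattern of 1 reads as 1/1024 rather than 1.0):
Code:
int i = 1;                   // only the lowest bit is set
var converted = i;           // value conversion: converted holds 1.0
var punned = *((var*)(&i));  // bit reinterpretation: punned holds the raw pattern,
                             // which a 22.10 fixed-point var reads as 1/1024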