Sorry to ask this here, but I've searched on Wikipedia and on the web and couldn't find an explanation. I didn't know where to ask math-related questions, so if this is the wrong place please point me somewhere better.
In the manual it says about var precision that it stores numbers in fixed point with 3 decimal digits, and that the minimum step is 0.001. Isn't this (3) its precision expressed in decimal places?
Yet the table says "~9", and I'm confused.
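To check that I understand fixed point, here is how I picture a var: a 32-bit integer holding the value times 1000, since the step is 0.001. This is only my own sketch in plain C (the type, the macros and the scale factor are my guesses, not the engine's actual internals):

Code:
#include <stdio.h>

/* Guessed model of a fixed-point "var": a 32-bit int holding the
   value times 1000 (step = 0.001). Not the engine's real code.  */
typedef int fixed3;                      /* value * 1000 */

#define FIX(x)   ((fixed3)((x) * 1000.0))
#define UNFIX(f) ((double)(f) / 1000.0)

int main(void)
{
    fixed3 a   = FIX(1.2345);            /* truncated to 1234, i.e. 1.234 */
    fixed3 max = 2147483647;             /* INT_MAX, i.e. 2147483.647     */

    printf("1.2345 is stored as %.3f\n", UNFIX(a));   /* 1.234       */
    printf("largest value: %.3f\n", UNFIX(max));      /* 2147483.647 */
    return 0;
}

If that model is right, the largest value 2147483.647 has 10 significant digits, only 3 of them after the point, so maybe "~9" means the total number of significant decimal digits rather than the number of decimal places?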
And related to this, can somebody explain this quote:
Quote:
a multiplication by 100 has an inaccuracy of 0.001/100 = 0.001%, while the result of a division by 0.01 - mathematically the same - is inaccurate by 0.001/0.01 = 10%!

Which of the three values (at least) has to be of type "var" for the above to be true? I mean, in an assignment like
z = x / y;
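My guess is that at least the divisor y has to be a var: its storage error of up to 0.001 is tiny relative to 100, but huge relative to 0.01, and a division passes the divisor's relative error on to the quotient almost unchanged. Here is a small C test of that guess, reusing the value-times-1000 sketch from above (again my own model, not engine code):

Code:
#include <stdio.h>

/* Same guessed fixed-point model (value * 1000). Only the divisor
   y is a "var" here; x is an ordinary double. Names are mine.    */
typedef int fixed3;

#define FIX(x)   ((fixed3)((x) * 1000.0))
#define UNFIX(f) ((double)(f) / 1000.0)

int main(void)
{
    double x = 5.0;
    fixed3 y = FIX(0.0095);             /* truncated to 9, i.e. 0.009 */

    double exact  = x / 0.0095;         /* about 526.3                */
    double result = x / UNFIX(y);       /* about 555.6, off by ~5.5%  */

    printf("exact: %f, with var divisor: %f\n", exact, result);
    return 0;
}

In the worst case the divisor is off by the full 0.001 step, which for a divisor around 0.01 is exactly the 10% from the quote. Is that reading correct, or do x and z have to be vars too?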

