What is the default numeric precision of Reality?

The default precision of Reality depends on the interface that you are currently using.

ENGLISH – The default precision for both 'A' and 'F' correlatives is 12. This can be overridden by using the format 'A{n}' or 'F{n}', where 'n' is a scaling factor between 1 and 6.

SQL – The precision for this interface is controlled via the individual column definitions within an SQL table.

DataBasic – The default precision is 4. The runtime precision can be modified with the statement 'PRECISION n', where 'n' is a value between 0 and 99. The functions SIN, COS, TAN, PWR and LN all calculate to a fixed precision of 5.
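As a sketch of how this is used in a DataBasic program (the variable names are illustrative, and exact rounding behaviour may vary between Reality releases):

```
* Sketch: overriding the default DataBasic precision of 4.
PRECISION 2
X = 10 / 3
PRINT X    ;* result now held to 2 decimal places

* Note: SIN, COS, TAN, PWR and LN ignore this setting
* and always calculate to a fixed precision of 5.
```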
