Computable number


A computable number is a real number whose digit sequence can be produced by some algorithm. The algorithm takes a natural number n as input and produces the n-th digit in the decimal expansion of the real number as output.
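To illustrate the definition, here is a minimal Python sketch (the function name `sqrt2_digit` is ours) of such an algorithm for the square root of 2, an algebraic and hence computable number. It uses exact integer arithmetic, so every digit it returns is correct:

```python
from math import isqrt

def sqrt2_digit(n: int) -> int:
    """Return the n-th decimal digit of sqrt(2); n = 0 gives the units digit."""
    # isqrt(2 * 10**(2n)) equals floor(sqrt(2) * 10**n), i.e. the first
    # n decimal places of sqrt(2) written as one integer; the last digit
    # of that integer is the digit we want.
    return isqrt(2 * 10 ** (2 * n)) % 10
```

Calling the function for n = 0, 1, 2, ... yields the digits 1, 4, 1, 4, 2, 1, 3, 5, 6, ... of sqrt(2) = 1.41421356...; any computable number admits such a digit-producing procedure.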

The computable numbers form a field, and arguably this field contains all the numbers we ever need in practice. It contains all algebraic numbers as well as all known transcendental mathematical constants. There are, however, many real numbers which are not computable: the set of all computable numbers is countable (because the set of algorithms is countable, every algorithm being a finite string over a finite alphabet), while the set of real numbers is not (see Cantor's diagonal argument).

Every computable number is definable, but not vice versa: Chaitin's constant, for example, is definable but not computable.

Computable numbers were introduced by Alan Turing in 1936.


  • Alan Turing, "On computable numbers, with an application to the Entscheidungsproblem", Proceedings of the London Mathematical Society, Series 2, 42 (1936), pp. 230–265.