Chaitin's constant


Chaitin's constant Ω, also called the halting probability, refers to a construction by Gregory Chaitin. For a given model of computation or programming language, Ω is the probability that a randomly produced string will represent a program that, when run, eventually halts.

That this number can be defined at all is significant, because the question of whether an individual program halts is not decidable by any general algorithm (see halting problem). The number Ω can be defined, but it cannot be computed: we do not know its value for any programming language, nor can we ever determine it.

It is important to realize that Chaitin's constant is not a constant in the usual sense: it is not a fixed, canonically defined number such as π or e, since its definition depends on the arbitrary choice of computation model and program encoding. It would more properly be referred to as "Chaitin's construction".

To define Ω formally, we first need to fix a model of computation, for instance Turing machines or LISP or Pascal programs. We then need to specify an unambiguous encoding of programs (or machines) as bit strings. This encoding must have the property that if w encodes a syntactically correct program, then no proper prefix of w encodes a syntactically correct program. This can always be achieved by using a special end symbol. We only consider programs that don't require any input.
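As an illustration of the prefix-free requirement, here is a small check in Python (the code words are hypothetical, chosen only for the example) that no encoding in a set is a proper prefix of another:

```python
def is_prefix_free(codes):
    """Return True if no code word is a proper prefix of another.

    After lexicographic sorting, any code word that is a prefix of
    another is immediately followed by such an extension, so it
    suffices to compare adjacent pairs.
    """
    codes = sorted(codes)
    return all(not codes[i + 1].startswith(codes[i])
               for i in range(len(codes) - 1))

# Hypothetical encodings: the first set is prefix-free, the second is
# not, because "0" is a proper prefix of "01".
print(is_prefix_free(["00", "01", "10", "110", "111"]))  # True
print(is_prefix_free(["0", "01", "11"]))                 # False
```

An encoding that terminates every program with a reserved end symbol passes this check automatically, since the end symbol cannot occur inside a code word.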

The definition of Ω is then

  Ω  =      ∑      2^(-|p|)
         p a program
         that halts

This is an infinite sum with one summand for every syntactically correct program that halts; |p| denotes the length, in bits, of p's encoding. The requirement above that the encoding be prefix-free ensures, by the Kraft inequality, that this sum converges to a real number between 0 and 1.
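To make the sum concrete, consider a made-up toy language with only three syntactically correct programs, whose halting behavior we simply stipulate (for a real language the behavior would be undecidable in general, and the sum infinite). A sketch in Python:

```python
# Hypothetical prefix-free encodings mapped to stipulated halting
# behavior; this is an invented toy example, not a real language.
halts = {"1": True, "00": True, "01": False}

# Sum 2^(-|p|) over the programs that halt: 2^-1 + 2^-2.
omega = sum(2.0 ** -len(p) for p, does_halt in halts.items() if does_halt)
print(omega)  # 0.75
```

For a real programming language one could at best approximate Ω from below, by enumerating programs and running them for longer and longer time bounds, adding 2^(-|p|) whenever a program is observed to halt.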

It can then be shown that Ω represents the probability that a randomly produced bit string will encode a halting program. This means that if you start flipping coins, always recording a head as a one and a tail as a zero, the probability is Ω that you will eventually reach the encoding of a syntactically correct halting program.
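The coin-flipping reading can be simulated in the same toy spirit. The sketch below assumes a hypothetical complete prefix-free code (one where every infinite bit sequence eventually reaches a code word), so each run of coin flips hits exactly one program:

```python
import random

# Hypothetical complete prefix-free code with stipulated halting
# behavior; here Ω = 2^-2 + 2^-2 = 0.5 by the definition above.
halting = {"00": True, "01": False, "10": True, "11": False}

def flip_until_program(rng, max_len=64):
    """Flip coins (head = 1, tail = 0) until the bits read so far
    encode a program, then report whether that program halts."""
    bits = ""
    while len(bits) < max_len:
        bits += rng.choice("01")
        if bits in halting:
            return halting[bits]
    return False  # unreachable here: every code word has length 2

rng = random.Random(0)
trials = 100_000
estimate = sum(flip_until_program(rng) for _ in range(trials)) / trials
print(estimate)  # close to 0.5
```

The empirical fraction of halting runs approaches Ω as the number of trials grows, which is exactly the probabilistic content of the paragraph above.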

One can prove that there is no algorithm that produces the digits of Ω: Ω is definable but not computable. Furthermore, if you fix, in addition to the computation model and encoding mentioned above, a specific consistent axiomatic system for the natural numbers, say the Peano axioms, then there exists a constant N such that no digit of Ω after the N-th can be proven to be zero or one within that system. (The constant N depends heavily on the encoding choices and does not reflect the complexity of the axiomatic system in any way.)

This is an incompleteness result akin to Gödel's incompleteness theorem and Chaitin's own result mentioned under algorithmic information theory.