What is the contribution of lambda calculus to the field of theory of computation?


Turing argued that Mathematics can be reduced to a combination of reading/writing symbols, chosen from a finite set, and switching between a finite number of mental 'states'. He reified this in his Turing Machines, where symbols are recorded in cells on a tape and an automaton keeps track of the state.

However, Turing's machines are not a constructive proof of this reduction. He argued that any 'effective procedure' can be implemented by some Turing Machine, and showed that a Universal Turing Machine can implement all of those other machines, but he didn't actually give a set of symbols, states and update rules which implement Mathematics in the way that he argued. In other words, he did not propose a 'standard Turing Machine', with a standard set of symbols which we could use to write down our Mathematics.

Lambda Calculus, on the other hand, is precisely that. Church was specifically trying to unify the notations used to write down our Mathematics. Once it was shown that LC and TMs are equivalent, we could use LC as our 'standard Turing Machine' and everyone would be able to read our programs (well, in theory ;) ).

Now, we could ask why treat LC as a primitive, rather than as a TM dialect? The answer is that LC's semantics are denotational: LC terms have 'intrinsic' meaning. There are Church numerals, there are functions for addition, multiplication, recursion, etc. This makes LC very well aligned with how (formal) Mathematics is practiced, which is why many (functional) algorithms are still presented directly in LC.
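To make this concrete, here is a sketch of Church numerals and their arithmetic, written as Python lambdas (the helper names `zero`, `succ`, `add`, `mul` and `to_int` are illustrative, not standard): a numeral n is just "the function that applies f n times", so addition and multiplication fall out of function composition alone, with no machine or tape in sight.

```python
# Church numerals: the numeral n is the function applying f n times to x.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Addition applies f m times after applying it n times (m + n total);
# multiplication applies the "apply f n times" operation m times (m * n).
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul = lambda m: lambda n: lambda f: m(n(f))

# Decode a Church numeral into a Python int by applying (+1) to 0.
to_int = lambda n: n(lambda k: k + 1)(0)

one = succ(zero)
two = succ(one)
three = add(one)(two)

print(to_int(three))            # 3
print(to_int(mul(two)(three)))  # 6
```

Note that each term carries its meaning with it: `add` *is* addition, wherever it appears, which is exactly the 'intrinsic' meaning described above.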

On the other hand, the semantics of TM programs are operational: the meaning is defined as the behaviour of the machine. In this sense, we can't cut out some section of tape and say "this is addition", because it is context-dependent. The machine's behaviour, when it hits that section of tape, depends on the machine's state, the lengths/offsets/etc. of the arguments, how much tape will be used for the result, whether any previous operation has corrupted that section of tape, etc. This is a horrendous way of working ("Nobody wants to program a Turing Machine"), which is why so many (imperative) algorithms are presented as pseudocode.
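The contrast can be seen in a toy simulation. Below is a minimal Turing-machine sketch (the rule table is a hypothetical example, not from the source) that adds two unary numbers, e.g. "111+11" becomes "11111", by joining the blocks and erasing one mark. The point is that the tape section holding the arguments means "addition" only relative to this particular state set and rule table; the same symbols under different rules would mean something else entirely.

```python
# Minimal Turing-machine simulator. Rules map (state, symbol) to
# (written symbol, head move, next state); '_' is the blank symbol.
def run(tape, rules, state="find_plus", pos=0, blank="_"):
    tape = list(tape)
    while state != "halt":
        sym = tape[pos] if 0 <= pos < len(tape) else blank
        new_sym, move, state = rules[(state, sym)]
        if 0 <= pos < len(tape):
            tape[pos] = new_sym
        pos += 1 if move == "R" else -1
    return "".join(tape).strip(blank)

# Hypothetical rule table for unary addition: replace '+' with '1',
# then erase the last '1' to correct the count.
rules = {
    ("find_plus", "1"): ("1", "R", "find_plus"),
    ("find_plus", "+"): ("1", "R", "find_end"),   # join the two blocks
    ("find_end",  "1"): ("1", "R", "find_end"),
    ("find_end",  "_"): ("_", "L", "erase_one"),  # passed the end; back up
    ("erase_one", "1"): ("_", "L", "halt"),       # remove the extra mark
}

print(run("111+11", rules))  # 11111
```

Nothing in the tape contents alone says "addition"; the meaning lives in the rule table and the machine's state, which is the operational, context-dependent character described above.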