In computability theory, the Church–Turing thesis (also known as the computability thesis,^{[1]} the Turing–Church thesis,^{[2]} the Church–Turing conjecture, Church's thesis, Church's conjecture, and Turing's thesis) is a hypothesis about the nature of computable functions. It states that a function on the natural numbers is computable by a human being following an algorithm, ignoring resource limitations, if and only if it is computable by a Turing machine. The thesis is named after the American mathematician Alonzo Church and the British mathematician Alan Turing. Before the precise definition of computable function, mathematicians often used the informal term effectively calculable to describe functions that are computable by paper-and-pencil methods. In the 1930s, several independent attempts were made to formalize the notion of computability: the λ-definable functions of Church and Kleene, the general recursive functions of Herbrand, Gödel, and Kleene, and the functions computable by Turing's machines.
Church^{[3]} and Turing^{[4]} proved that these three formally defined classes of computable functions coincide: a function is λ-computable if and only if it is Turing computable, if and only if it is general recursive. This has led mathematicians and computer scientists to believe that the concept of computability is accurately characterized by these three equivalent processes. Other formal attempts to characterize computability have subsequently strengthened this belief (see below).
On the other hand, the Church–Turing thesis states that the above three formally defined classes of computable functions coincide with the informal notion of an effectively calculable function. Since, as an informal notion, the concept of effective calculability does not have a formal definition, the thesis, although it has near-universal acceptance, cannot be formally proven.
Since its inception, variations on the original thesis have arisen, including statements about what can physically be realized by a computer in our universe (physical Church–Turing thesis) and what can be efficiently computed (Church–Turing thesis (complexity theory)). These variations are not due to Church or Turing, but arise from later work in complexity theory and digital physics. The thesis also has implications for the philosophy of mind (see below).
J.B. Rosser (1939) addresses the notion of "effective computability" as follows: "Clearly the existence of CC and RC (Church's and Rosser's proofs) presupposes a precise definition of 'effective'. 'Effective method' is here used in the rather special sense of a method each step of which is precisely predetermined and which is certain to produce the answer in a finite number of steps".^{[5]} Thus the adverb-adjective "effective" is used in a sense of "1a: producing a decided, decisive, or desired effect", and "capable of producing a result".^{[6]}^{[7]}
In the following, the words "effectively calculable" will mean "produced by any intuitively 'effective' means whatsoever" and "effectively computable" will mean "produced by a Turing machine or equivalent mechanical device". Turing's "definitions" given in a footnote in his 1939 Ph.D. thesis Systems of Logic Based on Ordinals, supervised by Church, are virtually the same:
^{+} We shall use the expression "computable function" to mean a function calculable by a machine, and let "effectively calculable" refer to the intuitive idea without particular identification with any one of these definitions.^{[8]}
The thesis can be stated as: Every effectively calculable function is a computable function.^{[9]}
Turing stated it this way:
It was stated... that "a function is effectively calculable if its values can be found by some purely mechanical process." We may take this literally, understanding that by a purely mechanical process one which could be carried out by a machine. The development... leads to... an identification of computability^{+} with effective calculability. [+ is the footnote quoted above.]^{[8]}
One of the important problems for logicians in the 1930s was David Hilbert's Entscheidungsproblem, which asked whether there was a mechanical procedure for separating mathematical truths from mathematical falsehoods. This quest required that the notion of "algorithm" or "effective calculability" be pinned down, at least well enough for the quest to begin.^{[10]} But from the very outset Alonzo Church's attempts began with a debate that continues to this day.^{[11]} Was the notion of "effective calculability" to be (i) an "axiom or axioms" in an axiomatic system, (ii) merely a definition that "identified" two or more propositions, (iii) an empirical hypothesis to be verified by observation of natural events, or (iv) just a proposal for the sake of argument (i.e. a "thesis")?
In the course of studying the problem, Church and his student Stephen Kleene introduced the notion of λ-definable functions, and they were able to prove that several large classes of functions frequently encountered in number theory were λ-definable.^{[12]} The debate began when Church proposed to Gödel that one should define the "effectively computable" functions as the λ-definable functions. Gödel, however, was not convinced and called the proposal "thoroughly unsatisfactory".^{[13]} Rather, in correspondence with Church (ca. 1934–35), Gödel proposed axiomatizing the notion of "effective calculability"; indeed, in a 1935 letter to Kleene, Church reported that:
But Gödel offered no further guidance. Eventually he would suggest his recursion, modified by Herbrand's suggestion, which Gödel had detailed in his 1934 lectures at Princeton, NJ (Kleene and Rosser transcribed the notes). But he did not think that the two ideas could be satisfactorily identified "except heuristically".^{[15]}
Next, it was necessary to identify and prove the equivalence of two notions of effective calculability. Equipped with the λ-calculus and "general" recursion, Stephen Kleene, with the help of Church and J. Barkley Rosser, produced proofs (1933, 1935) to show that the two calculi are equivalent. Church subsequently modified his methods to include use of Herbrand–Gödel recursion and then proved (1936) that the Entscheidungsproblem is unsolvable: there is no generalized algorithm that can determine whether a well-formed formula has a "normal form".^{[16]}
Many years later in a letter to Davis (ca. 1965), Gödel said that "he was, at the time of these [1934] lectures, not at all convinced that his concept of recursion comprised all possible recursions".^{[17]} By 1963–64 Gödel would disavow Herbrand–Gödel recursion and the λ-calculus in favor of the Turing machine as the definition of "algorithm" or "mechanical procedure" or "formal system".^{[18]}
A hypothesis leading to a natural law?: In late 1936 Alan Turing's paper (also proving that the Entscheidungsproblem is unsolvable) was delivered orally, but had not yet appeared in print.^{[19]} On the other hand, Emil Post's 1936 paper had appeared and was certified independent of Turing's work.^{[20]} Post strongly disagreed with Church's "identification" of effective computability with the λ-calculus and recursion, stating:
Rather, he regarded the notion of "effective calculability" as merely a "working hypothesis" that might lead by inductive reasoning to a "natural law" rather than by "a definition or an axiom".^{[22]} This idea was "sharply" criticized by Church.^{[23]}
Thus Post in his 1936 paper was also discounting Kurt Gödel's suggestion to Church in 1934–35 that the thesis might be expressed as an axiom or set of axioms.^{[14]}
Turing adds another definition, Rosser equates all three: Within just a short time, Turing's 1936–37 paper "On Computable Numbers, with an Application to the Entscheidungsproblem"^{[19]} appeared. In it he stated another notion of "effective computability" with the introduction of his a-machines (now known as the Turing machine abstract computational model). And in a proof-sketch added as an "Appendix" to his 1936–37 paper, Turing showed that the classes of functions defined by λ-calculus and Turing machines coincided.^{[24]} Church was quick to recognise how compelling Turing's analysis was. In his review of Turing's paper he made clear that Turing's notion made "the identification with effectiveness in the ordinary (not explicitly defined) sense evident immediately".^{[25]}
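As a concrete illustration of the a-machine idea, the following minimal sketch (our own modern notation in Python, not Turing's 1936 formalism) represents a machine as a finite table mapping (state, read symbol) to (write symbol, head move, next state), driving a read/write head over an unbounded tape. The example machine computes the successor function on a unary-encoded number.

```python
# A minimal Turing machine simulator (illustrative sketch, not Turing's
# original notation). The tape is stored sparsely as a dict from position
# to symbol; unwritten cells read as the blank symbol.

def run_turing_machine(table, tape_input, state="start", blank="_", max_steps=10_000):
    tape = dict(enumerate(tape_input))   # position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            return "".join(tape.get(i, blank) for i in range(min(tape), max(tape) + 1))
        write, move, state = table[(state, tape.get(head, blank))]
        tape[head] = write
        head += 1 if move == "R" else -1
    raise RuntimeError("step limit reached; the machine may not halt")

# Successor in unary: move right over the 1s, write one more 1, then halt.
successor = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}
print(run_turing_machine(successor, "111"))   # prints '1111', i.e. 3 + 1
```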
In a few years (1939) Turing would propose, like Church and Kleene before him, that his formal definition of mechanical computing agent was the correct one.^{[26]} Thus, by 1939, both Church (1934) and Turing (1939) had individually proposed that their "formal systems" should be definitions of "effective calculability";^{[27]} neither framed their statements as theses.
Rosser (1939) formally identified the three notions-as-definitions:
Kleene proposes Church's Thesis: This left the overt expression of a "thesis" to Kleene. In his 1943 paper Recursive Predicates and Quantifiers Kleene proposed his "THESIS I":
Kleene goes on to note that:
Kleene's Church–Turing Thesis: A few years later (1952) Kleene, who had switched from presenting his work in the mathematical terminology of the lambda calculus of his Ph.D. advisor Alonzo Church to the theory of the general recursive functions of his other teacher Kurt Gödel, would overtly name the Church–Turing thesis in his correction of Turing's paper "The Word Problem in Semi-Groups with Cancellation",^{[31]} defend it, and express the two "theses" and then "identify" them (show their equivalence) by use of his Theorem XXX:
Kleene himself never stated that Turing had made a mistake in his paper, which remains important in its own right for helping to establish the unsolvability of problems in group-theoretic computations. Corrections to Turing's paper were made later by Boone, who originally pointed out that "points in the proof require clarification, which can be given",^{[34]} and by Turing's only Ph.D. student, Robin Gandy. That Turing appreciated Kleene's discretion, mentioning the mistake not in the body of his textbook, where Kleene presents his work on Turing machines, but only in the appendix, can be surmised from Turing's last publication, "Solvable and Unsolvable Problems", which ends not with a bibliography but with the words,
An attempt to understand the notion of "effective computability" better led Robin Gandy (Turing's student and friend) in 1980 to analyze machine computation (as opposed to human computation acted out by a Turing machine). Gandy's curiosity about, and analysis of, "cellular automata", "Conway's game of life", "parallelism" and "crystalline automata" led him to propose four "principles (or constraints) ... which it is argued, any machine must satisfy."^{[35]} His most important fourth, "the principle of causality", is based on the "finite velocity of propagation of effects and signals; contemporary physics rejects the possibility of instantaneous action at a distance."^{[36]} From these principles and some additional constraints, namely (1a) a lower bound on the linear dimensions of any of the parts, (1b) an upper bound on the speed of propagation (the velocity of light), (2) discrete progress of the machine, and (3) deterministic behavior, he produces a theorem that "What can be calculated by a device satisfying principles I–IV is computable."^{[37]}
In the late 1990s Wilfried Sieg analyzed Turing's and Gandy's notions of "effective calculability" with the intent of "sharpening the informal notion, formulating its general features axiomatically, and investigating the axiomatic framework".^{[38]} In his 1997 and 2002 work Sieg presents a series of constraints on the behavior of a computor, "a human computing agent who proceeds mechanically". These constraints reduce to:
The matter remains in active discussion within the academic community.^{[40]}^{[41]}
The thesis can be viewed as nothing but an ordinary mathematical definition. Comments by Gödel on the subject suggest this view, e.g. "...the correct definition of mechanical computability was established beyond any doubt by Turing."^{[42]} The case for viewing the thesis as nothing more than a definition is made explicitly by Robert I. Soare,^{[43]} where it is also argued that Turing's definition of computability is no less likely to be correct than the epsilon-delta definition of a continuous function.
Other formalisms (besides recursion, the λ-calculus, and the Turing machine) have been proposed for describing effective calculability/computability. Stephen Kleene (1952) adds to the list the functions "reckonable in the system S_{1}" of Kurt Gödel 1936, and Emil Post's (1943, 1946) "canonical [also called normal] systems".^{[44]} In the 1950s Hao Wang and Martin Davis greatly simplified the one-tape Turing-machine model (see Post–Turing machine). Marvin Minsky expanded the model to two or more tapes and greatly simplified the tapes into "up-down counters", which Melzak and Lambek further evolved into what is now known as the counter machine model. In the late 1960s and early 1970s researchers expanded the counter machine model into the register machine, a close cousin to the modern notion of the computer. Other models include combinatory logic and Markov algorithms. Gurevich adds the pointer machine model of Kolmogorov and Uspensky (1953, 1958): "... they just wanted to ... convince themselves that there is no way to extend the notion of computable function."^{[45]}
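To give a sense of how spare some of these equivalent models are, here is a minimal sketch of a counter machine (a toy program and notation of our own, not Minsky's, Melzak's, or Lambek's). Its only operations are incrementing a counter and a combined jump-if-zero-else-decrement, yet with enough counters the model is Turing complete.

```python
# A minimal counter-machine simulator (illustrative sketch). Instructions:
#   ("inc", r, next)                    increment counter r, jump to next
#   ("jzdec", r, if_zero, if_nonzero)   if counter r is 0 jump to if_zero,
#                                       else decrement r and jump to if_nonzero
# The machine halts when the program counter leaves the program.

def run_counter_machine(program, counters):
    pc = 0
    while 0 <= pc < len(program):
        instr = program[pc]
        if instr[0] == "inc":
            _, reg, pc = instr
            counters[reg] += 1
        else:
            _, reg, if_zero, if_nonzero = instr
            if counters[reg] == 0:
                pc = if_zero
            else:
                counters[reg] -= 1
                pc = if_nonzero
    return counters

# Addition: repeatedly move one unit from counter 1 into counter 0.
add = [("jzdec", 1, 2, 1),    # 0: if c1 == 0 jump to 2 (halt), else c1 -= 1
       ("inc", 0, 0)]         # 1: c0 += 1, loop back to instruction 0
print(run_counter_machine(add, [3, 4]))   # prints [7, 0]
```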
All these contributions involve proofs that the models are computationally equivalent to the Turing machine; such models are said to be Turing complete. Because all these different attempts at formalizing the concept of "effective calculability/computability" have yielded equivalent results, it is now generally assumed that the ChurchTuring thesis is correct. In fact, Gödel (1936) proposed something stronger than this; he observed that there was something "absolute" about the concept of "reckonable in S_{1}":
Proofs in computability theory often invoke the Church–Turing thesis in an informal way to establish the computability of functions while avoiding the (often very long) details which would be involved in a rigorous, formal proof.^{[47]} To establish that a function is computable by Turing machine, it is usually considered sufficient to give an informal English description of how the function can be effectively computed, and then conclude "by the Church–Turing thesis" that the function is Turing computable (equivalently, partial recursive).
Dirk van Dalen gives the following example for the sake of illustrating this informal use of the Church–Turing thesis:^{[48]}
In order to make the above example completely rigorous, one would have to carefully construct a Turing machine, or λ-function, or carefully invoke recursion axioms, or at best, cleverly invoke various theorems of computability theory. But because the computability theorist believes that Turing computability correctly captures what can be computed effectively, and because an effective procedure is spelled out in English for deciding the set B, the computability theorist accepts this as proof that the set is indeed recursive.
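Van Dalen's example itself is not reproduced above, so here is a stand-in of our own in the same spirit: take B to be, purely for illustration, the set of prime numbers. The English procedure "given n, trial-divide it by every d with 2 ≤ d ≤ √n" is plainly effective, so by the Church–Turing thesis B is recursive; the sketch below merely transcribes that procedure.

```python
# Illustrative only: this B (the primes) is our own choice, not van Dalen's
# actual set. The code transcribes the informal trial-division procedure.

def in_B(n: int) -> bool:
    """Decide membership of n in B by trial division."""
    if n < 2:
        return False
    return all(n % d != 0 for d in range(2, int(n ** 0.5) + 1))

print([n for n in range(20) if in_B(n)])   # [2, 3, 5, 7, 11, 13, 17, 19]
```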
The success of the Church–Turing thesis prompted variations of the thesis to be proposed. For example, the Physical Church–Turing thesis (PCTT) states:
All physically computable functions are Turing-computable.^{[49]}^{:101}
The Church–Turing thesis says nothing about the efficiency with which one model of computation can simulate another. It has been proved for instance that a (multi-tape) universal Turing machine only suffers a logarithmic slowdown factor in simulating any Turing machine.^{[50]} A variation of the Church–Turing thesis addresses whether an arbitrary but "reasonable" model of computation can be efficiently simulated. This is called the Feasibility Thesis,^{[51]} also known as the (Classical) Complexity-Theoretic Church–Turing Thesis (SCTT) or the Extended Church–Turing Thesis, which is not due to Church or Turing, but rather was realized gradually in the development of complexity theory. It states:^{[52]}
A probabilistic Turing machine can efficiently simulate any realistic model of computation.
The word 'efficiently' here means up to polynomial-time reductions. This thesis was originally called the Computational Complexity-Theoretic Church–Turing Thesis by Ethan Bernstein and Umesh Vazirani (1997). The Complexity-Theoretic Church–Turing Thesis, then, posits that all 'reasonable' models of computation yield the same class of problems that can be computed in polynomial time. Assuming the conjecture that probabilistic polynomial time (BPP) equals deterministic polynomial time (P), the word 'probabilistic' is optional in the Complexity-Theoretic Church–Turing Thesis. A similar thesis, called the Invariance Thesis, was introduced by Cees F. Slot and Peter van Emde Boas. It states: "Reasonable" machines can simulate each other within a polynomially bounded overhead in time and a constant-factor overhead in space.^{[53]} The thesis originally appeared in a paper at STOC'84, which was the first paper to show that polynomial-time overhead and constant-space overhead could be simultaneously achieved for a simulation of a Random Access Machine on a Turing machine.^{[54]}
If BQP is shown to be a strict superset of BPP, it would invalidate the Complexity-Theoretic Church–Turing Thesis. In other words, there would be efficient quantum algorithms that perform tasks that do not have efficient probabilistic algorithms. This would not however invalidate the original Church–Turing thesis, since a quantum computer can always be simulated by a Turing machine, but it would invalidate the classical Complexity-Theoretic Church–Turing thesis for efficiency reasons. Consequently, the Quantum Complexity-Theoretic Church–Turing thesis states:^{[52]}
A quantum Turing machine can efficiently simulate any realistic model of computation.
Eugene Eberbach and Peter Wegner claim that the Church–Turing thesis is sometimes interpreted too broadly, stating "the broader assertion that algorithms precisely capture what can be computed is invalid".^{[55]}^{[page needed]} They claim that forms of computation not captured by the thesis are relevant today, which they call super-Turing computation.
Philosophers have interpreted the Church–Turing thesis as having implications for the philosophy of mind.^{[56]}^{[57]} B. Jack Copeland states that it is an open empirical question whether there are actual deterministic physical processes that, in the long run, elude simulation by a Turing machine; furthermore, he states that it is an open empirical question whether any such processes are involved in the working of the human brain.^{[58]} There are also some important open questions which cover the relationship between the Church–Turing thesis and physics, and the possibility of hypercomputation. When applied to physics, the thesis has several possible meanings:
There are many other technical possibilities which fall outside or between these three categories, but these serve to illustrate the range of the concept.
One can formally define functions that are not computable. A well-known example of such a function is the Busy Beaver function. This function takes an input n and returns the largest number of symbols that a Turing machine with n states can print before halting, when run with no input. Finding an upper bound on the busy beaver function is equivalent to solving the halting problem, a problem known to be unsolvable by Turing machines. Since the busy beaver function cannot be computed by Turing machines, the Church–Turing thesis states that this function cannot be effectively computed by any method.
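For very small n the value can still be found by brute force, as in the following sketch (our own illustration, using the common 2-state, 2-symbol formulation): enumerate every machine and simulate each one under a step cap, treating any machine that exceeds the cap as non-halting. The uncomputability shows up precisely in that guess; no computable choice of cap can be guaranteed large enough for all n.

```python
from itertools import product

# Brute-force busy beaver search over all 2-state, 2-symbol Turing machines
# started on a blank tape. A transition is (write, move, next_state), with
# next_state -1 meaning halt. Machines still running at the step cap are
# *assumed* non-halting -- the guess that cannot be made reliable in general.

def ones_printed(table, max_steps=1000):
    tape, head, state = {}, 0, 0
    for _ in range(max_steps):
        write, move, state = table[(state, tape.get(head, 0))]
        tape[head] = write
        head += move
        if state == -1:            # halted: count the 1s left on the tape
            return sum(tape.values())
    return None                    # assumed to run forever (unverified!)

n_states = 2
actions = [(w, m, s) for w in (0, 1) for m in (-1, 1)
           for s in list(range(n_states)) + [-1]]
keys = [(s, r) for s in range(n_states) for r in (0, 1)]
scores = (ones_printed(dict(zip(keys, t))) for t in product(actions, repeat=len(keys)))
print(max(s for s in scores if s is not None))   # 4, the known value of BB(2)
```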
Several computational models allow for the computation of (Church–Turing) non-computable functions. These are known as hypercomputers. Mark Burgin argues that super-recursive algorithms such as inductive Turing machines disprove the Church–Turing thesis.^{[61]}^{[page needed]} His argument relies on a definition of algorithm broader than the ordinary one, so that non-computable functions obtained from some inductive Turing machines are called computable. This interpretation of the Church–Turing thesis differs from the interpretation commonly accepted in computability theory, discussed above. The argument that super-recursive algorithms are indeed algorithms in the sense of the Church–Turing thesis has not found broad acceptance within the computability research community.