$\begingroup$

Has anyone constructed a set of physical laws from which we can logically deduce that any function implementable by some physical device is Turing computable?

EDIT

I believe that if we restrict ourselves to classical mechanics (that is, if we suppose that every device obeys only the laws of classical mechanics), then the Church-Turing thesis can be proved.

$\endgroup$
  • $\begingroup$ On the other hand, you could take the question as "Is there some set of axioms modeling physics from which you can derive a proof of the Church-Turing thesis?" In this case, the answer is "Yes." Both quantum mechanics and quantum field theory appear to be Turing computable. However, with quantum mechanics, you appear to be able to compute certain functions exponentially faster than Turing machines can. $\endgroup$
    – Peter Shor
    Feb 8, 2011 at 22:25
  • $\begingroup$ I was under the impression that there are results of the form "the solution to this explicit PDE at some positive computable time t can be non-computable given computable initial data." Does anyone have a reference to a result of this kind? $\endgroup$ Feb 8, 2011 at 23:13
  • $\begingroup$ I think restricting to classical mechanics makes things more difficult rather than easier, because classical mechanics does not imply that matter is particulate at the smallest scale. So in classical mechanics you might conceive of a device that is infinitely structured at the microscopic level. $\endgroup$
    – Lucas K.
    Feb 8, 2011 at 23:41
  • $\begingroup$ @Qiaochu: Perhaps you are thinking of this: Pour-El, M.B., Richards, I. 1981. 'The Wave Equation with Computable Initial Data such that its Unique Solution is not Computable'. Advances in Mathematics, 39, 215-239. $\endgroup$ Feb 9, 2011 at 0:08
  • $\begingroup$ @Lucas: When you describe a model of computation based on physics, it is important to also specify the model of approximation (or errors) you are using. $\endgroup$
    – Gil Kalai
    Feb 9, 2011 at 6:50

11 Answers

$\begingroup$

Just appeared on the arXiv today: "The physical Church-Turing thesis and the principles of quantum theory," by Pablo Arrighi and Gilles Dowek. http://arxiv.org/abs/1102.1612

Abstract:

Notoriously, quantum computation shatters complexity theory, but is innocuous to computability theory. Yet several works have shown how quantum theory as it stands could breach the physical Church-Turing thesis. We draw a clear line as to when this is the case, in a way that is inspired by Gandy. Gandy formulates postulates about physics, such as homogeneity of space and time, bounded density and velocity of information --- and proves that the physical Church-Turing thesis is a consequence of these postulates. We provide a quantum version of the theorem. Thus this approach exhibits a formal non-trivial interplay between theoretical physics symmetries and computability assumptions.

$\endgroup$
  • $\begingroup$ Cool, they wrote an article just one day after my question! Just kidding =) $\endgroup$
    – Dan
    Feb 9, 2011 at 15:41
  • $\begingroup$ Actually, there is a 1-day delay at the arXiv, so they must have used "relativistic time foreshortening" (to quote Joel)! $\endgroup$ Feb 9, 2011 at 17:00
  • $\begingroup$ Hmmmm ... students should be wary of Arrighi and Dowek's bald assertion (without citation) that "Quantum theory is about tensor products of vector spaces." A text search for "tensor network" on the arxiv server yields 213 preprints, such as Poulin, Qarry, Soma & Verstraete "Quantum simulation of time-dependent Hamiltonians and the convenient illusion of Hilbert space" (arXiv:1102.1360). None of these tensor-join state-spaces are Hilbert sensu stricto; see Ashtekar and Schilling "Geometrical formulation of quantum mechanics" for an overview of post-Hilbert quantum formalisms (gr-qc/9706069). $\endgroup$ Feb 9, 2011 at 20:24
  • $\begingroup$ Of course everyone should regard their text with some criticism. I flagged this answer as accepted because they are trying to do exactly what I asked, though the other answers are also very good. $\endgroup$
    – Dan
    Feb 9, 2011 at 22:27
  • $\begingroup$ Why does quantum computation "shatter complexity theory"? I thought, for example, that it was not known whether quantum computation makes NP-hard problems easier. $\endgroup$
    – Pradipta
    Feb 16, 2011 at 16:28
$\begingroup$

The Church-Turing thesis

The Church-Turing thesis, asserting that "everything computable is computable by a Turing machine" (and its sharper forms regarding efficient computation), can be regarded as a law of physics. However, there are no strong connections between the thesis (or computability in general) and theoretical physics. This is discussed in this related question on the CS site.

When it comes to the original form of the Church-Turing thesis (namely, when efficiency of computation is not an issue), it does not seem to matter whether you allow quantum mechanics or work just within classical mechanics. However, for a model of computation based on physics it is important to specify the available approximations or, in other words, the way errors are modeled. (Failing to do so may even lead to "devices" capable of solving undecidable problems.) There are various proposed derivations of the Church-Turing thesis from physical laws. On the other hand, there are various "hypothetical physical worlds" which are in some tension with the Church-Turing thesis (though whether they contradict it is itself an interesting philosophical question). A paper by Pitowsky, "The Physical Church's Thesis and Physical Computational Complexity", Iyun 39, 81-99 (1990), deals with such hypothetical physical worlds. See also the paper by Itamar Pitowsky and Oron Shagrir, "The Church-Turing Thesis and Hyper Computation", Minds and Machines 13, 87-101 (2003). Oron Shagrir has written several philosophical papers about the Church-Turing thesis; see his webpage. (See also this blog post.) Overall these studies are more of a philosophical nature.

The efficient Church-Turing thesis

Concerning the efficient Church-Turing thesis, namely regarding computations in polynomial time: as Peter Shor mentioned, the rules of quantum physics allow for computationally superior computers compared to classical computation. (This depends on computational-complexity conjectures, e.g., regarding factoring not being in P.) The efficient Church-Turing thesis (first stated, as far as I know, by Wolfram in the 80s) is: "A (probabilistic) Turing machine can efficiently simulate any realistic model of computation." The analogous conjecture for quantum computers is: "A quantum Turing machine can efficiently simulate any realistic model of computation." One aspect of the efficient Church-Turing thesis (again, in both its classical and quantum versions) is that NP-hard problems cannot be computed efficiently by any computational device. This is a physics conjecture of a sort (but it depends, of course, on conjectures from computational complexity and on asymptotic issues). There are some interesting attempts to relate the impossibility of solving NP-complete problems to physics.

One interesting aspect of the efficient Church-Turing thesis (in both its classical and quantum versions) is that it implies that physical models which require computationally infeasible computations are unrealistic. It is an interesting question whether this should be taken into consideration in model building.

Remark: Oron Shagrir and Yuri Gurevich pointed out to me that my formulation of the Church-Turing thesis is a modern version from the last two or three decades, which differs from Church's and Turing's original formulations. See Yuri's answer below.

$\endgroup$
  • $\begingroup$ Which blog post? $\endgroup$
    – bavajee
    Feb 9, 2011 at 0:38
$\begingroup$

There is a body of literature on the topic of supertasks, which are computational tasks involving infinitely many steps. A large part of this work involves a purely mathematical analysis and development of the concept, such as my work on infinite time Turing machines ("Infinite time Turing machines," with Andy Lewis in the Journal of Symbolic Logic, 65(2):567-604, 2000, doi:10.2307/2586556, arXiv version) and other work on higher recursion theory, E-recursion and other infinitary models of computability. All of these computational models exhibit functions that are not computable by Turing machines, but are computable with respect to the infinitary model. (See this MO answer for an entertaining supertask example.)

Another part of the work, however, has considered the topic of your question: the extent to which we might actually hope to carry out such infinitary computations. The idea is that the real universe exhibits relativistic phenomena of which we might take advantage for computational effect, and doing so might take us beyond the Turing barrier. Hogarth and others have described physical models, Malament-Hogarth spacetimes, in which one observer has access to the results of an infinite computation carried out by another in that world. To get the idea, imagine assigning your graduate students, and their graduate students and so on in perpetuity, the task of searching for an arithmetic counterexample, while flying in ever-faster rockets around the Earth, signaling when a counterexample has been found. Because of relativistic time foreshortening, their entire infinite journey will amount to a finite amount of time for you, and so you can get the answer in finite time.

Philip Welch recently wrote an excellent survey and also this article describing some of the physical models in which one can compute non-computable functions by a physical procedure, among several other articles on the topic on his web page.

$\endgroup$
$\begingroup$

This is my first venture to MathOverflow. A colleague brought to my attention that Peter Shor wrote the following at the blog: "The Dershowitz-Gurevich paper says nothing about probabilistic or quantum computation. It does write down a set of axioms about computation, and prove the Church-Turing thesis assuming those axioms. However, we're left with justifying these axioms. Neither probabilistic nor quantum computation is covered by these axioms (they admit this for probabilistic computation, and do not mention quantum computation at all), so it's quite clear to me these axioms are actually false in the real world, even though the Church-Turing thesis is probably true."

Church's thesis has been discussed in the literature for a long time. There is a tradition of distinguishing between two versions of the thesis. The classical version speaks about classical (a.k.a. sequential) algorithms, and the physical version speaks about algorithms in full generality. It is the classical thesis that is addressed in the Dershowitz-Gurevich paper.

"Classical algorithms compute in steps of bounded complexity," said Kolmogorov. Not all classical algorithms are covered by Turing's analysis. Ruler-and-compass algorithms are classical; the Gaussian elimination procedure is classical. Classical algorithms may use randomization.

I axiomatized classical algorithms (article 141 at my website) and proved that every classical algorithm can be simulated, step for step, by a bona fide algorithm. The Dershowitz-Gurevich paper added one additional axiom to tame classical algorithms and to derive the thesis.

The axiomatization of classical algorithm was extended by Andreas Blass and myself to that of synchronous-parallel algorithms [article 157 at my webpage]. Erich Graedel and Antje Nowack showed that this covers (noiseless) quantum algorithms [Springer LNCS 2589, 2003, pp 309-323].

$\endgroup$
  • $\begingroup$ Synchronous-parallel algorithms are surely the easy case? It seems unlikely that it is possible to extend the proof to the far more interesting case of asynchronous algorithms, because asynchronous algorithms are time-dependent. $\endgroup$
    – abo
    Dec 15, 2012 at 22:18
$\begingroup$

Some specialist would know for sure (I've been wanting to ask), but I think current physical theory (whether it really describes Nature or not) allows generation of unlimited amounts of Kolmogorov-random data through quantum processes, which can't be simulated with a Turing machine. As Leonid Levin put it ( http://www.cs.bu.edu/fac/lnd/expo/gdl.htm ):

As is well known, the absence of algorithmic solutions is no obstacle when the requirements do not make a solution unique. A notable example is generating strings of linear Kolmogorov complexity, e.g., those that cannot be compressed to half their length. Algorithms fail, but a set of dice does a perfect job!

Levin has a paper proposing a slight refinement of the Church-Turing thesis to take this into account: http://arxiv.org/abs/cs.CC/0203029

Nachum Dershowitz and Yuri Gurevich have an interesting article where they try to logically deduce CT from some supposedly more basic ideas about abstract state machines: http://research.microsoft.com/en-us/um/people/gurevich/Opera/188.pdf

I've been wondering for a while (based on Levin's example) about a thought experiment. Fix a universal Turing machine U. Flip a coin 2 million times and call the resulting random bit string S. Let P be the proposition that there is no program for U less than 1 million bits long that writes S onto the tape (i.e., P says that S's Kolmogorov complexity is more than 1 million bits). There is of course some very small probability that P is false (you might have flipped 2 million heads completely by chance), but P is almost certainly true. But, by Chaitin's version of Gödel's incompleteness theorem, P (if true) is independent of any reasonable axiom system for mathematics, including any iterated consistency hypotheses, reflection schemes, etc. (And you could always use 2 billion flips instead of 2 million.) So you have a simple experimental apparatus that can create unlimited amounts of unprovable but (almost certainly) true mathematical statements. This cannot be done with classical algorithms. So where does our creaky old physical universe get such knowledge?
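
The "almost certainly" here is just a counting bound: there are fewer than $2^k$ programs shorter than $k$ bits, so at most that many strings (of any length) can be compressed below $k$ bits. A minimal sketch of the arithmetic (the function name is mine, for illustration only):

```python
def incompressibility_bound(n_bits: int, k_bits: int) -> float:
    """Upper bound on the probability that a uniformly random n-bit
    string has a description shorter than k bits: there are at most
    2**k - 1 such programs, versus 2**n equally likely strings."""
    return (2 ** k_bits - 1) / 2 ** n_bits

# Small-scale illustration: a random 20-bit string is compressible
# below 10 bits with probability under 2**-10, i.e. under 0.1%.
print(incompressibility_bound(20, 10))

# In the thought experiment (n = 2,000,000 flips, k = 1,000,000 bits),
# the same bound makes the failure probability of P below 2**-1,000,000.
```

The bound does not depend on the choice of universal machine U; only the constant hidden in "1 million" would shift.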

$\endgroup$
  • $\begingroup$ I don't understand. Assuming P is true, isn't it sufficient to check all less-than-one-million-bit programs one by one, in order to get a proof of P? $\endgroup$ Feb 9, 2011 at 10:23
  • $\begingroup$ In order to check the less-than-one-million-bit programs, you must wait for each to halt (or to make its first error in trying to generate S). In general this cannot be completed in finite time. $\endgroup$
    – Dan
    Feb 9, 2011 at 11:19
  • $\begingroup$ "So where does our creaky old physical universe get such knowledge?" It doesn't. If our universe consisted only of S, then it would. But a very simple program can arrange to print S while also printing vastly more non-S bits: take any program that prints out a normal number in binary; eventually the digits of S will appear. This article might help you understand. It's a bit long, though. $\endgroup$
    – ike
    Apr 21, 2015 at 19:21
$\begingroup$

There's a very nontechnical paper by Geroch and Hartle, "Computability and Physical Theories," which discusses the question of whether there are dimensionless physical quantities that are measurable (i.e., for a given $\epsilon > 0$, with sufficient raw materials, we can perform an experiment to measure the quantity to within $\epsilon$) but not computable. They argue that while currently accepted physical theories produce only computable quantities, there are quantities that we can measure, like the fine structure constant, which are not specified by current theories, and that these could potentially be measurable but not computable. They also give an example of a formulation of quantum gravity that could potentially give rise to noncomputable physical quantities.

As an aside, ever since I read this paper, I've never been able to understand the third-to-last paragraph. The only sensible way I can interpret it is that given a noncomputable physical quantity, we could develop a theory that predicts its value to within some $\epsilon$. But there will always be some $\epsilon' < \epsilon$ such that if we want to refine our prediction to be within $\epsilon'$, we have to do some experiments to obtain "noncomputable" data. In other words, in a world with noncomputable physical quantities, we can refer to experiment as an oracle in our computations. But this seems to beg the question, since by definition, our quantity is computable by an "experiment" computer. Anyway, I'm far from an expert in any of these matters, so I could just be completely missing their point, but I'm putting this out there in case anybody can enlighten me on this.
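
The "experiment as oracle" reading can be made concrete as relative computability: the procedure is an ordinary algorithm except that it may query measured digits of the quantity. A minimal sketch of that idea, where `oracle_bit` is a stand-in of my own for the (hypothetical, possibly non-computable) outcome of an ever-more-precise measurement:

```python
def measure_to_precision(oracle_bit, eps):
    """Approximate a quantity in [0, 1) to within eps by querying its
    binary digits from an 'experiment' oracle. Everything here is an
    ordinary computation except the oracle calls themselves."""
    value, k = 0.0, 0
    while 2.0 ** -k > eps:          # remaining error is at most 2**-k
        k += 1
        value += oracle_bit(k) * 2.0 ** -k
    return value

# Example with a computable stand-in oracle: the binary digits of
# 1/3 = 0.010101...  (digit k is 1 exactly when k is even).
approx = measure_to_precision(lambda k: 1 if k % 2 == 0 else 0, 1e-6)
```

If the oracle's digit sequence is non-computable, the overall map is non-computable relative to unaided Turing machines, which is exactly the circularity worried about above: the "computation" already contains the experiment.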

$\endgroup$
  • $\begingroup$ That is a terrific reference (hence the upvote) ... and it suggests still more questions, that AFAICT are both well-posed and mathematically interesting. For example, the article states (what we will call) the Geroch-Hartle postulate: "The theories of physics are such that all the measurable numbers specified by those theories are in fact computable." This leads naturally to the extended Geroch-Hartle postulate, in which "computable" -> "efficiently computable." Then a great experimental challenge of our century (of which quantum computing is only part) is to falsify it experimentally. $\endgroup$ Feb 10, 2011 at 12:46
$\begingroup$

Robin Gandy once wrote a paper listing axioms about physics that implied the Church-Turing thesis. I'm sure you could find it by searching for all articles written by him.

Also, Frank Tipler once wrote a paper claiming that relativity refutes the Church-Turing thesis. I saw this as an unpublished manuscript and do not know whether it ever appeared. The argument went this way. Send a rocket into space. Rocket A carries a computer carrying out an infinite search (which will take infinite time to complete). The trajectory leads into a black hole. The point is that it takes infinite time in the rocket's frame to fall into the black hole. When the search succeeds, the rocket radios back a report of success. But in an external observer's frame, the entire fall into the black hole takes finite time. So if you don't get the report, the (infinite) search failed. Of course, in this philosophical analysis we have to neglect the difficulty of building a computer that lasts forever. I believe that for some reason a second rocket was used to relay the result. I don't know whether the physics is correct, but the logic of the argument is OK.

$\endgroup$
  • $\begingroup$ This reminds me of Greg Egan's "Planck Dive", gregegan.net/PLANCK/Complete/Planck.html $\endgroup$ Feb 16, 2011 at 16:30
  • $\begingroup$ Um, but that's the other way around. In the frame of the external observer, the fall into a black hole takes infinite time (gravitational red shift); whereas from the point of view of the rocket, it passes the horizon (and possibly dies a stretchy death) in finite time. $\endgroup$ Jun 27, 2011 at 17:16
$\begingroup$

On the issue of "generation of unlimited amounts of Kolmogorov-random data through quantum processes," remember that a dice throw, or data generated via any classical chaotic process, has essentially the same Kolmogorov complexity as its initial conditions, since the deterministic laws of classical dynamics have small K-complexity. Quantum mechanics appears to be essential if you want to generate K-random sequences from non-random initial data, as I discussed, e.g., in Complexity, Vol. 6, No. 1, p. 27 (2001) (also found on the arXiv).
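
The point can be illustrated with any deterministic chaotic map: the output looks statistically random, yet it is produced by a short program from the seed, so its Kolmogorov complexity exceeds that of the seed only by a constant. A toy sketch using the logistic map (my choice of example, not the construction in the cited paper):

```python
def logistic_bits(x0: float, n: int) -> list:
    """n symbols from the logistic map at r = 4: chaotic, but fully
    determined by the seed x0 plus this short program, so
    K(output) <= K(x0) + O(1) -- no new randomness is created."""
    bits, x = [], x0
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)          # logistic map, fully chaotic regime
        bits.append(1 if x > 0.5 else 0) # coarse-grain the orbit to a bit
    return bits

# Passes simple statistical tests, yet is compressible to the seed.
sample = logistic_bits(0.123456, 32)
```

A quantum measurement record, by contrast, is claimed to be K-random even when the initial state has a short description.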

$\endgroup$
$\begingroup$

To follow up on Michael Beeson’s answer (I’m not allowed to post comments yet), Robin Gandy’s paper was “Church's Thesis and Principles for Mechanisms” in The Kleene Symposium (http://dx.doi.org/10.1016/S0049-237X(08)71257-6). He established that “Turing's analysis of computation by a human being does not apply directly to mechanical devices,” provided a “set-theoretic form of description for discrete deterministic machines,” and “proved that if a device satisfies the principles then its successive states form a computable sequence.”

The most surprising point is that the result depends on non-Newtonian physics, in particular the impossibility of instantaneous action at a distance. The term “Gandy machine” is sometimes used for this variant of Turing machine, e.g. by Blass and Sieg; the notion even has its own entry in the Esoteric Programming Languages wiki.

(Disclaimer: Robin was my supervisor, and Turing was his, so I’m hardly unbiased.)

$\endgroup$
$\begingroup$

I would answer that the question is uninteresting, because even if single physical devices were restricted to calculating computable functions, interacting or communicating physical devices are not.

This can be seen by considering two devices which are able to communicate. Both devices are Turing-like and have tapes. One device is programmed to move right until it reaches the end of its input; it then communicates with the second device and outputs what it has received. The second device alternates between two states: ready to communicate 0 and ready to communicate 1. The behaviour of the two devices together is time-dependent. If, for instance, one assumes that time is like the real numbers, and that the ratio of operating speeds is precise and can be any irrational number, then the two devices together can compute a non-computable function (the input being the input of the first device and the output being the output of the first device). On the other hand, it is probably more realistic to say that the behaviour of these two devices together is simply not functional (and thus, a fortiori, not computable-functional), since the operating ratio is unlikely to be precise.
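
The input-output map of this composite system can be written down directly. A minimal sketch of the idealization above (assuming unit-time steps for the first device and perfectly precise speeds, as the argument requires): if the first device takes n steps to scan its input while the second toggles at rate r per unit time, the reading obtained is the parity of the toggle count.

```python
import math

def composite_output(r: float, n: int) -> int:
    """Output of the two-device system on an input of length n: by the
    time device A finishes scanning (n unit steps), device B has
    toggled floor(n * r) times, so A reads the parity of that count.
    If r were a non-computable real, n -> composite_output(r, n)
    would be a non-computable function of n."""
    return math.floor(n * r) % 2

# With a (here computable) irrational ratio, for illustration:
outputs = [composite_output(math.sqrt(2), n) for n in range(1, 9)]
```

The construction computes nothing beyond Turing machines when r is computable; the extra power comes entirely from the assumed physical availability of an arbitrary, infinitely precise real ratio.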

$\endgroup$
$\begingroup$

In addition to the above references, here is another variation on the theme of the physical Church-Turing thesis ("The Physical Church-Turing Thesis: Modest or Bold?"), in which the author distinguishes feasible and infeasible flavors of the physical Church-Turing thesis (see reference). Two types of Church-Turing theses are introduced, the modest and the bold, in contrast to the mathematical CT, which the author defines as:

"any function that is computable by following an effective procedure is Turing computable".

Bold CT is defined as: "any physical process is Turing computable".

Modest CT is defined as: "any function that is physically computable is Turing computable".

The author (Gualtiero Piccinini) contends further that bold CT is implausible, and that modest CT is a better way to think about the physical CT because of its "usability". All this points to the difficulty of assessing these physical theses, or even of deciding which of them better captures the mathematical CT. So, on the grounds of "usability", the modest physical CT is the better model to think with. In addition there is a "super-bold" physical CT: loosely, "total" physical computability. This brings one to the accuracy of such computation, as stated in the

Church-Turing-Deutsch-Wolfram (CTDW) thesis: every finite physical system can be simulated to any specified degree of accuracy by a universal Turing machine.

See "The Church-Turing Thesis: Logical Limit or Breachable Barrier?". The definition of the bold physical CT parallels the CTDW thesis.

$\endgroup$
