I did not come here to discredit Wegner. I loved the book he wrote in the early 1970s with the sections on SECD and the lambda calculus. Perhaps all Wegner is saying is that it is good to look at concurrent programming; if so, many would heartily agree with him. Presumably this is what Dijkstra and Wirth were saying in 1970, writing on cooperating sequential processes, and also what Ken Thompson was saying in 1970, in inventing Unix.

At present it seems a lot more important for me to read the Davis papers than to puzzle over papers by Wegner.

Perhaps pedagogically there is something wrong with much of the presentation of Turing machine material. If Wegner has difficulties with it, how much more so will the general reader. So Wegner’s rants may ultimately have a beneficial effect in encouraging people to restate results in a better way.

“Timing Toast”, by Piet Hein:

There’s an art of knowing when.

Never try to guess.

Toast until it smokes and then

twenty seconds less.

This sounds really, really doubtful to me. I knew Iverson reasonably well and worked with him at the IBM Philadelphia Scientific Center. He knew an awful lot of mathematics and computing, and I’m absolutely sure he knew about Turing.

Iverson was concerned with notation and expressibility. The claim above has all the hallmarks of a misunderstanding that has been repeated multiple times and changed in the telling.

Certain programs have the property that they manipulate their input data in a “type-safe” way. That is, if the encoding of the input data is changed, then the program will continue to run correctly as long as the data is accessed through the proper channels. This is an extremely important property in practice, and one of the major focuses of high-level language design is ensuring type safety. In fact, I would say one of the most important things that a program expresses, perhaps more important even than its semantics, is the interaction of its data structures.

In a Turing machine, all the data accesses are “flattened out” to pure bit manipulation. In this way, a Turing machine can never be type safe, because it lacks the formalism to specify which data accesses are properly controlled.

This is an example of an important aspect of computation that can be expressed only in high-level languages, not in Turing machines.
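A minimal sketch of the idea above (the class and function names are illustrative, not from the original): the same program logic keeps working when the internal encoding of a data structure changes, so long as every access goes through the proper channel rather than the raw representation.

```python
# Hypothetical sketch of "type-safe" access: two encodings of a point,
# one interface. A Turing-machine-level program would see only the bits
# of one particular encoding; here, callers never touch the encoding.

class PointV1:
    """Coordinates stored as a tuple."""
    def __init__(self, x, y):
        self._data = (x, y)
    def x(self):
        return self._data[0]
    def y(self):
        return self._data[1]

class PointV2:
    """Same interface, but coordinates stored as a dict."""
    def __init__(self, x, y):
        self._data = {"x": x, "y": y}
    def x(self):
        return self._data["x"]
    def y(self):
        return self._data["y"]

def manhattan(p):
    # Type-safe: uses only the accessor methods, never the raw encoding.
    return abs(p.x()) + abs(p.y())

# Changing the encoding does not break the program.
assert manhattan(PointV1(3, -4)) == 7
assert manhattan(PointV2(3, -4)) == 7
```

The point of the sketch is that `manhattan` is correct under both encodings precisely because the language lets us *state* which accesses are the proper channel; a pure bit-level description has no place to record that distinction.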

What do you people think about nondiscrete models of computation as possible counterexamples to the Church-Turing thesis?

http://edelstein.huji.ac.il/staff/shagrir/papers/physical_computation.pdf

(section 3)

I remember other papers but cannot find the references.

I think the Church-Turing thesis for those models is that continuous computation, modulo finite-precision measurement and setup of the computing apparatus, is equivalent to classical computation.

And do you think that continuous models of computation with beyond-Turing capability (before imposing feasibility assumptions) are useful?

Any reference?

“Common-sense must be used here to avoid embarking on an over-axiomatized, and hence misguided, piece of theoretical physics. We […] should not be trapped into axiomatizing theoretical ideas out of existence.”

— Christopher Isham (1984)

(quoted in Matthias Blau’s popular on-line lecture notes *“Symplectic geometry and geometric quantization”*, 1992)

Isham’s remark descends from

“If one finds a difficulty in a calculation which is otherwise quite convincing, one should not push the difficulty away; one should rather try to make it the centre of the whole thing.”

— Werner Heisenberg (1971)

(which a Google search finds quoted in many places)

Here the point is that a great deal of heat commonly is generated — especially in the blogosphere — when postulates are mistaken for and/or claimed to be deductions. And the rhetorical motivation for such claims is evident: the correctness of deductions can be verified by more-or-less irrefutable reasoning … the correctness of *postulates* is susceptible to no such checks!

Textbooks on complexity theory (example: Arora and Barak, *Computational Complexity: A Modern Approach*) and books on quantum theory (example: Nielsen and Chuang, *Quantum Computation and Quantum Information*) generically begin with a statement of postulates. Three questions then are natural.

**Q1** What is the likelihood that the *reasoning* of standard textbooks is globally and seriously flawed? Both common-sense and history indicate that this likelihood is vanishingly small.

**Q2** What is the likelihood that, in coming decades, the *postulates* of 20th century textbooks will be appreciated as mathematically limiting and/or physically unrealistic? Both common-sense and history indicate that this likelihood is sufficiently great as to amount to a near-certainty.

**Q3** What new and/or amended postulates, regarding complexity theory and/or quantum dynamics, will the great textbooks of the 21st century embrace? We are very unlikely to collectively appreciate the answer to this question … until someone writes these textbooks. 🙂

It is a pretty safe bet — if we are guided by Isham and Heisenberg — that the amended complexity/quantum theoretic postulates of the 21st century will originate in physical concepts rather than reasoning from axioms (Isham’s principle) and that these postulates will embrace as central, difficulties that at present are regarded as peripheral (Heisenberg’s principle).

**Conclusion** Should we be skeptical of deductions that follow from the CT thesis? Reasonably, no. Should we be skeptical that the CT thesis provides uniquely optimal foundations for complexity theory? Reasonably, yes.