What’s not to get?

And Scott — congrats on that important first step!! I also achieved it independently, but I think this is one of those steps where the identity of the authors does indeed matter in evaluating the importance. ]]>

]]>Great summary.

]]>The quantum physics of photon detection is a subtle topic that even Feynman got wrong on occasion. The story of Feynman’s mistake is vividly told in the *Physics Today* obituary for Robert Hanbury Brown (volume 55(7), 2002), which tells of Richard Feynman standing up during a talk by Hanbury Brown, proclaiming (wrongly) “It can’t work!”, and walking out of the lecture.

The quantum physics associated to this Feynman story is summarized in a series of six short letters, totaling 12 pages in all, that appeared in *Nature* during 1955–56. These letters describe what is today called the “Hanbury Brown and Twiss Effect”—the first-ever observation of higher-order photon counting correlations.

The thrilling story of the Hanbury Brown and Twiss Effect, as recounted on the pages of *Nature*, in effect has six chapters.

**Chapter 1:** Hanbury Brown and Twiss announce (in effect) “In the laboratory, we observe nontrivial correlations in photons generated by glowing gases.” (*Correlation between photons in two coherent beams of light*, Nature 177(4497), 1956).

**Chapter 2:** Brannen and Ferguson announce (in effect) “The claims of Hanbury Brown and Twiss, if true, would require major revision of some fundamental concepts of quantum mechanics; moreover, when we did a more careful experiment, we saw nothing.” (*The question of correlation between photons in coherent light rays*, Nature 178(4531), 1956).

**Chapter 3:** Hanbury Brown and Twiss announce (in effect) “We observe nontrivial correlations even in photons from the star Sirius, and our theory allows us to determine its diameter.” (*Test of new type of stellar interferometer on Sirius*, Nature 178(4541), 1956).

**Chapter 4:** Hanbury Brown and Twiss reply (in effect) “The experiment of Brannen and Ferguson was grossly lacking in sensitivity; had they analyzed their experiment properly, they would have *expected* to see no effect.” (*The question of correlation between photons in coherent light rays*, Nature 178(4548), 1956).

**Chapter 5:** In an accompanying letter, Ed Purcell announces (in effect) “Hanbury Brown and Twiss are right; moreover, their theoretical predictions and their experimental data are in accord with quantum mechanics as properly understood.” (Nature 178(4548), 1956).

**Chapter 6:** Hanbury Brown and Twiss announce (in effect) “When the experimental methods of Brannen and Ferguson are implemented with higher sensitivity, and analyzed with due respect for quantum theory as explained by Purcell, the results wholly confirm our earlier findings.” (*Correlation between photons, in coherent beams of light, detected by a coincidence counting technique*, Nature 180(4581), 1956).
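As a toy numerical sketch of the effect at issue (my illustration, not anything from the *Nature* letters): for chaotic thermal light the normalized zero-delay intensity correlation g⁽²⁾(0) = ⟨I²⟩/⟨I⟩² approaches 2 (photon bunching), whereas for an ideal coherent beam it is exactly 1. A few lines of simulation, modeling the thermal field’s two quadratures as Gaussian so that the instantaneous intensity is exponentially distributed:

```python
import random

random.seed(0)
N = 200_000

def g2(intensities):
    """Normalized zero-delay intensity correlation <I^2> / <I>^2."""
    mean_I = sum(intensities) / len(intensities)
    mean_I2 = sum(I * I for I in intensities) / len(intensities)
    return mean_I2 / (mean_I * mean_I)

# Thermal (chaotic) light: field quadratures are Gaussian, so the
# instantaneous intensity I = x^2 + y^2 is exponentially distributed.
thermal = [random.gauss(0, 1) ** 2 + random.gauss(0, 1) ** 2 for _ in range(N)]

# Ideal coherent light: constant intensity, no bunching.
coherent = [1.0] * N

print(g2(thermal))   # close to 2 (photon bunching)
print(g2(coherent))  # exactly 1
```

Roughly speaking, the excess correlation above the coherent-beam value of 1 is what Hanbury Brown and Twiss measured, and what Purcell’s letter explained in quantum-mechanical terms.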

When we read the 12-page story of Hanbury Brown and Twiss side-by-side with the discussion of photon counting in *The Feynman Lectures on Physics*, we are struck by three aspects of the Hanbury Brown and Twiss experiments that are *not* emphasized in the *Feynman Lectures*.

First, the Hanbury Brown and Twiss articles exhibit a charming physicality that is largely absent from the *Feynman Lectures*. For example, Hanbury Brown and Twiss describe the use of an “integrating motor” to measure the total current associated to photon detection during an experimental run. Modern physics students will wonder “What the heck is an integrating motor?”, yet in the physics literature of the 1950s this concept was viewed as so intuitively obvious as to require no explanation: the total number of revolutions of an electric motor (as counted by purely mechanical means!) *obviously* can be made proportional to the integral of the current flowing through it … that’s how electric meters work, right? As Ed Purcell’s letter to *Nature* rightly observes, the observation of subtle quantum correlations with purely mechanical counters “adds lustre to the notable achievement of Hanbury Brown and Twiss.”

Second, the experimental protocol of Hanbury Brown and Twiss includes elements that are highly sophisticated from the viewpoint of modern quantum information theory. In particular, while aligning their apparatus, they reverse the flow of photons by placing their eyes at the position of the source, and while physically looking at the two photodetectors through a half-silvered mirror, they adjust the mirrors so that the images of the photodetectors are coherently superimposed. We nowadays appreciate that from the viewpoint of QED, this time-reversed coherence is necessary to ensure that quantum fluctuations in the photon detector currents are deterministically associated to quantum fluctuations in the photon source currents.

Third, it follows that in the observations of Sirius recorded by Hanbury Brown and Twiss, their experimental record of correlated photocurrents here on earth is deterministically associated to currents that span the surface of the remote star Sirius — eight light-years away! This counterintuitive implication was why many theoretical physicists (including Feynman) at first considered the results of Hanbury Brown and Twiss to be (literally) incredible.

Nowadays we appreciate that this seeming paradox is naturally reconciled via the quantum informatic mechanism that Nielsen and Chuang call the “Principles of Deferred and Implicit Measurement” — principles that are formally associated to work by Kraus and Lindblad in the 1970s; principles that were not readily appreciated by Feynman and his colleagues in the 1950s.

Moreover, the experiments of Hanbury Brown and Twiss were vastly wasteful of photonic resources. The star Sirius emits about 10^{46} photons/second, of which Hanbury Brown and Twiss detected about 10^{9} two-photon entangled states/second … the relative production efficiency thus was a dismal 10^{-37}. Even today, more than 50 years later, the production of six-photon entangled states is still dismally inefficient: in recent experiments 10^{18} photons/second of pump power yield about one six-photon state per thousand seconds, for a relative production efficiency of order 10^{-21}.
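The efficiency figures quoted above are simple ratios; as a quick arithmetic check, using exactly the round numbers in the text:

```python
# Sirius-era HBT experiment: detected correlated pairs per second
# divided by photons emitted per second, as quoted above.
sirius_emitted = 1e46   # photons/second emitted by Sirius
hbt_detected = 1e9      # two-photon correlated detections/second
print(hbt_detected / sirius_emitted)   # about 1e-37

# Modern six-photon experiments: one six-photon state per 10^3 seconds
# from a pump delivering 10^18 photons/second.
pump_rate = 1e18        # photons/second of pump power
six_photon_rate = 1e-3  # six-photon states/second
print(six_photon_rate / pump_rate)     # about 1e-21
```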

We see that one of the fundamental challenges (among many!) that Scott and Alex’s experiment poses for 21st century physicists is to devise methods for generating entangled photon states that are *exponentially* more efficient than existing methods. To achieve this, modern physicists will have to do exactly what Hanbury Brown and Twiss did … “look” at the photon detectors from the time-reversed viewpoint of the photon source … and then (by careful design) arrange for the photon source currents to have near-unity correlation with the photon detector currents.

This is an immense practical challenge in cavity quantum electrodynamics, and one from which we are certain to learn a great deal. At present we are as far from having scalable quantum-coherent n-photon sources as we are from having scalable quantum-coherent n-gate quantum computers.

These considerations are why, from an engineering point of view, it is prudent to regard n-photon linear-optics experiments not as obviously easier than building n-gate quantum circuits, but as comparably challenging technically. And this is why it will not surprise me if the Aaronson/Arkhipov distribution-sampling algorithms prove in the long run to be as seminal mathematically and theoretically—and as challenging experimentally—as Peter Shor’s number-factoring algorithms.

]]>Sounds like quite an achievement. And I say that as someone who has not successfully proven both the Riemann Hypothesis and P(!)=NP.

]]>To appreciate why, we reason physically. Photons are created by (quantum) currents in the photon emitter and absorbed by (quantum) currents in the photon detector. Moreover, QED tells us that these emitter-detector currents are deterministically coupled … if one is specified, the other is completely determined.

20th-century technologies are severely limited in their ability to control emitter and/or detector currents. In consequence, a typical experimental strategy has been to specify (what amounts to) a high-power photon source current, then post-select the resulting detector currents. In essence, we wait until (at random intervals) we observe detector currents that have the desired n-photon properties.

Coherent-source methods have two practical limitations: (1) they are hugely wasteful of optical power, and (2) the correlated photons emerge at random intervals. It’s not obvious that these methods can be extended much beyond the present six-photon experiments.
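To see why scaling beyond six photons is daunting, here is a toy model (all parameters hypothetical, chosen only for illustration): if each pump pulse yields a photon pair with probability p, then n simultaneous pairs occur with probability roughly p^n, so the post-selected n-fold coincidence rate — and hence the wait between usable events — scales exponentially with n:

```python
# Toy model of post-selected multi-photon generation. The numbers below
# are hypothetical, not the parameters of any actual experiment, and the
# model ignores loss and detector inefficiency (which make matters worse).
pulse_rate = 8e7   # pulses/second (hypothetical 80 MHz pump)
p_pair = 1e-2      # hypothetical per-pulse pair-generation probability

for n in range(1, 5):
    rate = pulse_rate * p_pair ** n   # n-fold coincidence rate (Hz)
    wait = 1.0 / rate                 # mean wait between events (s)
    print(n, rate, wait)
```

This exponential fall-off in yield is the quantitative content of the remark that post-selected methods may not extend much beyond the present six-photon experiments.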

A more elegant engineering approach is to carefully control the (quantum) source current fluctuations, or the (quantum) detector current fluctuations (or some clever combination of the two). Here the practical challenge is that it’s not obviously easier to control the quantum currents in (say) a high-coherence deterministically clocked 10-photon source, than it is to build a high-coherence deterministically clocked 10-gate quantum circuit.

Students should not imagine that the engineering calculations associated to these experiments are easy … Feynman himself famously asserted—wrongly—of the Hanbury Brown and Twiss experiment (an early correlated-photon experiment whose high-power photon source was the star Sirius) that “It can’t work!”

Moreover, the history of large-scale QED calculations abundantly illustrates that the probability of significant error in published theoretical calculations is nonzero … particularly when the calculations are done in advance of experiment … I’ve had personal experience of the immense work required to track down even mundane errors in field-theoretic calculations.

The bottom line IMHO is that we should not regard n-photon linear optics experiments as being necessarily easier than building n-gate quantum circuits … and neither should we overlook the striking originality and power of these experiments, which are (IMHO) worthy of the highest admiration.

]]>Scaling up to (say) 10 or 20 photons is a different matter—I’m not sure when that will become possible. As Barry Sanders says in the article, it seems to depend mostly on when it’ll become possible to create a reliable stream of single photons spaced at extremely regular intervals.

]]>