1. The CRA’s Computing Community Consortium, chaired by national treasure Ed Lazowska of the University of Washington, recently put up a website with fifteen brief essays about “Computing Research Initiatives for the 21st Century.”  These essays will apparently be reviewed by the science policy staff at the Obama transition office.  Dave Bacon and I wrote the essay on quantum computing—or rather, Dave wrote it with his inimitable enthusiasm, and then I tried in vain to moderate it slightly.  (When Dave told me that President-elect Obama needed my help with quantum computing policy, what was I going to say?  “Sorry, I’m doing my laundry this weekend”?)

2. Lee Gomes of Forbes magazine wrote a fun article about the Worldview Manager project that I blogged about a while ago.  (For some reason, Lee wasn’t deterred by my pointing out to him that the project hasn’t even started yet.)

1. John Sidles Says:

Scott and Dave assert: If one is to gain traction in understanding many physical systems of great import (such as complex biological molecules or complex materials), a quantum computer represents the only path known to be able to efficiently simulate these systems.

Scott, that assertion was definitely true twenty years ago, and it was arguably true ten years ago, but doesn’t the recent quantum simulation literature indicate exactly the opposite?

For example, the Cambridge Crystallographic Data Centre runs a regular “Crystal Structure Prediction Blind Test”, which was won this year with a first-time-ever perfect score (see the appended BibTeX references).

It’s true that these and other groundbreaking advances in quantum simulation capability are arriving faster than many QIT and complexity theorists have foreseen … which makes it even more important to understand the fundamental mathematics of how this progress is being achieved.

As a bonus, better understanding of the mathematical foundations of quantum simulation can help speed the pace, and retire the risks, of pursuing several of the Grand Challenges that Ed Lazowska’s essay mentions, among them: (1) Engineer Better Medicines, (2) Make Solar Energy Economical, and, most broadly, (3) Engineer the Tools of Scientific Discovery.

With respect, isn’t it true that Ed Lazowska’s three opportunities are best embraced in the broadest sense of QIT/complexity theory, which definitely includes classical simulation of quantum systems—viewed as a powerful real-world capability, not as a theoretical impossibility?

That’s why it’s past time to jettison the folk theorem that classical simulation of quantum systems is infeasible: this theorem is true only under assumptions so restrictive as to be inapplicable to the real-world quantum systems that Ed Lazowska has identified as being of most urgent practical interest.
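To make concrete what “feasible under the right assumptions” looks like, here is a minimal sketch of brute-force classical simulation of a small quantum spin system (Python with numpy; the Heisenberg chain, the size n = 6, and all parameters are my own illustrative choices, not drawn from the discussion above). The state vector has 2**n entries, so small systems are trivially tractable, and the same exponential shows exactly where generic large systems become hard:

```python
import numpy as np

# Single-site operators: identity and the Pauli matrices.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron_chain(ops):
    """Kronecker product of a list of single-site operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def heisenberg(n):
    """Nearest-neighbour Heisenberg Hamiltonian on an open chain of n spins."""
    dim = 2 ** n
    H = np.zeros((dim, dim), dtype=complex)
    for i in range(n - 1):
        for P in (X, Y, Z):
            ops = [I] * n
            ops[i] = P
            ops[i + 1] = P
            H += kron_chain(ops)
    return H

n = 6                      # 2**6 = 64-dimensional Hilbert space: easy
H = heisenberg(n)
w, V = np.linalg.eigh(H)   # exact diagonalization of the Hermitian H

# Evolve |00...0> for time t = 1.0 and check that the norm is conserved.
psi0 = np.zeros(2 ** n, dtype=complex)
psi0[0] = 1.0
psi_t = V @ (np.exp(-1j * 1.0 * w) * (V.conj().T @ psi0))
print(abs(np.linalg.norm(psi_t) - 1.0) < 1e-10)   # True
```

The same code at n = 30 would need a state vector of roughly 16 GB, which is the restrictive assumption in miniature: classical feasibility hinges on system size and structure, not on quantum mechanics per se.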
——
@article{Title = {Crystal structure prediction from first principles}, Author = {Woodley, Scott M. and Catlow, Richard}, Journal = {Nature Materials}, Month = {12}, Number = {12}, Pages = {937–946}, Volume = {7}, Year = {2008}}

@article{Title = {A major advance in crystal structure prediction}, Author = {M. A. Neumann and F. J. J. Leusen and J. Kendrick}, Journal = {Angewandte Chemie International Edition}, Number = {13}, Pages = {2427–2430}, Volume = {47}, Year = {2008}}

2. Scott Says:

that assertion was definitely true twenty years ago, and it was arguably true ten years ago, but doesn’t the recent quantum simulation literature indicate exactly the opposite?

John, since you’re more familiar with this literature than I am, surely you could (if you wanted) give examples of quantum systems that are currently hard to simulate classically? What about high-Tc superconductors, or quark-gluon plasmas?

3. Matt Leifer Says:

Nice essay, Scott and Dave. I think the “over $50 million” estimate for Waterloo must be well below the mark, though. The initial investment in Perimeter alone was 50 million CAD, and if you add on every funding announcement for IQC and PI since then, I’m sure it must have more than doubled. Even after conversion to USD, we must be pushing close to the 100 million mark by now. You could also add funding for Montreal, Calgary, UBC, etc. to argue that total funding in Canada must be much more than that. If your aim is to scare the US government into funding quantum research by showing how ridiculously large the funding is in other countries, then the country-wide figure would be the more appropriate one to quote. You might also include the significant funding in Australia for extra scare factor.

Given my interests, I would also have mentioned something about the foundations of quantum theory, which arguably provided the basic concepts that are now used in quantum information science. I don’t want to get into an argument about whether it is necessary to appreciate foundations in order to make progress in quantum information, but it is usually a mistake to fund the applied aspects of a subject without also investing in the basic science, which could easily throw up another surprise at least as significant as quantum computing. At the very least, funding for experiments in this field can be argued for, since the technology required is identical to that used in quantum computing experiments. Don’t forget that many young scientists are attracted to the field because they want to understand quantum weirdness, and such people would be drawn to a funding regime that allows them to develop those interests.

4. John Sidles Says:

Scott says: John … surely you could (if you wanted) give examples of quantum systems that are currently hard to simulate classically? What about high-Tc superconductors, or quark-gluon plasmas?
Gosh, Scott … that’s a mighty tough question, especially since (in the spirit of the holiday season) I’ll strive for an Obamic answer, in the spirit of “Yes, we can!”

There are plenty of folks who regard quark-gluon plasmas as definitely simulatable with classical resources, and condensed-matter systems like high-Tc superconductors as likely to be simulatable pretty soon. See, for example, Frank Wilczek’s recent review in Nature (BibTeX appended). My experience has been that in private conversations, plenty of theorists are even more optimistic. I’ve been told by a (pretty distinguished) condensed-matter theory colleague that “high-Tc superconductors will be (numerically) solved within the next eighteen months.” On the other hand, that was four years ago … so progress has been slower than this colleague foresaw.

Another consideration comes straight from David Deutsch’s theory of reality, namely, that what we are really looking for is not simulation but understanding. And this is true whether the simulation method is classical or quantum: either way, we value tools that illuminate even more than we value tools that simulate.

Balancing all these together, maybe we should ask for the following present from the QIT/complexity theory community: stronger theorems about the boundary between what is classically simulatable and what is not. Here I have in mind (dimly) something like a generalization of the Niyogi-Smale-Weinberger (NSW) theorems to encompass the tensor network manifolds that the above simulation calculations use. These manifolds are loaded with singularities that (AFAICT) formally obstruct the NSW theorems from being applied directly, but in practice these singularities do not create computational difficulties, and so (perhaps) a more general NSW-type theorem could be proved. The physical idea being that perhaps any system whose state-space manifold can be learned (with polynomial resources) can be simulated efficiently.
This would lead to a pleasingly Obamatic world, in which elegant theorems (possibly of NSW type?) guarantee that noisy and/or controllable quantum systems are generically easy to simulate, while low-noise and/or non-controllable quantum systems are generically hard to simulate. This would remedy a striking lack in the condensed-matter literature, namely, that the word “theorem” appears too seldom. The word “ergodic” appears pretty often, however, and this can be viewed as a tribute to the powerful ergodic theorems of von Neumann and Birkhoff.

In summary, it seems to me that the QIT/complexity theory community has the tools at hand to prove similarly powerful, beautiful, and practical theorems today. In which case, a definite answer to your question would be: “At sufficiently low temperature, generic spin systems are hard to simulate, while at sufficiently high temperature, the same spin systems are easy to simulate.” A more Obamic answer is “Yes we can” do mathematics, physics, and engineering as beautiful, deep, and useful as that of von Neumann and Feynman’s generation.

——

@article{Author = {Wilczek, Frank}, Journal = {Nature}, Month = {11}, Number = {7221}, Pages = {449–450}, Title = {Particle physics: Mass by numbers}, Volume = {456}, Year = {2008}}

5. rrtucci Says:

Odd that you make such a big deal about preferring NSF to DARPA. DARPA has lots of money, and if you don’t take it from them, they’ll invest it in much worse things than quantum computing.

6. John Sidles Says:

rrtucci says: DARPA has lots of money, and if you don’t take it from them, they’ll invest it in much worse things than quantum computing.

With respect, rrtucci, that assertion doesn’t have much empirical evidence to support it. The best single review I know of (D)ARPA fundamental research is Richard J. Barber’s The Advanced Research Projects Agency, 1958–1974.
Only five copies were printed, and so you have to request a copy under the Freedom of Information Act (cost ~$100 or so; see BibTeX below). The expense is well worth it, since Barber interviewed every ARPA/DARPA director at length.

According to my own “Obamic” reading of Barber’s review, the value of (D)ARPA research during its first two decades is simply explained as follows. (D)ARPA supported fundamental research in three seemingly unrelated areas: materials science (the Army’s MatSci), sensors (AGILE), and networks (ARPANET). These were managed by (D)ARPA as unrelated projects, but subsequently they merged, more-or-less spontaneously, to create the modern disciplines of mechatronics and silicon-based computation, which in turn provided solid foundations for US (and global) economic growth, job creation, and security during the period 1970-2000.

Today, we again have several lines of (D)ARPA-supported fundamental research, which again are managed as unrelated projects: nanotechnology, information theory (both classical and quantum), simulation science (both classical and quantum), and system and synthetic biology.

If we are lucky and diligent, these fields will similarly merge to provide new foundations for sustained US (and global) economic growth, job creation, and security during the period 2010–2050.

Obviously, something like this needs to happen, since one of our planet’s most pressing needs is the creation of (on the order of) 10^8–10^9 good, family-supporting jobs. This is equivalent to 10^2–10^3 new companies on the scale of IBM, Toyota, Google, or Microsoft.

If we aren’t thinking on this scale—given our national and planetary challenges, resources, and emerging opportunities—then IMHO we’re thinking too small.
——-
@book{Author = {R. J. Barber}, Publisher = {Richard J. Barber Associates}, Title = {The Advanced Research Projects Agency, 1958–1974}, Year = {1975}, Note = {Order from NTIS (phone: 1-800-553-NTIS) as accession no. AD-A154 363}}

7. rrtucci Says:

John, I don’t disagree with anything you said. None of it contradicts my assertion.

You seem to be saying that DARPA is, overall, beneficial to science, technology, and job creation.

On the other hand, this is what Bacon/Aaronson say (note that they make this the centerpiece of their proposal, devoting about 2 of its 11 paragraphs to it):

“A Proposed Program
Current funding for quantum computing is split over numerous agencies, the larger portion coming from defense and intelligence agencies (iARPA, DARPA, etc.) and a smaller portion coming from the National Science Foundation. If a national investment in quantum computing is undertaken, we believe that NSF is the natural agency for such an investment to be housed. First, this is because the majority of currently active research groups are centered around academic computer science, physics, and mathematics departments….”

Sounds like they are disdaining DARPA and non-academics because, well, by coincidence, Bacon/Aaronson are academics. Nothing about job creation, either.

8. Scott Says:

There seems to be a serious misunderstanding. Dave and I were asked to address the question of where funding “ought” to come from, and after some discussion, agreed that from an intellectual standpoint, the most “natural” agency at this time is NSF. But if NSF can’t adequately support the field (and/or rules prohibit it from funding LANL for example), then certainly I’d rather see DARPA, NSA, ARO, etc. pick up the slack than see scientifically compelling projects go unfunded. (In the past, QC funding has always come from a bewildering hodgepodge of programs/agencies, and realistically, I imagine that will still be true in the future.) In retrospect, we should’ve said explicitly that we’d be delighted to see DARPA, the national labs, etc. continue to play a role in this field.

9. John Sidles Says:

Scott and Dave propose the goal: “A goal should be set for these centers of building a quantum computer which outperforms today’s classical computers at quantum simulation tasks within the next decade”

Entrepreneurs know “you always end up running an enterprise that is different from the one you first envisioned.”

So it’s risky to set a goal as specific as “building a quantum computer that outperforms today’s classical computers at quantum simulation tasks,” because this goal competes head-to-head with a discipline (classical simulation of quantum systems) that has been radically expanding its capabilities.

A classic technological arbitrage strategy would be to set this subtly modified goal: “A goal should be set for these centers of transformationally accelerating quantum simulation capabilities within the next decade.”

Under this larger aegis, the centers could develop both classical and quantum simulation methods in tandem. The classical methods are nearly certain to yield transformational gains, since every previous decade since 1920 has done so. The quantum computer approach poses far higher risks, while promising even higher payoffs.

These two paths could then cross-pollinate each other—which from a mathematical point of view, comes about very naturally in any case. That way, at the end of the decade, the centers could be certain of having catalyzed massive progress in quantum simulation capability.

The point being, a decade passes swiftly; so it’s prudent to embrace high-risk goals that are nearly certain to be achieved!

Also, just to mention, ab initio quantum simulation is a major gap in Ed’s list of fifteen essays; yet my IBM Blue Gene colleagues tell me that on many machines, this class of application is the single biggest consumer of computing cycles.

The radically new mathematical and conceptual tools that the QC+QIT community has developed are well-suited to catalyze further advances in quantum simulation science. Indeed, our QSE Group’s view is that the QC+QIT community is already delivering major breakthroughs in this area, and we’ll talk about this point of view at SQuinT. See you there!

10. nuncio Says:

World of Warcraft is the biggest and the most successful MMORPG. The game continues the story of Warcraft III: The Frozen Throne. After the big war in Azeroth, the inhabitants will now fight to free Azeroth of the Dead Legion.

11. rrtucci Says:

Nuncio, as a Catholic myself, I can see why a papal nuncio would be so interested in the world of warcraft. The Catholic Church has contributed immensely to the craft. You must feel right at home.

12. trond Says:

Maybe outlawing software written in programming languages that don’t enforce bounds checks when storing to memory would be better for national security? I’d say internet security for online banking is far more important than military wargames.

13. Chris W. Says:

Somewhat off-topic: A great physicist puts in a plug for TCS and the study of computational complexity in an essay on Hermann Weyl’s legacy (via N.E.W.).

14. John Sidles Says:

What a fine essay, Chris W! No author is given, but apparently it is by Frank Wilczek?

15. John Sidles Says:

Just to confirm, the essay linked to by Chris W is by Frank Wilczek (it’s hosted on Wilczek’s web site, doh!)

I’ve never seen the information-theory blogosphere so dead and/or downbeat as it is right now (except for the active and always-optimistic Nuit Blanche, that is). This gloom surely isn’t appropriate to the Holiday season, and so I’ll post a brief appreciation of Wilczek’s essay, emphasizing two of its aspects that (to me) seem directly applicable to opportunities for younger researchers in quantum information science.

p. 4: “Symmetry has proven a fruitful guide to the fundamentals of physical reality, both when it is observed and when it is not!”

Quantum information science has (AFAICT) only one fundamental symmetry, but it is an extraordinarily powerful one. That symmetry is Nielsen and Chuang’s Theorem 8.2: Unitary freedom in the operator-sum representation. This is a regrettably awkward name for what is arguably the most fundamental mathematical theorem in the book, so I am going to abbreviate it as UNFOSURE—suggestions for a better name would be very welcome!

We first notice that UNFOSURE is a gauge invariance, in the sense that when we unravel a quantum trajectory, we are free to choose our UNFOSURE representation independently (or even adaptively) at every point in the state-space. This point of view has three benefits: (1) it immediately links UNFOSURE to Terry Tao’s wonderful on-line discussion of gauge theory, (2) it links UNFOSURE to field theory and general relativity, which as Wilczek reminds us, historically made little progress until gauge invariance was appreciated as central to field-theoretic dynamics, and (3) UNFOSURE guarantees that pure states of open quantum systems are unobservable, so that open quantum systems become paradigmatic examples of Wilczek’s principle “Symmetry is a fruitful guide both when it is observed and when it is not!”

This sets the stage for a second appreciation of Wilczek’s essay:

Under the influence of information technology, attention has turned from the issue, famously pioneered by Gödel and Turing, of determining the limits of what is computationally possible, to the more down-to-earth problem of determining the limits of what is computationally practical.

If we take practical in its most literal and down-to-earth sense, namely, asking what we can compute in practice, right now, with tools presently available, then we see that UNFOSURE is among the most valuable mathematical tools for practical calculations and simulations in quantum information science.

This comes about for the same reason that gauge invariance is among the most valuable mathematical tools for practical calculations in field theory: if we are clever and diligent, we can hope to find an UNFOSURE “gauge” that makes our calculations and simulations vastly simpler and more efficient.

At present, gauge theory is much more fully developed than UNFOSURE theory … so much so that UNFOSURE theory does not (at present) even have a name. The point of this post is that the as-yet uncharted territory of UNFOSURE invariance is fertile habitat for young researchers in quantum information science. It could even happen, perhaps, that both gauge and UNFOSURE invariance might someday be subsumed in a fundamentally informatic point of view.
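For readers who want to see the unitary freedom concretely, here is a minimal numpy sketch (the phase-damping channel, the parameter p = 0.3, the mixing unitary, and the test state are all my own illustrative choices): two Kraus sets related by a unitary matrix implement exactly the same channel, which is the “gauge freedom” discussed above.

```python
import numpy as np

# Unitary freedom in the operator-sum representation: if {E_k} are Kraus
# operators for a channel, then {F_j = sum_k u[j,k] E_k} for any unitary u
# represent the *same* channel.

p = 0.3
E = [np.sqrt(1 - p) * np.eye(2),               # Kraus set {E_k}:
     np.sqrt(p) * np.diag([1.0, -1.0])]        # a phase-damping channel

u = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # any 2x2 unitary will do
F = [sum(u[j, k] * E[k] for k in range(2)) for j in range(2)]  # {F_j}

def apply_channel(kraus, rho):
    """rho -> sum_k K rho K^dagger in the operator-sum representation."""
    return sum(K @ rho @ K.conj().T for K in kraus)

rho = np.array([[0.6, 0.2 + 0.1j],             # an arbitrary test state
                [0.2 - 0.1j, 0.4]])

out_E = apply_channel(E, rho)
out_F = apply_channel(F, rho)
print(np.allclose(out_E, out_F))   # True: same channel, different "gauge"
```

Choosing the mixing unitary wisely is precisely the gauge-fixing step: for instance, one can always pick u so that the resulting Kraus operators are mutually orthogonal in the Hilbert-Schmidt inner product, which minimizes the number of operators a simulation must track.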

Of course, a rich essay like Wilczek’s has many alternative readings … that above is just one possibility, which is Holiday-spirited to the best of my capabilities!

16. Chris W. Says:

Indeed, as credited in this post (see appended update).

17. Chris W. Says:

Given the emphasis of your comment, John, the following from Peter Woit’s May 2006 post on Wilczek’s Fantastic Realities will be interesting:

Perhaps my favorite piece is one entitled “What is Quantum Theory?”, which deals with one of my obsessions. Wilczek claims that perhaps we still don’t properly understand the significance of quantum theory, especially what it has to do with symmetries. He notes that Hermann Weyl, soon after the discovery of quantum mechanics, realized that the Heisenberg commutation relations are the relations of a Lie algebra (called the Heisenberg Lie algebra), and that this exponentiates to a symmetry group (the Heisenberg group to mathematicians, Weyl group to physicists). Wilczek goes on to speculate that:

The next level in understanding may come when an overarching symmetry is found, melding the conventional symmetries and Weyl’s symmetry of quantum kinematics (made more specific, and possibly modified) into an organic whole.

18. Bram Cohen Says:

It’s a pretty good essay, although it completely skips the fact that the whole concept of a quantum computer is as yet experimentally unproven, making the whole field of endeavor quite speculative.

19. John Sidles Says:

With respect, Bram, thinking about quantum computing has already greatly enhanced our understanding of quantum mechanical processes in general. This pay-off has immense practical value, is already in hand, and promises to increase. Good!

For this reason, it seems to me that fewer researchers are regarding quantum computing research narrowly, as being about building quantum computers, because it has become clear that this achievement is a long way off. More researchers are regarding quantum computing broadly, as being about physical systems conceived as quantum computational processes, because this point of view yields immediate practical benefits and suggests wonderful new mathematical and physical questions.

20. math idiot Says:

Hi Scott,

Happy New Year to You!

MI

21. Scott Says:

Thanks, MI! Happy New Year to everyone!