Here’s a letter that Umesh Vazirani (my adviser at Berkeley) sent to *The Economist*, and which he kindly permitted me to share. I’m guessing they’ll print this one instead of mine, which is fine by me.

Sir,

Your article “Orion’s belter”, regarding D-Wave’s demonstration of a “practical quantum computer”, sets a new standard for sloppy science journalism. Most egregious is your assertion that quantum computers can solve NP-complete problems in “one shot” by exploring exponentially many solutions at once. This mistaken view was put to rest in the infancy of quantum computation over a decade ago, when it was established that the axioms of quantum physics severely restrict the type of information accessible during a measurement. For unstructured search problems like the NP-complete problems, this means that there is no exponential speedup but rather at most a quadratic speedup.

Your assertions about D-Wave are equally specious. A 16-qubit quantum computer has less processing power than a cell phone and hardly represents a practical breakthrough. Any claims about D-Wave’s accomplishments must therefore rest on their ability to increase the number of qubits by a couple of orders of magnitude while maintaining the fragile quantum states of the qubits. Unfortunately D-Wave, by their own admission, have not tested whether the qubits in their current implementation are in a coherent quantum state. So it is quite a stretch to assert that they have a working quantum computer, let alone one that potentially scales. An even bleaker picture emerges when one examines their algorithmic approach more closely. Their claimed speedup over classical algorithms appears to be based on a misunderstanding of a paper my colleagues van Dam, Mosca and I wrote on “The power of adiabatic quantum computing”. That speedup unfortunately does not hold in the setting at hand, and therefore D-Wave’s “quantum computer”, even if it turns out to be a true quantum computer, and even if it can be scaled to thousands of qubits, would likely be no more powerful than a cell phone.

Yours sincerely,

Umesh Vazirani

Roger A. Strauch Professor of Computer Science

Director, Berkeley Quantum Computing Center

**Update (2/18):** There’s now a *Nature* news article about D-Wave (hat tip to the Pontiff). Like pretty much every other article, this one makes no attempt to address the fundamental howlers about the ability of quantum computers to solve NP-complete problems — but at least it quotes me saying that “almost every popular article written on this has grotesquely over-hyped it.”
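To make the letter’s “at most a quadratic speedup” point concrete, here is a minimal sketch of the query-count arithmetic (illustrative code, not from the letter; the π/4 constant comes from the standard analysis of Grover’s algorithm, and the optimality claim is the well-known BBBV lower bound):

```python
import math

# Unstructured search over N = 2**n items: classical exhaustive search needs
# ~N/2 queries on average, while Grover's algorithm needs ~(pi/4)*sqrt(N)
# quantum queries; the BBBV lower bound says ~sqrt(N) queries is optimal.
def classical_queries(n_bits):
    return 2 ** n_bits / 2

def grover_queries(n_bits):
    return (math.pi / 4) * math.sqrt(2 ** n_bits)

# The speedup is quadratic, not exponential: searching a 64-bit space still
# takes on the order of 2**32 quantum queries, so brute-force quantum search
# leaves NP-complete problems exponentially hard.
assert grover_queries(64) > 2 ** 31
```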

If you really want this campaign to be effective, you should be aware of all of the journals and magazines that wrote about D-Wave this week. A Google News search for “D-Wave” reveals a depressing number of them. If the D-Wave people were cynical, they would expect positive publicity to outrun the negative backlash. Or they may even calculate that “there is no such thing as bad publicity.”

So, basically, D-Wave’s quantum computer is useless!

[…] (via Scott Aaronson) […]

I don’t really understand what the fuss is all about.

It is common knowledge that journalism of any kind (including the “respectable” Economist) is something people should generally be skeptical about.

The current affair just draws our attention to the sloppy job journalists do, because it deals with something close to our hearts.

However, the damage sloppy/biased journalism does is much more dangerous when it comes to the political arena.

Hederlin, I apologize for not taking on all the sloppy journalism in the world at once. From the Beatles song:

You ask me for a contribution

Well, you know

We’re doing what we can

err…

D-Wave needs to fool investors into pouring their money into a company that has something that is nowhere near being useful, and won’t be for a number of years. Journalists like an exciting piece of PR; it reduces their effort.

We just had a biotech combichem inventor here at our institute last week, a quite high-profile guy with papers in Nature to his name – a guy who is notorious for pushing his stuff into dubious startups that crash and burn one after another. He has made quite a career of this over the last fifteen years (and probably did well cashing in his stock options before the companies went under). The worst thing is that the guy displays completely wishful thinking about drug discovery, and over the years his ideas have been getting progressively more hare-brained.

He was visiting on a sales pitch, wanting to latch onto us because of our name and solid funding and convince us to test his mixtures of chemicals made in combinatorial fashion. I actually worked for a combichem company (one guilty of similar atrocities) ten years ago – so I asked him how he knows that he has ever made whatever he claims to have in his combinatorial mixtures of chemicals, and he answered that maybe things didn’t quite work every time – but not to worry, because this is good too, as long as we could go back and get the same crap in a reproducible manner.

It was the most annoying lecture that I have seen in a long time.

“therefore D-Wave’s “quantum computer” even if it turns out to be a true quantum computer, and even if it can be scaled to thousands of qubits, would likely not be more powerful than a cell phone.”

I guess I had better sell my stock then. Darn.

I’m not suggesting these two things are in any way comparable, but all this discussion about D-Wave reminds me of ZeoSync.
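ZeoSync’s notoriety came from claiming lossless compression of essentially arbitrary data; the counting argument against any such scheme takes only a few lines (an illustrative sketch, not part of the original thread):

```python
# Pigeonhole argument: there are 2**n bit strings of length n, but only
# 2**n - 1 bit strings of length strictly less than n, so no lossless
# compressor can map every n-bit input to a shorter output without collisions.
n = 16
strings_of_length_n = 2 ** n
shorter_strings = sum(2 ** k for k in range(n))  # lengths 0 .. n-1
assert shorter_strings == strings_of_length_n - 1
assert strings_of_length_n > shorter_strings     # some input cannot shrink
```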

LOL — breaking the Pigeonhole Principle barrier! That deserves to go down as a classic. (However, I was extremely disappointed with Wired News for asking the guy about “scientific validation,” instead of about the fact that 2^n > 2^(n-1).)

“I’m not suggesting these two things are in any way comparable, but all this discussion about D-Wave reminds me of ZeoSync.”

But they are comparable, because Rose claims that they are building a new quantum computer that will either show that BQP contains NP or violate the PCP theorem:

Rose has also expressed regret that BQP probably does not contain NP (I too was crushed when I learned it), so apparently D-Wave considers the PCP theorem to be the more surmountable obstacle.

It isn’t proven that P!=NP implies NP not in BQP, or is it?

(I don’t know, admittedly.)

However, the company might have developed an unexpected algorithm. (For sure it has not, but in theory it’s possible.)

“It isn’t proven that P!=NP implies NP not in BQP, or is it?”

Nope.

I think the best part is the claim that the new system can give approximate answers to “arbitrarily large NP-complete problems.” It’s pretty intuitive that even very easy problems are not solvable if the problem size is really large.

Well, even NP-complete problems obviously have tractable special cases. The relevant question is whether there are “naturally-arising” special cases (i.e., not reductions from factoring) that are easy for quantum computers but not for classical computers. To date, we don’t have much evidence either way.

Here is a selection of original articles on the D-Wave demo (i.e., not reprints from news feeds). Again, if you want to debunk this stuff, I am not sure that you should stop at the Economist.

TechNewsWorld describes quantum computing as free parallelism and describes skepticism of D-wave as debating “finer points”.

HPC Wire equates quantum computing with an exponential speedup of NP-complete problems. It repeats Rose’s disclaimer concerning approximate solutions. It acknowledges skepticism in terms of practical feasibility.

ABC news describes quantum computing as massive parallelism, and relates quantum searches to Google-like database searches. It describes skepticism as “wait and see”. Herb Martin from D-Wave mentions searches for designer drugs.

Extreme Tech quotes Martin as saying that D-Wave “mark[s] the end of the beginning of quantum computers.” Rose specifically mentions the travelling salesman problem, despite the PCP theorem. It claims that the D-Wave device can do 64,000 calculations simultaneously.

The Telegraph declares “Canadians win race to build ‘super computer’”. Like many of the D-Wave articles, it emphasizes 64,000-fold simultaneity.

CBC News reports that the experts are skeptical because D-Wave hasn’t submitted its work for peer review.

Daily Tech mentions peer review and “wait and see” skepticism. In this article, Herb Martin says, “A general purpose quantum computer is a waste of time.”

May I suggest, with respect, that perhaps this hype is a canary in the coal mine with respect to the paucity of products in the pipeline resulting from fundamental quantum system engineering research?

In other words, if there were stronger products coming on-line, and perhaps also ABET-certified degree-granting programs in this area, then the public would be better able to distinguish “hope from hype”.

Our UW QSE Group would be very interested to hear from faculty at other institutions who are contemplating the ABET certification requirements, as they relate to quantum system engineering.

Hi John,

I’m left wondering whether your post was supposed to be a joke. While I agree in general that getting certification for degree-granting programs is a good idea, it is very hard for me to see how this will reduce the number of clueless science journalists in the world.

I used to work for an analyst firm (BDTI) which produced reports on the embedded processor market. I remember being pleasantly surprised when a vendor asked us to review marketing claims regarding a new product for accuracy before the vendor went public with them. Seeing the controversy here, I can better understand why they bothered.

I was wondering if I’m misunderstanding something. Has anyone heard or read something from the D-Wave people (Geordie or others) that directly implies they really misunderstand the Vazirani, van Dam, and Mosca paper or the PCP theorem? It seems to me that the key issue is that which Scott addressed in his last comment (Feb 18, 12:36 pm), namely:

The D-Wave people seem to be betting that there are such special cases. This is something far, far different than the D-Wave people either:

1) misunderstanding Umesh, Wim, and Mike’s argument that existing query lower bounds can’t rule out the possibility of adiabatic quantum computation putting NP in BQP as saying adiabatic quantum computing really does put NP in BQP

or

2) blithely assuming that approximation of NP-complete problems is fundamentally easier than exact solution when, of course, the PCP theorem shows that the ability to approximate, say, MAX CLIQUE to within a constant factor on any instance is in fact saying P = NP.

This isn’t to say I’m not highly irked to see claims that D-Wave’s device, which as I think Umesh very rightly emphasizes has not been shown to be globally coherent, is doing “64,000 calculations at once.” And as any self-respecting quantum computing scientist should be, I’m aghast at certain writers’ wholesale ignorance of the well-known, decade-old arguments that quantum parallelism cannot brute-force NP into BQP, because Grover’s algorithm is optimal. Nonetheless, as people get annoyed with bad science news writing, and with D-Wave not trying to swat down the bad science news writing that benefits them, I think it’s important not to overstate the anti-D-Wave case (or better said, the case against adiabatic quantum computation ever being useful).

Or am I misunderstanding something?

Peter says: “I’m left wondering whether your post was supposed to be a joke.”

The part about ABET certification for programs in quantum system engineering was not a joke. It’s a long-term goal — perhaps as much as a full decade off — but one that we are definitely planning and preparing for.

As for preventing journalists from writing amazing but fallacious stories, well, that’s much tougher to fix!

We all know D-Wave is a scam. And yet people have invested in it. How can we, who know what’s going on, financially exploit this state of affairs? If it were a publicly traded company we could short it.

As Baez reminds us, if we have S bits of inside information, we should be able to multiply our money by 2^S. Knowing that D-Wave is a scam is essentially inside information.

“I was wondering if I’m misunderstanding something. Has anyone heard or read something from the D-Wave people (Geordie or others) that directly implies they really misunderstand the Vazirani, van Dam, and Mosca paper or the PCP theorem?”

It’s something of a judgment call, because you can hide a lot behind the fig leaf of “I explained it badly.” I personally am convinced that this really does go against the PCP theorem. You’re entitled to read it yourself carefully to come to your own conclusions.

As I read it, Rose clearly tries to use approximate optimization to dodge complexity bounds on BQP. He says nothing about the PCP theorem and his only mention of special cases is exact solutions. He never quite explicitly proposes a superpolynomial acceleration for a NP-complete problem for which the PCP theorem holds, but he comes very close.

I recently had an opportunity to hear a fairly reliable conjecture about what the D-Wave folks are (or might be) thinking internally. It doesn’t justify or excuse the hype (for which the lion’s share of blame is split evenly between journalists and Herb Martin), but it does provide a useful reference frame, at least for me. So think of this as a plausible description of what somebody in Engineering might have been thinking.

1. This demo was really not supposed to be a proof of technology: it was a systems-level demonstration. Which means, precisely, that it demonstrated the interface to a pending technology, along with a mock-up of how the actual device would work. This is a fairly relevant thing to do, and normal in the tech-production scene, but it’s pretty alien to how scientists typically work (we’d demo the guts, not the interface).

2. One might plausibly hope for the device to be equivalent to a brand-new classical heuristic for NP-complete problems. As several folks have pointed out, only a few problem instances need to be hard for it to be NP-complete. Any effective classical algorithm finds certain instances to be easy. A different heuristic might find a different set of instances to be easy. If their adiabatic device performs like this, then it could be worth quite a lot in practice, without (a) violating PCP, VvDM, or P!=NP; or (b) re-arranging the complexity classes at all. And, no, I’m not proposing this as an interpretation of what Geordie wrote. Just as a hypothetical description of what somebody might be thinking.

The discussion about D-Wave is a nice case study for the issue of skepticism in science, its industrial applications, and the interface with the media.

Here is how I see the Economist article:

1. The Economist article follows the standard convention of reporting about a new scientific/commercial endeavor in a way which is overall sympathetic (and not skeptical) and based mainly on material provided by the company (or researchers). Overall, this is a good convention.

2. The scientific-popular explanation in the article is overall rather reasonable. There is one unfortunate naive sentence about NP-completeness and quantum computers which runs contrary to the understanding of researchers in the field. But this is a subtle mistake (for a layperson), not an obvious one. (The second paragraph is also not optimal, but it would be a difficult task to replace it with a better one.)

The edit distance of the article from a good account of quantum computing is small. Replace “one shot” by “quicker than on digital computers”, or something like that.

3. It is not a trivial matter to explain in a popular way the potential advantages of quantum computers and related subtleties.

Thus, I cannot understand Scott’s sentence: “…a lazy magazine writer, with a single harebrained sentence, cancel out your entire life’s work twenty times over…” unless it represents an effort to be the funniest physics blogger at all costs. And I cannot agree with Umesh that “this article sets a new standard for sloppy science journalism”.

I have seen an article stating that “the largest prime number has been found!”, and an article about a person who can remember by heart the first 2000 digits of pi; that article then stated that pi = 22/7 and included the first 100 digits or so, starting with 3.142857142857142857142857142857…

Umesh’s main argument is with D-Wave, not with the journal. The Economist cannot and should not check whether D-Wave has a correct understanding of a technical paper in quantum information.

Almost every time you read a newspaper article on a matter you understand, you find that it is quite inaccurate. In many cases these inaccuracies are not important from the point of view of a layperson. If you want more media coverage for related scientific issues (and Scott strongly advocates for more media coverage), you have to understand and tolerate the media’s limitations and standards.

And now about D-wave.

What is the main reason to be skeptical about D-wave? Here are some options:

1) Their fantasies are too high: They are talking about solving quickly NP-complete problems.

2) Their PR is too outrageous. Their PR hints at computational power beyond BQP.

3) Their machine is too WEAK. Their machine is a less powerful computer than a cell phone.

4) Their machine is unscalable. It will remain less powerful than a cell phone even when scaled up.

5) They are basing their project on a misunderstanding of a certain scientific paper.

6) Their machine looks too STRONG to be believable. It seems hard to believe that they built a genuine quantum computer that can do what they claim it is doing.

7) Skepticism about quantum computers in general.

Points 1) and 2), the PR, hype and fantasies, can be warning signs. But overall, PR and hype are not a big deal and certainly not the crux of the matter. The relevance of point 7) is small. Serious attempts to build quantum computers are something serious skeptics should welcome. Non-serious attempts make no difference whatsoever. The possibility (but not the certainty) of quantum computers, while bold, can be regarded as part of our current collective understanding of nature, and in view of the high stakes it should be vigorously pursued.

(Of course, while I talk about “serious/non-serious” as a binary variable it is actually a continuous variable and often there is also a large amount of uncertainty.)

Points 3) and 4) are serious, but (without further details, like point 5)) they set too high a threshold even for serious attempts to build quantum computers.

Most likely every attempt will start with a machine less powerful than a cell phone, and the issue of scalability will be questioned until it is resolved. Many failures are expected along the way (if not all the way).

For me the crux of the matter is point 6). Not the fantasies, not the PR, not the comparison with cell phones, and not the grand theoretical debates, but the question of what D-Wave actually DID.

The paragraph in the Economist article (which is not the fault of the journal) that can raise eyebrows is:

“D-Wave Systems chose two very practical problems to demonstrate its 16-qubit processor, which it has called Orion. One was a pattern-matching application for searching through a database of molecules—the sort of task that a drug company looking for the key to a particular molecular lock in a disease-causing protein might want to do routinely. The other was a scheduling application for assigning people to seats, subject to certain constraints. Airlines might be interested in that one.”

This simply does not seem believable, and it is the task of D-Wave to convince us that this is “real”. Hence the possible analogy with 19th-century chess-playing machines (like “The Turk”).

“I recently had an opportunity to hear a fairly reliable conjecture about what the D-Wave folks are (or might be) thinking internally.”

Most of your comments ring true. But I note that if their thinking isn’t completely logical, then one can only analyze it so far. My impression is that they have been bent by some of their hype.

“the lion’s share of blame is split evenly between journalists and Herb Martin”

On the contrary, it seems to me that Geordie Rose is in the best position to keep everyone else in line. Neither journalists nor salesmen have the training to tone down what they are told by someone with a PhD in physics. Maybe Herb Martin is a major part of the problem — that is an internal D-Wave matter — but I really think that people are too quick to bash the journalists in this thread.

“This demo was really not supposed to be a proof of technology: it was a systems-level demonstration.”

Rose’s own words are “proof of concept”, but I agree that a systems-level demo could be the closest description of what it really is. In other words, it looks like stone soup.

But note that in the same comment on his blog, Rose plainly implies that the only reason the “Orion” isn’t useful is that it has too few qubits.

“One might plausibly hope for the device to be equivalent to a brand-new classical heuristic for NP-complete problems.”

Of course, the main history of computing is a trend away from special-purpose devices (SPDs), especially analog ones, in favor of Turing universality. How much can you really hope that another such device is usefully new?

At one level there is little need to speculate, because Rose’s blog has a rough description of their device. It is (or is intended to be) a model spin system with a pairwise interaction and some noise.

Gil: It is fair to ask what it accomplishes to criticize D-Wave. I have grossly violated the adage, “If you don’t have anything nice to say, don’t say anything at all.” It may not accomplish much. But for the record, what bothers people is the feeling that D-Wave is diverting both money and public attention away from more worthy work in the field. I suppose that you could argue that they are partly diverting attention away from Anna Nicole Smith and similar, which is just fine.

But on second thought, something really is wrong. I have been asked whether the D-Wave announcement is “for real” by highly trained people on campus who are only “lay” in the sense that they are not QC insiders. The QC community could have a lot to lose if no one objected to D-Wave’s hype. If no one objected, other university faculty and even the NSF might think that the rest of the community is like D-Wave.

As for journalists, let me explain why I think that we should cut them some slack. They have been told for a long time that QC is serious, even though it sounds like quantum-mechanical magic, and even though there are plenty of skeptics. Our position is that even though these skeptics may have PhDs, they are not really experts in QC specifically. Now we are blaming journalists for believing a smaller group of apparent experts who ratchet up the hype beyond the rest of the QC community. Unless you expect the journalists to all hold PhDs themselves — I suppose that a bare handful do, but I don’t see anyone here sending a job application to the AP — they simply aren’t prepared to resist D-Wave’s obfuscations.

“As for journalists, let me explain why I think that we should cut them some slack.”

And before Gil gets too irritated with me, let me add that he seems to agree with me and already made some similar points.

You know, given that Scott has announced that his sympathies can be bought, I’m kind of surprised that Geordie hasn’t funneled a little venture capital in his direction. Maybe Scott is just playing hard-to-get.

Here’s a technical question about Deutsch’s interview in Wired: At one point he says that a functional quantum computer would break all (classical) forms of cryptography. Am I missing something here? I mean, I can see the quadratic speed-up, and yes it would break RSA and possibly some related algorithms. But… for most codes wouldn’t it suffice to double the number of bits of key to get the same amount of security?
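The arithmetic behind the “double the number of bits of key” intuition can be sketched as follows (an illustrative aside; this reasoning applies to brute-force key search against symmetric ciphers, not to structured systems like RSA, which Shor’s algorithm breaks outright):

```python
# Grover search over a k-bit keyspace takes ~2**(k/2) quantum queries, so a
# k-bit symmetric key retains roughly k/2 bits of security against quantum
# brute force; doubling the key length restores the original security level.
def quantum_security_bits(key_bits):
    return key_bits // 2  # sqrt(2**k) = 2**(k/2)

assert quantum_security_bits(128) == 64
assert quantum_security_bits(256) == 128  # doubling recovers 128-bit security
```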

There are cryptosystems — even public-key ones — that we don’t know how to break with a quantum computer. If Deutsch said otherwise, he was either mistaken or oversimplifying for the purpose of the interview. Some of these cryptosystems are based on the conjectured hardness of the approximate shortest vector problem. See for example this paper by Oded Regev.

Another good cryptographic resource is the NSA’s Fact Sheet for Suite B Cryptography. Basically, the NSA now recommends that all government agencies switch from RSA (factoring-based) to ECC (elliptic-curve) cryptography.

To the best of my (quite feeble) cryptographic knowledge, ECC codes are vulnerable to quantum algorithms; perhaps someone with greater knowledge than me can confirm this.

On the status of cryptosystems not known to be broken by quantum computers here is the text of a comment I originally posted at http://dabacon.org/pontiff/?p=1334 in response to a statement of Smolin’s that was similar to Deutsch’s.

Smolin’s statement is correct for _practical_ asymmetric (public key) encryption systems whose security is widely agreed upon by the cryptographic community. (It is too strong because quantum algorithms haven’t touched the security of symmetric key encryption systems.)

McEliece is not practical; according to The Handbook of Applied Cryptography, for the recommended security parameters the public key size is 2^19 bits, which will be unwieldy for most applications for the foreseeable future (also the data expansion factor is about 1.6). Also because of its impracticality its security has received less scrutiny.
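To put that key size in perspective, a quick back-of-the-envelope comparison (the 2048-bit RSA figure is used here only as a familiar yardstick, not as a claim of equivalent security):

```python
# McEliece public key at the recommended parameters, per the Handbook of
# Applied Cryptography: 2**19 bits.
mceliece_pk_bits = 2 ** 19
assert mceliece_pk_bits // 8 == 65536    # a 64 KiB public key
assert mceliece_pk_bits // 2048 == 256   # ~256x the size of a 2048-bit RSA key
```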

NTRU: According to Koblitz and Menezes’s A Survey of Public Key Cryptosystems (http://citeseer.ist.psu.edu/695523.html), “a history of successful attacks on various versions of NTRU makes many people hesitant to endorse its use.”

My impression is that the Ajtai-Dwork lattice based system, and its successors, also have impractically large public keys, even after the improvement by Regev. Can anyone confirm this? (Also Regev has shown that these systems would be broken by any quantum algorithm for the dihedral HSP involving coset sampling. So these systems depend on a problem closely related to those Shor’s algorithm solved, though that problem has proven difficult.) Does anyone know the relation between these lattice based systems and NTRU?

Koblitz and Menezes’s A Survey of Public Key Cryptosystems also discusses PKE cryptosystems based on braid groups, which are widely believed to be insecure. They also mention systems based on hidden monomials. Anyone know the status of those? Many versions of knapsack-based PKE cryptosystems were broken, so people are pessimistic about using this problem as the basis for a PKE system. Many other types of PKE cryptosystems have been developed and then broken.

Given the historic difficulty of creating practical PKE cryptosystems based on problems other than factoring or discrete log, here’s a question I believe is open to debate: which will come first, the building of a quantum computer capable of breaking existing factoring and discrete log PKE cryptosystems or the development of a practical PKE cryptosystem which is widely believed to be secure against quantum and classical attacks?

John, yes, elliptic curve cryptosystems are based on the discrete log problem so are broken by Shor’s algorithm.

“You know, given that Scott has announced that his sympathies can be bought, I’m kind of surprised that Geordie hasn’t funneled a little venture capital in his direction.”

Kurt: My views on quantum gravity can be bought. My views about the hardness of NP-complete problems are not for sale to anyone.

“(Also Regev has shown that these systems would be broken by any quantum algorithm for the dihedral HSP involving coset sampling.)”

Hi Eleanor!

Note that my algorithm does apply to coset sampling, even cosets with moderate spurious superposition. (Cosets with the other kind of error, too little superposition, are poison.)

However, my algorithm combined with Regev’s paper has roughly the same complexity as good classical lattice algorithms. (And “roughly the same” probably means moderately worse rather than moderately better, on top of the fact that classical algorithms don’t need a quantum computer.) The gist of my result is that some of the ideas for good lattice algorithms also give you good DHSP algorithms. The gist of Regev is that DHSP and lattice problems are quantumly polynomially equivalent. However, his polynomial equivalence does not preserve “good” in the weak sense of my paper.

Hi Greg!

Thanks for the information; I hadn’t known that your result was based on ideas from classical lattice algorithms. (I looked at your paper when it first appeared on the arXiv, but hadn’t noticed your extended version until now.)

For clarity, I should have said that any _polynomial-time_ quantum algorithm would break these lattice-based systems. Your algorithm for the DHSP, together with Regev’s improvement, is the best quantum algorithm we have so far, but it is still superpolynomial, so it doesn’t break lattice-based systems. My main concern about the Ajtai-Dwork system and its successors is that I believe they have unwieldy key sizes; I should check that myself, but if you or someone else knows for what key size these give the same level of security as, say, RSA, let me know! The lattice-based approach still seems like the most promising candidate for the development of a practical public-key encryption system safe from quantum attack, but I wouldn’t count on it yet; apart from the key-size issue, the DHSP problem certainly seems hard, but I’m not ready to say it won’t fall to a novel quantum attack. Should I be?

“Thanks for the information; I hadn’t known that your result was based on ideas from classical lattice algorithms.”

More “related to” than “based on”. Sadly, that is about all I can say in response to your good questions.

“The DHSP problem certainly seems hard, but I’m not ready to say it won’t fall to a novel quantum attack. Should I be?”

Oh sorry, I can say something about this. Every quantum algorithm that I know of is in a sense a communication result: it manages to extract a miraculous amount of information from a classically unyielding oracle. If you’re asking about polynomial-time algorithms for DHSP, then Regev has at least removed the oracle from the problem. So it is hard in the sense that we’ve all been robbed of ideas. There are no reduction results that indicate that it is hard, except for Regev’s two reductions that go in a circle.

On the other hand, there is an important reduction that indicates that SHSP (symmetric group HSP) is hard. Namely, HSP for a group G is at least as hard as HSP for any subgroup H of G. For example, the symmetric group contains huge dihedral groups, so SHSP is DHSP-hard. I (and others) conjecture that SHSP is much harder than graph isomorphism, which is its famous special case.

Well, there is a reduction result for DHSP that isn’t all that conclusive. DHSP can be defined as the abelian hidden shift problem with a single shift. It is therefore at least as hard as any problem with multiple hidden shifts, which includes among other cases some other near-abelian hidden subgroup problems.

Thanks, Greg!

I like your suggestion to think of a quantum algorithm as a communication result. I’m a bit puzzled by what you said next. Since DHSP is an oracle problem by definition, I’m not sure what it would mean to remove the oracle from it. I would have said that Regev’s result meant that we can’t beat classical algorithms if we only use the oracle in the usual way (coset sampling) – the only way we know – and for this reason “we’ve all been robbed of ideas.” Did you mean something stronger (or different)?

“Since DHSP is an oracle problem by definition, I’m not sure what it would mean to remove the oracle from it.”

My remark is that if you want polynomial time, you should just try to solve SVP directly. The oracle is a distraction; SVP doesn’t have an oracle and it is already hard enough. Part of the point is that Regev’s important other reduction, from DHSP to subset sum, suggests that SVP is not all that much easier than all of DHSP.

Greg, thanks! Very helpful.