## Eigenvalues up the wazoo

An anonymous commenter on my last post asks:

isn’t the real problem with complexity theory that the resulting mathematics is usually superficial and shallow?

does this make it less fun to do complexity theory?

are complexity theorists ever saying something deep?

Later, the same commenter writes:

i’m just curious because I don’t really understand how I feel about the issue myself.

maybe we should start with something more basic. can we all agree that “logic” (i.e. foundations of math) is pretty boring and flavorless?

sure, we all got that little rush when we heard the story of the “fall of mathematics” in the early 20th century … and then maybe again with the axiom of choice, continuum hypothesis, independence, forcing, and various incompleteness theorems.

but is logic actually fun to do? on an emotional level, do you achieve understanding, intuition?

Alright, look, anonymous. You’ve nailed why I don’t work on logic myself — besides not understanding what the big, meaty open problems are. For me, frankly, reading about logic (or recursion theory, or programming language semantics, or distributed computing) has always felt like sipping broth. Sure, it might be delicious broth. In the case of (say) Gödel and Cohen’s independence results, it might even be the best broth I’ve ever tasted. But eventually I hanker for some noodles, some carrots, maybe some complex numbers or a Cauchy-Schwarz inequality. I mean, how long can a person go without bounding anything?

But you see, anonymous, that’s what I like about complexity. It packs the same theological punch as logic does, but it’s got math in it too. And I’m not just talking combinatorics and graph theory. Let me put it to you this way:

You like groups? We got groups.

You like vector spaces? We got them too.

But what about number theory? Finite fields? Fourier transforms? Continued fractions? “Shor.”

Eigenvalues? Chebyshev polynomials? Gaussians? Random walks? Lattices? Convex polytopes? Banach spaces? Metric embeddings? You better believe it.

Or how about this, anonymous: what’s your favorite constant? π? e? The golden mean? Maybe 0.288…=(1/2)(3/4)(7/8)(15/16)…? Becoming a complexity theorist doesn’t mean bidding any of them goodbye.
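For the record, that decimal is the infinite product prod_{k>=1} (1 - 2^{-k}), and the factors approach 1 so fast that a short loop pins it down to full double precision. A quick sketch (the function name is mine):

```python
from functools import reduce

# The post's constant: (1/2)(3/4)(7/8)(15/16)... = prod_{k>=1} (1 - 2^{-k}).
# The tail beyond ~50 factors differs from 1 by less than 2^-50, so fifty
# terms already give double precision.
def partial_product(terms):
    return reduce(lambda acc, k: acc * (1.0 - 2.0 ** -k),
                  range(1, terms + 1), 1.0)

print(round(partial_product(50), 9))  # → 0.288788095
```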

Look, we even got knots, braids, manifolds, unitary representations, varieties, cohomologies, plethysms — alright, maybe not so much of the last few. But if your favorite mathematical object isn’t in stock, bring it yourself! That’s the thing about complexity: anything is fair game if it yields a new upper or lower bound. The reason it’s so hard to prove P!=NP is precisely that a fast SAT algorithm could be hiding anywhere in the math library.

Now let me turn the tables, anonymous. Can you name a subfield of math that involves so many different kinds of math?

### 46 Responses to “Eigenvalues up the wazoo”

1. Kurt Says:

Can you name a subfield of math that involves so many different kinds of math?

Uh, complexity theory? Oh, wait, nevermind…

But with regard to the broth analogy, if it eventually turns out that P vs. NP is logically independent, you might find yourself sampling that broth yet again.

2. Scott Says:

But with regard to the broth analogy, if it eventually turns out that P vs. NP is logically independent, you might find yourself sampling that broth yet again.

Right — and for me, that’s one more reason to guess that it’s not independent (or at least not provably so)! It’s hard to imagine how anyone could scale this mountain on a diet of logical broth.

3. Kurt Says:

Okay, I was just being cute in my last comment. But I do regard TCS as being a branch of mathematics, even if it does have quite a different feel than classical areas of math.

With regard to the possible logical independence of P vs. NP, I think most computer scientists would be totally unfazed by such a result. Since P vs. NP represents a question about the physical world, as well as a mathematical question, logical independence would only serve to show an insufficiency in the axioms of ZFC. It would certainly be of interest to FOM types, but I don’t think it would have any significant impact on most complexity theorists.

(And of course the main reason for thinking about independence in the first place is just an argument from arrogance–gee, this problem is really hard, maybe it’s not that we’re ignorant but that the problem is actually unsolvable.)

4. Anonymous Says:

first of all, yes, I can name other subfields of mathematics which involve all these things and more. in fact, I can name sub-sub-fields, like equidistribution of primes (e.g., how many prime solutions less than N satisfy a system of polynomial equations, as N → ∞) or, say, geometric group theory (what can we say about a group using only the language of its intrinsic geometry).

but the question you are asking is probably not the right one. indeed, I can say rather confidently that the connections in number theory are deeper than those in geometric group theory. it’s not surprising: number theory is a much older field. perhaps one can really test the strength of connections by asking if they are only uni-directional. does complexity theory contribute back to the mathematical fields from which it borrows? I know of a small number of (rather unimpressive) cases for which this is true, but they are unconvincing at best.

is this even the right question?
I don’t know.

but certainly the mathematics of Shor’s algorithm (i.e. fourier transforms on abelian groups) is elementary. the same goes for the use of chebyshev polynomials in proving degree lower bounds for a polynomial fitting some data (e.g. to prove lower bounds on quantum query complexity). I suspect that mathematicians would not call this deep. (I am not, in any way, degrading the _difficulty_ or _novelty_ of these proofs, only their “deepness”).

there are instances of deeper mathematics being used in complexity theory, but usually in a superficial way, e.g. the use of Milnor-Thom for lower bounding algebraic decision trees. but no one is delving into hard core Morse theory to accomplish their goals, and that brings up another point: why don’t we expect mathematicians to find the deep connections that lead to P neq NP (as opposed to complexity theorists)?

(p.s. does blogger show the ip addresses of commenters to the blog author? I assumed this was the case, and thus scott knew (essentially) the identity of posters.)

5. Scott Says:

With regard to the possible logical independence of P vs. NP, I think most computer scientists would be totally unfazed by such a result.

Dude. If that doesn’t faze you, then you’re ipso facto not a computer scientist.

(Note that the current proof techniques don’t go outside certain fragments of PA, let alone ZFC. See my survey paper for more.)

6. Scott Says:

p.s. does blogger show the ip addresses of commenters to the blog author? I assumed this was the case, and thus scott knew (essentially) the identity of posters.

I don’t go sleuthing with IP addresses, not only because it seems ungentlemanly but also because I don’t have time. Why don’t you unmask yourself? Your views on complexity might be mistaken, but they’re not a discredit to you. 😉

7. Scott Says:

why don’t we expect mathematicians to find the deep connections that lead to P neq NP (as opposed to complexity theorists)?

Well, no one’s stopping them. 😉

Seriously: I talk to mathematicians all the time about my complexity problems. Sometimes they’re a real help. Other times all they can do is restate the problems in fancier language…

8. Scott Says:

perhaps one can really test the strength of connections by asking if they are only uni-directional. does complexity theory contribute back to the mathematical fields from which it borrows?

There’s a huge number of cases where complexity led to new questions being raised. (Babai’s short presentation conjecture, the majority-is-stablest conjecture, Rudich’s minterm conjecture, degree vs. approximate degree, …)

There are also many cases where complexity led to a new kind of question being asked about a familiar object. E.g., is it NP-hard to distinguish a knot from the unknot? Is finding a Nash equilibrium as hard as finding other kinds of fixpoints?

But I’ll admit that, if your goal is to contribute to analysis, group theory, etc. (rather than just using them on a daily basis), then complexity probably isn’t for you. You should continue studying “deep” math, and then, when something like Shor’s algorithm or PRIMES in P comes along, remind yourself that you could easily have done it first. 😉

9. Anonymous Says:

Scott’s post is accurate of course, but in some sense this is beside the point. If you really like mathematical area X, then go work on X. While it may turn out to be connected to complexity, the surest way of making a contribution to X is working on X, just as the surest way of making a contribution to complexity is working on complexity (and that’s why I’m not holding my breath for a great mathematician to stumble upon a proof that P is not NP).

Everyone has different reasons to do what they do, but I work on complexity because I deeply care and am very curious about the questions. I also find it to be fun – there are a lot of beautiful proofs and new techniques pop up all the time. For me the fact that it is a fresh field with everything wide open is a plus and not a minus – I’m not jealous of the people 300 years from now that will be working on Aaronson’s hypothesis that the circuit complexity of circuit-SAT is exactly 2^{n-sqrt{n}}, even if these people will scoff at the stuff we do now as elementary and shallow.

The lesson that one should take from Scott’s examples is not that complexity is interesting because it’s connected to this or that subfield of math. Complexity, dealing with some of the clearest-to-state and most thought-provoking questions of science, does not need this validation. The slew of different areas (even if it’s only the “elementary” parts of these areas) that are used just demonstrates the variety of tools and skills that can be employed in complexity. This contributes to a wonderful and varied set of people working on complexity (which is another reason why I like it so much). They also tend to be quite nice and un-snobbish – perhaps working in a field where you are daily humiliated by the most elementary questions does this to you.

–Boaz

p.s. Sorry Scott, I picked you because you’re the youngest researcher I know, you’re about -100 years old, right?

10. Anonymous Says:

sorry about the ugly formatting -boaz

11. Scott Says:

Boaz: Thanks for the eloquent comment!

I’m not jealous of the people 300 years from now that will be working on Aaronson’s hypothesis that the circuit complexity of circuit-SAT is exactly 2^{n-sqrt{n}}

But I never hypothesized that! Incidentally, do we know that 2^{n-sqrt{n}} is an upper bound?

Sorry Scott, I picked you because you’re the youngest researcher I know, you’re about -100 years old, right?

Plus or minus 124 years.

12. Greg Kuperberg Says:

I agree that complexity theory can be seen as an area of pure mathematics. (Maybe that’s why I like it.) But your argument is a closer fit for quantum computation than for the rest of complexity theory. (Which is maybe why I like quantum computation in particular.)

13. Anonymous Says:

Scott said:

You should continue studying “deep” math, and then, when something like Shor’s algorithm or PRIMES in P comes along, remind yourself that you could easily have done it first. 😉

I suspect that mathematicians would not call this deep. (I am not, in any way, degrading the _difficulty_ or _novelty_ of these proofs, only their “deepness”).

So he/she is not claiming the discoveries in the field are “easy”, he/she seems to be saying only that the field doesn’t seem to be a “deep” field of mathematics to him/her.

While I disagree with the tone of his/her remarks, I think he/she is raising a point that can be reasonably discussed.

14. Anonymous Says:

“I suspect that mathematicians would not call this deep”

Could you define what you mean by deep? Deep as in related to questions that are interesting to mathematicians in general? to algebraists? to combinatorialists? Deep as having broad applications to other sub-fields of math? Deep as being very general?

In any case, who cares whether the “mathematics used are deep” (whatever the definition of deep may be…). What matters here is that the *computational* results are deep. And by deep, I mean relevant to fundamental questions about our world and its limits, from a perspective that has never been really considered before.

15. Anonymous Says:

I don’t go sleuthing with IP addresses, not only because it seems ungentlemanly but also because I don’t have time. Why don’t you unmask yourself?

being anonymous affords one the ability to express sentiments which are not completely in line with their actual views, while not being held accountable for this expression later. I don’t think I would have gotten such interesting responses from scott and boaz had I simply said “isn’t complexity theory neat?”

consider, for instance:

While I disagree with the tone of his/her remarks, I think he/she is raising a point that can be reasonably discussed.

while I thought my tone the whole time had been primarily inquisitive, it was apparently perceived as aggressive by some who disagreed with me.

one can ignore these subtle social implications once under the shroud of anonymity…

16. Anonymous Says:

Could you define what you mean by deep? Deep as in related to questions that are interesting to mathematicians in general?

if you list something like “knot theory” or “cohomology” as having some relation to your field, one can ask about the substance of that relationship.

if the answer is that “well, deciding the equivalence of two knots is NP-complete,” then I think we all agree that this relationship is superficial and not “deep.”

if, on the other hand, you use algebraic topology to prove that ACC_0 neq P, this probably qualifies as more substantial.

17. Anonymous Says:

in any case, my main concern is whether complexity theory has gotten side-tracked precisely because working on the real guts of the matter proved unrewarding, in terms of the richness of techniques available.

people jumped on the razborov-rudich bandwagon a little too eagerly. why don’t we have entire schools of thought devoted to circuit lower bounds? why is it essentially career-ending in computer science to produce (on average) only one paper every year, even if it is very good? it doesn’t seem like we actually value depth. why is it so hard to get a job if you’re doing complexity theory?

do we really assign our fundamental problems the same importance that other fields do theirs?

18. Greg Kuperberg Says:

I think that “deep” could be the most arrogant word in mathematical research. I have no idea what it means.

19. Anonymous Says:

I may not hit upon the original poster’s meaning of the word, but to me what is not “deep” about complexity theory (and TCS in general) is that it is essentially a series of unrelated results (each using very nice tricks and techniques), but without much unifying structure or theory. (Obviously some tricks turn out to be very useful and have produced a series of results. But I still have the feeling we are left with a bag of tricks at the end of the day and not much else…)

I suspect (and hope!) that this will change as complexity theory matures.

20. Scott Says:

I think that “deep” could be the most arrogant word in mathematical research. I have no idea what it means.

I’m reminded of a comment by John Conway in Bill Gasarch’s P vs. NP poll:

“In my opinion [P vs. NP] shouldn’t really be a hard problem; it’s just that we came late to this theory, and haven’t yet developed any techniques for proving computations to be hard. Eventually, it will just be a footnote in the books.”

For those who don’t quite grasp this, let me spell it out for you. The Riemann hypothesis is Deep. The Langlands conjectures are Deep. For P vs. NP, the problem is merely that we don’t yet have techniques for proving computations to be hard.

21. Anonymous Says:

I think that “blistolpock” could be the most arrogant word in mathematical research. I have no idea what it means.

somehow it seems hard to feel strongly about a word if you don’t know what it means, but anonymous 10:34 did a good job of describing what many people mean when they refer to TCS as not (yet) being deep.

scott, I see how it’s easy to get offended by conway’s comment, but it’s ridiculous to have such a strong reaction when you are not largely aware of the mathematics surrounding the riemann hypothesis. it’s probably a ridiculous comment for conway to make as well, given that he has not actually worked on P vs. NP.

I disagree with conway, but I don’t see why it’s such a bizarre thing to suggest, or why–not having a significant background in number theory–you can somehow assert that P vs. NP is at least as “Deep” as RH.

in fact, if you’re going to wait a year before taking a job, I would suggest you take first a course in complex analysis, and then a course in analytic number theory. for the same reason a painter might want to understand the art of a musician–an appreciation for a type of structure and richness in mathematics that doesn’t arise (yet) in complexity theory. it’s really quite stunning.

22. Scott Says:

scott, I see how it’s easy to get offended by conway’s comment…

I’m not offended, just amused.

I haven’t studied RH much, but my opinion is that it shouldn’t be a hard problem. It’s just that we haven’t yet developed any techniques for proving that the nontrivial zeroes of the zeta function have real part 1/2. 😉

in fact, if you’re going to wait a year before taking a job, I would suggest you take first a course in complex analysis, and then a course in analytic number theory.

I’m always up for learning new things if I’ll understand them, but in this lifetime I’m through with taking courses. I did take real analysis, abstract algebra, and topology at Cornell. The theorems in real analysis all seemed like painstaking formalizations of the obvious. I enjoyed abstract algebra and topology, but I kept asking myself: “What is the computational problem here, and can it be solved efficiently?”

23. Anonymous Says:

I may not hit upon the original poster’s meaning of the word, but to me what is not “deep” about complexity theory (and TCS in general) is that it is essentially a series of unrelated results (each using very nice tricks and techniques), but without much unifying structure or theory. (Obviously some tricks turn out to be very useful and have produced a series of results. But I still have the feeling we are left with a bag of tricks at the end of the day and not much else…)

I suspect (and hope!) that this will change as complexity theory matures.

On the contrary, mathematics would be so much more useful if mathematicians actually tried to solve problems that can be understood, rather than trying to construct complex structures and theories that are so difficult to understand as to be useless for solving problems.

Complexity theory is applied mathematics. Structure and unifying theorems are not important. When you get so stuck up with constructing theories, you fail to solve problems.

24. Anonymous Says:

in fact, if you’re going to wait a year before taking a job, I would suggest you take first a course in complex analysis, and then a course in analytic number theory. for the same reason a painter might want to understand the art of a musician–an appreciation for a type of structure and richness in mathematics that doesn’t arise (yet) in complexity theory. it’s really quite stunning.

too bad that structure and richness doesn’t help you get laid. lower bounds on real problems are much more stunning to hot chicks than classification of groups. You can just pick the chick up and explain what the problem is easily. Try that with math.

25. Scott Says:

When you get so stuck up with constructing theories, you fail to solve problems.

Let’s agree that when someone like Galois or Wiles produces a deep theory that also solves a simply-stated open problem — “dude.”

But if I have to choose between a theory with no easily-stated results or easily-stated results without a theory, I’ll always go for the results.

26. Anonymous Says:

what is this picture of the world where mathematicians are “just constructing useless theories that nobody can understand?” it’s ignorant, myopic, and arrogant.

– if you design a theoretical algorithm that (a) nobody ever implements and uses and (b) does not help develop the theory of algorithms itself, then you have failed on both counts, and your contribution is essentially useless.

– funny, I thought the only thing that complexity _does have_ is structure and unifying theorems. the biggest advances in complexity theory are about unification (e.g. hardness vs. randomness, time vs. space, determinism vs…).

– the unfortunate thing about computer science is that everyone thinks that problems should be easy. I mean, Moore’s law says that we only have to wait 1.2 years before our abilities double (I have no idea what the real number is), so certainly the number of theorems we produce should grow at the same rate. We have little tolerance for intermediate progress (e.g. “look, harmonic analysis of boolean functions was really useful in solving problem A, and such-and-such issue arose. we study this issue, and understand some complex behavior precisely, in an interesting and unexpected way. we expect it to be useful in the future.” and the responses are “well, but what are they solving for us NOW? where is contrived problem A’ so that we all feel safe that this is actually computer science…”)

– finally, for picking up “chicks” or “dudes” or whatever, the best way is actually to make what you do sound as much like art as possible (which isn’t so far from the truth). for this, geometry seems useful, and quantum mechanics works too, and crypto has a certain cool factor. explaining to her some contrived story about placing factories so as to minimize the average expected commute time gets really boring, really fast.

27. Anonymous Says:

people seem to be getting defensive, so let’s clarify:

TCS is an utterly amazing field in which to exist. you can do as much or as little mathematics as you want, you can go work for google, you can quite easily make significant contributions to real world applications by working with colleagues in AI, systems, and databases, and they are a constant source of interesting problems. TCS people are amazing too, and they’re really fucking smart.

a lack of depth is made up for by an abundance of creativity, and there is depth if you allow yourself to delve. in my opinion, you will find as many new ideas in focs/stoc as in any top math journal. additionally, the connections with mathematics are growing faster than ever before, and this trend will only continue: they like that we are using their techniques, and are quite willing to work on our problems if we can state them in a way they understand (and then they say “can I put that in my grant proposal?”)

we’re all very lucky.
the end.

Scott said:

I did take real analysis, abstract algebra, and topology at Cornell. The theorems in real analysis all seemed like painstaking formalizations of the obvious.

The most interesting subjects, in my eyes, are exactly those that seem obvious once the right framework is set. The construction of these theories was probably incredibly hard, and took some time until it all settled down. I’ll give a few examples from TCS.

Look at VC-dimension arguments. This is a unifying concept that rules across all of TCS, from geometry, through learning, to complexity. And it’s totally obvious — but only after you see it. Look at Irit Dinur’s new proof of the PCP Theorem. It’s obvious. You see the proof and you say — “well, this is completely obvious. It’s just a bunch of simple ideas, and a bit of technicalities, and you’re done.” Not to mention the Zig-Zag product itself, which is another obvious thing. But both of them had far-reaching consequences. Sariel Har-Peled’s results on core-sets (and results involving sampling in general) are another one of these things, where you think a result should be hard, and then someone comes up to you and explains it using completely trivial ideas. It’s like a magic trick — Scott’s comment discussed the magician’s obvious “abracadabra”, while the thing you did not see him perform is a small change in perspective that _made_ it all seem completely trivial.

The above results are all “painstaking formalizations of the obvious”. Is it obvious that there are non-measurable sets before you study real analysis? You took a couple of courses on Calculus before you took real analysis, and you were completely content with the Riemann integral, not once thinking, like I imagine Lebesgue thought, “this is bullshit; just because the function is not defined on the rationals doesn’t mean it is not integrable”. Real analysis is deep. The presentation is trivial. (By the way, Linear Algebra is the same way. It’s totally deep, and the presentation is totally trivial. Annoyingly so.)

I can go on and on with examples like this, but instead I’ll mention an article of Tim Gowers which discusses many points raised in this thread, especially the subject of “deep”ness. Gowers argues that combinatorics (the same point can be made for complexity) is deep because, although it can be viewed as a huge bag of tricks, there is actually a small number of extremely novel unifying trends behind these tricks, and that is what combinatorics (or complexity) gives to science. He says that reasoning about a “typical” object was very hard until the probabilistic method came along, and allowed us to say something like: you want to show there is a graph on 100 vertices with no clique or independent set of size 15? Take a random graph and prove it works there. This completely changed mathematicians’ ways of thinking. The article is called “The two cultures of mathematics” and is available at http://www.dpmms.cam.ac.uk/~wtg10/2cultures.ps
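The random-graph trick can be made concrete with a one-line expectation count (the numbers n = 100, k = 15 below are my own illustration, not from the comment): if C(n,k) * 2^(1 - C(k,2)) < 1, then some graph on n vertices has neither a clique nor an independent set of size k.

```python
from math import comb

# Erdos's classic probabilistic-method bound: 2-color the edges of K_n at
# random; the expected number of monochromatic k-cliques is
#     C(n, k) * 2^(1 - C(k, 2)).
# If that expectation is below 1, some coloring has no monochromatic
# k-clique -- equivalently, some graph on n vertices has neither a clique
# nor an independent set of size k.  (n=100, k=15 is an illustrative choice.)
def expected_mono_cliques(n, k):
    return comb(n, k) * 2.0 ** (1 - comb(k, 2))

print(expected_mono_cliques(100, 15) < 1)  # → True
```

Note the bound rightly fails at n = 6, k = 3, since every 2-coloring of K_6 has a monochromatic triangle (R(3,3) = 6).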

Also, Scott said:

I’m always up for learning new things if I’ll understand them, but in this lifetime I’m through with taking courses.

I know how you feel, but I’m quite sure that you don’t really mean that. A course _is_ one of the best ways to learn a new subject in depth, without actually having to read the material yourself. And you get better at understanding _well-structured_ courses as your skill develops, so taking, say, Noga Alon’s course on the probabilistic method now is very different for me than taking it 5 years ago. (By the way, I believe that tail estimates are perhaps the most horribly under-taught subject for CS undergrads, at least here.)

Unfortunately, it seems that Lebesgue would not agree with me:

“Reduced to general theories, mathematics would be a beautiful form without content”

30. Anonymous Says:

I think I lost track of this discussion, but just a couple of quick comments:

1. If the whole point of that deep comment is that more people should work on lower bounds then I absolutely agree – Razborov-Rudich should be viewed as a hint to how we can proceed, not a limitation (similar to relativization). So, anonymous, you can say we’re shallow, boring and stupid – just please prove a non-natural lower bound (whether it’s using Morse theory or using stupid induction).

2. I think that people who have very few good results are greatly appreciated within the TCS community. The main problem is that because TCS by itself is not appreciated enough by the outside world (and especially funding agency) people who do core TCS work and do not have also a more practical-sounding aspect to their research are at a disadvantage. This is what theorymatters is trying to change.

3. I think it’s important that we keep a pluralistic culture where we have and appreciate people of very different modes of works and skill sets, if we want to make progress on our goals. At least in this point I see this community as quite successful, and don’t see a need to try to emulate other mathematical communities.

4. I’m not sure I understand the meaning of “deep” and so I’m not sure whether or not it’s a good thing. I guess it also depends on whether you’re asking about the depth of the questions or the answers. Obviously if you measure depth in terms of logical dependencies in current proofs then complexity has not been around long enough for our answers to compete with other fields, which I’m sure have the deep and beautiful structure complexity has yet to develop. As I said in my previous comment, it’s a question of whether you prefer to work close to or far from the root of the result tree.

About the questions: if we measure depth by how hard/long/logically deep their shortest proof is, or the time it will take mankind to solve them, then I guess nobody can really know. I’m very ignorant of number theory and math in general, and so may be completely off base, but I would not compare P vs. NP to the Riemann Hypothesis, but rather to questions such as the status of the Euclidean 5th postulate (called by Gauss a “shameful part of mathematics”) – a seemingly elementary question that it’s simply outrageous we cannot solve. What’s surprising is that the questions of complexity seem to be several orders of magnitude harder than this and other similar examples in math history. It may be that we’re all missing something simple, and it may be that we’re like the ancient Greeks – trying to solve a question 2000 years before its time.

–Boaz

31. Greg Kuperberg Says:

What’s the issue with Euclid’s 5th postulate? Projective geometry and hyperbolic geometry show that it doesn’t have to be true.

32. Anonymous Says:

So, anonymous, you can say we’re shallow, boring and stupid…

anonymous (= me) never said any of these things. I questioned whether complexity theory itself was deep (as opposed to the people who do it), and thought this might be a reason that fewer people work on “core” complexity. I get the sense that people are thirsting for a richness and variety that we haven’t seen in, e.g. communication complexity.

I am not trying to discourage people, I am just asking questions. The amount of money & time spent on hardness of approximation vs. circuit complexity is disproportionately large. the same thing goes for quantum computing or extractors. is it because these other sub-fields have a far more rich/interesting collection of techniques/connections?

Greg Kuperberg said…

What’s the issue with Euclid’s 5th postulate? Projective geometry and hyperbolic geometry show that it doesn’t have to be true.

Why did the chicken cross the road? Because it felt fat.

Here is a question: Suppose that we accept P=NP as an axiom (but without actually getting an algorithm that proves it, not even as a black box). Then the complexity hierarchies collapse, and so on. But do we actually get something we can use from the axiom we added? Can there be a specific algorithm for factoring, call it A, such that if we know that P=NP then we know that A runs in polynomial time? Or vice versa: can you prove that there are no “constructive” ramifications to knowing that P=NP by itself, without getting an actual algorithm that shows it to be true?

34. Scott Says:

Elad: That’s an excellent question! The answer to it is yes — there’s an explicit algorithm A that factors in polynomial time if P=NP. Here it is: given an integer n, dovetail over all possible Turing machines. Halt when one of them has output the factors of n. 😉

(Sure, there’s a pretty big constant overhead, but that’s not our department…)

Here I used the fact that there always exists a prime factorization. More generally, we can say that there’s an explicit algorithm A that finds a satisfying assignment to any satisfiable formula in polynomial time if P=NP (but might not halt if P!=NP). There’s also an explicit n^{log log log n}-time algorithm that decides all but finitely many instances of SAT if P=NP (do you see why?).
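The dovetailing Scott describes is Levin-style universal search. A toy sketch of the idea, with generator functions standing in for the enumeration of all Turing machines (and a one-element claim [n] trusted as “n is prime” without a primality check):

```python
# Toy sketch of dovetailing / Levin-style universal search.  The real
# construction interleaves ALL Turing machines; here a list of generator
# functions stands in for that enumeration.  The key point survives: a
# claimed factorization is cheap to VERIFY, so we may safely halt the
# moment any interleaved program outputs a correct answer.

def verify(n, factors):
    """A claim checks out if every part exceeds 1 and the parts multiply
    back to n.  (A one-element claim [n] asserts "n is prime" and is
    trusted here; a real verifier would add a primality test.)"""
    prod = 1
    for f in factors:
        prod *= f
    return prod == n and all(f > 1 for f in factors)

def trial_division(n):
    """One candidate program, written as a generator: each yield is one
    "step"; None means still running, a list is its claimed answer."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            yield [d, n // d]
            return
        d += 1
        yield None
    yield [n]  # no divisor found: claim that n is prime

def universal_search(n, programs):
    """Round-robin dovetailing: advance every program one step per round,
    return the first claim that verifies.  (Levin's actual schedule gives
    machine i a 2^-i share of the time; round-robin conveys the idea.
    Runs forever if no program ever produces a correct answer.)"""
    running = [prog(n) for prog in programs]
    while True:
        for gen in running:
            out = next(gen, None)   # None also when a program has halted
            if out is not None and verify(n, out):
                return out

print(universal_search(91, [trial_division]))  # → [7, 13]
```

Generators make the “one step at a time” bookkeeping trivial, which is the whole trick of dovetailing: correctness of the output never depends on knowing which machine is the fast one.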

35. Anonymous Says:

I’m through with these discussions of math is better/worse than CS.

In CS’s infancy, it was a truly open question whether there was such a thing as a “science” of computing. But today we are way past that, and CS as such is as valid as any other science. Not surprisingly as a new science CS uses (a) mostly new and uncomplicated techniques (duh!) and (b) has incredibly important problems for the picking.

The same was true for, say, Topology when it first vaulted into the world of mathematics. I’m sure Brouwer and Jordan took flak for doing non-deep mathematics. A hundred years later, most of the mathematicians who criticized them are forgotten, while Brouwer and Jordan live on.

Today a large percentage of the old guard among pure mathematicians haven’t gotten over the fact that first Graph Theory and then Computer Science stole their thunder. They sit atop their stale perches and criticize the lack of “depth” and “elegance” in the new fields, even as those fields continue to evolve and pass them by.

I don’t care much about their opinion. As Thomas Kunz observed, those people will never be swayed; they will simply drift slowly into retirement while the younger generation works in the new reality from the get-go (has anyone noticed how many solid young mathematicians are successfully dabbling in the field of computing?).

I worry more that as CS grows it’ll repeat this pattern. Elsewhere in the blogosphere we have seen complexity types sneering at algorithmic results for their lack of “depth” and “elegance”. There is nothing to be gained from that line of argument, and complexity theorists will not create jobs for themselves by going there. CC is valuable (or not) on its own merits, and talking down the competition is a lose-lose proposition.

36. Anonymous Says:

I think you mean ‘Thomas Kuhn’.

I have a question: why do people seem to be sweaty over the prospect of topological ideas revolutionizing complexity? Are there any serious vindications of this hunch, or has the seeming exoticism of the idea (and the prospect of visualizing donuts ‘n things) simply transfixed people?

37. Scott Says:

why do people seem to be sweaty over the prospect of topological ideas revolutionizing complexity?

I’m not sweaty over it. (Living in Canada, I’m not really sweaty over anything.)

Mike Freedman has had some interesting ideas about a topological approach to P vs. NP, but nothing concrete that I know of yet.

In a different direction, there’s also work by Freedman, Kitaev, and Wang (and more recently Aharonov, Jones, and Landau in STOC’06), which shows that simulating topological quantum field theories and approximating the Jones polynomial at a root of unity are BQP-complete.

As for P vs. NP, there’s no shortage of exotic ideas: the Mulmuley-Sohoni approach (based on geometric invariant theory), Mike Nielsen’s approach (based on geodesics in Riemannian manifolds), approaches based on cohomology, etc. etc. I don’t pretend to understand most of this stuff; what I keep coming back to is how any of it gets around the Razborov-Rudich barrier.

38. Anonymous Says:

I’m through with these discussions of math is better/worse than CS.

at no point did I propose that “math is better than TCS” or that “subfield A of TCS is better than subfield B.” I don’t see why many people are incapable of having a reasoned discussion about their field without getting defensive and clouding everything with personal insecurities.

I think perhaps my real question should have been something like “is it the right time in history to solve the big problems of complexity theory?” in other words, is there anything to be gained from attacking the main questions head-on, as opposed to settling in for the long haul, developing a theory, and being perfectly happy with a lot of very incremental progress for the next 20 years? what should we expect of ourselves? should people be encouraging their students to work in a different field?

as for the sweat-inducing prospects of applying topology to complexity theory:

– the stuff of freedman et al. is very cool, but the reason topology comes in there (topological quantum field theories) is very different (a priori, at least) from the reason it might be useful for e.g. circuit lower bounds.

– I think the hopes for topology in complexity theory revolve around connectedness and information flow. for instance, we know there are problems for which every circuit of a certain type must contain disjoint paths from every set of k inputs to every set of k outputs.

in other words, the problem itself imposes certain connectivity requirements on the circuit. this can be used to get very weak but non-trivial lower bounds on the number of wires required by any circuit that solves the problem (see valiant, pudlák, or raz and shpilka for examples of this that I can recall).
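The disjoint-paths condition above is, by Menger’s theorem, just a max-flow computation: the maximum number of edge-disjoint s→t paths equals the minimum s→t edge cut. Here is a minimal sketch (my own illustration, not from the thread) using unit-capacity Ford–Fulkerson; the circuit arguments typically need vertex-disjoint paths, which the standard vertex-splitting trick reduces to this.

```python
from collections import deque

def max_edge_disjoint_paths(edges, source, sink):
    """Count edge-disjoint source->sink paths in a directed graph via
    unit-capacity max flow (Menger's theorem)."""
    cap = {}   # residual capacity on each directed edge
    adj = {}   # undirected adjacency, so BFS can traverse back-edges
    for u, v in edges:
        cap[(u, v)] = 1
        cap.setdefault((v, u), 0)          # residual back-edge
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    flow = 0
    while True:
        # BFS for an augmenting path in the residual graph
        parent = {source: None}
        q = deque([source])
        while q and sink not in parent:
            u = q.popleft()
            for v in adj.get(u, ()):
                if v not in parent and cap.get((u, v), 0) > 0:
                    parent[v] = u
                    q.append(v)
        if sink not in parent:
            return flow                    # no augmenting path: done
        # push one unit of flow along the path found
        v = sink
        while parent[v] is not None:
            u = parent[v]
            cap[(u, v)] -= 1
            cap[(v, u)] += 1
            v = u
        flow += 1
```

For example, a graph with two internally disjoint s→t routes yields 2, while a graph where both routes are forced through one shared edge yields 1 — exactly the kind of bottleneck a wire lower bound exploits.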

at this elementary level, one might hope that higher-order topological invariants (e.g. homotopy, homology) might lead to stronger lower bounds, but it’s unclear whether this is true, and what it should mean.

– bjorner, lovasz, yao, and others used topological methods to study size and depth lower bounds for algebraic decision trees. suppose you have a subset S of R^n and you want a decision tree that decides membership in S. one method to prove lower bounds is to count the number of connected components of S (or its complement). but if S is connected, then this only provides trivial bounds.

here, one can use higher order notions of “connectivity” like the betti numbers of S to get lower bounds on depth (this uses non-trivial results on the betti numbers of algebraic varieties, which I mentioned upthread somewhere).
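For concreteness, here is the component-counting bound the commenter alludes to, stated only roughly (a standard illustration I am adding, with constants and degree conditions elided):

```latex
% Ben-Or's component-counting bound, stated roughly: any constant-degree
% algebraic decision tree deciding membership in $W \subseteq \mathbb{R}^n$
% has depth
\[
  h \;=\; \Omega\bigl(\log \#\{\text{connected components of } W\} \;-\; n\bigr).
\]
% Worked example (ELEMENT DISTINCTNESS): for
\[
  W = \{\, x \in \mathbb{R}^n : x_i \neq x_j \ \text{for all}\ i \neq j \,\},
\]
% each of the $n!$ orderings of the coordinates lies in its own component, so
\[
  h \;=\; \Omega(\log n! - n) \;=\; \Omega(n \log n).
\]
% The Betti-number refinement replaces the component count by the sum of the
% Betti numbers of $W$, which can be large even when $W$ is connected.
```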

I guess there is hope that similar methods might hold a lot of power, but I agree there is little evidence. there is some recent attempt by joel friedman to study this stuff more generally, but I don’t really understand what his paper is saying.

39. Anonymous Says:

isn’t the real problem with complexity theory that the resulting mathematics is usually superficial and shallow?

and

are complexity theorists ever saying something deep?

and

I don’t see why many people are incapable of having a reasoned discussion about their field without getting defensive and clouding everything with personal insecurities.

TROLL!

Hot mama, a troll. And there I was thinking that they hide under a bridge or something. Maybe it’s that way in Canada.

41. Moshe Vardi Says:

This discussion reminds me of a sentence I once heard from an MIT doctoral student: “Isn’t everything about databases trivial?” A short inquiry quickly revealed that this student knew nothing about databases. One does not find depth unless one looks for depth.

42. Anonymous Says:

Calling the questioner a “troll” when he/she seems to have made some interesting points (in addition to the initial acerbic objection) is probably an easy way out, as is comparing her/him to someone with no knowledge of databases, since the questioner seems to know more about complexity theory than I do.

43. Jonathan Says:

Could you define what you mean by deep?

Here’s how it was explained to me. (I’m a theoretical CS guy, so I’m not speaking from personal experience.) In mathematics, especially pure mathematics, a result is “deep” if it exposes a connection between disparate fields of mathematics, where nobody expected a connection to exist. For example, the original proof of the prime number theorem is “deep” because it’s based on complex analysis. (It has little to do with the length of the proof.)

In any case, who cares whether the “mathematics used are deep”?

“Pure” mathematicians do. (Here, I use “pure mathematician” as a shorthand for “mathematicians whose work is not motivated by applications.”) They need a philosophy or aesthetic for deciding how important mathematical results are. Since “usefulness” is out, “depth” is a big one. (“Generality” and “elegance” are big ones too.)

44. Anonymous Says:

Calling the questioner a “troll” when he/she seems to have made some interesting points

Calling the questioner a troll after he has just accused his reasoned opposition of being “incapable of having a reasoned discussion about their field without getting defensive and clouding everything with personal insecurities” is most definitely the right thing to do.

45. Jonathan Vos Post Says:

Complexity Theory is NOT the same as the Theory of Complex Systems. I say that awkwardly, having been Session Chair of 3 sessions at the 6th International Conference on Complex Systems (google: ICCS and NECSI). That was the one where lifetime achievement awards were presented to John Forbes Nash, Jr., and “Renormalization Group” Wilson… who certainly did Deep work.

46. Jonathan Vos Post Says:

Oh, and the hot word in PNAS papers about developmental biogenetics is EIGENGENE.