What is the name of this post?

No need to thank me for my turnabout — we’ve got work to do. There’s a new complexity class in town, and I need you to help me name it.

My new class is like P/poly, except that the polynomial-size advice can’t be trusted — you have to verify it. Or to put it another way, it’s like NP intersect coNP, except that there’s only one witness for each input length. Give me that witness, and I’ll correctly decide every input of size n. Give me a different witness, and for every input of size n I’ll either output the right answer or else say “I don’t know.”

Now, anyone could make up a name for this animal — even I could! But I want the name to be naturally extensible to further classes. For example, if (hypothetically speaking) I was able to use a new result about the learnability of quantum states to prove that AvgBQP/qpoly is contained in AvgQMA/poly, but my proof actually yielded the stronger result that AvgBQP/qpoly is contained in the subclass of AvgQMA/poly where there’s only one QMA witness for each input length, then your naming convention should immediately give me a name for that subclass.

So let the christening commence! And for extra credit, prove or (relative to an oracle) disprove that if my class contains NP, then P=NP.

50 Responses to “What is the name of this post?”

  1. Anonymous Says:

    Not sure I understand the class you are talking about, but maybe you mean something like SV-nondeterministic circuits? (see, e.g., http://www.cs.caltech.edu/~umans/papers/SU05.pdf )

  2. Scott Says:

    I just looked up the definition of SV-nondeterministic circuits, and it’s different from what I have in mind.

    Formally, I’m talking about the class of languages L for which there exists a polytime machine M such that:

    (i) For all input lengths n, there exists a poly-size string a_n such that M(x,a_n) outputs L(x) for all x in {0,1}^n.

    (ii) For all x,y, M(x,y) outputs either L(x) or “don’t know.”

    In particular, my class is contained both in NP intersect coNP and in P/poly. By contrast, SV-nondeterministic circuits represent a superclass of NP intersect coNP (being the nonuniform analogue of it).
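
    For concreteness, here’s a brute-force sanity check of conditions (i) and (ii) in Python (feasible only for toy n, of course). M and L are stand-in functions you supply; M(x, a) returns True, False, or None for “don’t know”:

        from itertools import product

        def is_yp_verifier_at_length(M, L, n, advice_len):
            # Brute-force check of conditions (i) and (ii) at input length n.
            inputs = [''.join(bits) for bits in product('01', repeat=n)]
            advices = [''.join(bits) for bits in product('01', repeat=advice_len)]
            # (ii): no advice string ever makes M answer incorrectly.
            for a in advices:
                for x in inputs:
                    answer = M(x, a)
                    if answer is not None and answer != L(x):
                        return False
            # (i): some single advice string makes M decide every length-n input.
            return any(all(M(x, a) is not None for x in inputs) for a in advices)

        def parity(x):
            # toy language: strings with an odd number of 1s
            return x.count('1') % 2 == 1

        # a trivially valid YP verifier for parity, which ignores its advice:
        print(is_yp_verifier_at_length(lambda x, a: parity(x), parity, 3, 2))  # True

    The point of condition (ii) is that the check has to pass for every advice string, not just the good one.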

  3. Niel Says:

    Perhaps a natural analogy to keys would work for naming. Each problem of size n is like a lock; and each witness for the size n problem is a key which either fits into the lock and opens it (allowing the problem to be easily decided) or not (it doesn’t decide the problem). A witness which unlocks all instances of size n is a skeleton key.

    All locks can be opened by disassembling them, if you have the time and energy… but it’s faster to have a key, and if you open many locks, it’s useful to have a skeleton key rather than many different keys. Of course, it may easily be that not all families of problems have “skeleton keys”: problems in NP intersect coNP which don’t have any such unifying witness are like classes of locks which are complex enough not to admit a reasonably simple master key.

    For extensions, the question is what the ‘nature’ of the ‘key’ is (e.g. quantum skeleton keys).

  4. Andy D Says:

    To disprove relative to an oracle that

    NP has length-based ‘keys’ ==> P = NP:

    The oracle A will contain strings only at even lengths. Queries to A will be of the form “(u, x) in A?”, i.e., the query string is divided into two halves of equal length.

    For each n, there will be exactly one ‘good’ u of length n such that some strings (u, x) are included in A.
    For a ‘good’ u, A(u, x) is defined as L(x), where L is some standard PSPACE-complete set.

    This sequence of ‘good’ u’s will compose the magic keys. Namely, for a problem in NTIME(n^c), the key a_n will be the list of the first, say, n^(3c) good u’s: ample for a polytime machine both to know all the location info the nondeterministic machine could access during its computation

    (once this info is explicit, the nondeterministic machine can be reformulated as posing queries to an ordinary PSPACE oracle, since its queries to bad-u locations become trivial to answer and those to good-u locations can be redirected to PSPACE using the oracle definition),

    and ample as well to pose to PSPACE the necessary queries about the behavior of the reformulated nondeterministic machine. (Using the fact that NP^PSPACE = PSPACE. The small blowup factor here, which I’m not going to calculate, is what motivates the 3 above.)

    So we have that NP languages have magic keys, for any choice of the sequence of good u’s.

    On the other hand, if these good u’s are chosen at random, then with probability 1 no P^A machine can reliably find them, and so P^A can’t compute their parity, while an NP^A machine can accept 1^n for exactly those n such that the good u of length n has odd parity.

    This is more or less a standard construction I think, but used in the past for things like collapsing PH to NP without having P = NP.
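
    In symbols: A contains only strings of even length, with

        (u, x) in A  <==>  u is the good string of length |u| and x is in the PSPACE-complete set  (where |u| = |x|);

    and if the good u’s are then chosen at random, with probability 1 the language { 1^n : the good u of length n has odd parity } is in NP^A but not in P^A.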

  5. Andy D Says:

    Just to explicitly match up the ‘magic key machine’ with Scott’s last post:

    Let L be in NTIME^A(n^c).

    M^A( x, a_n ) gets an instance ‘x in L?’ to decide, and a purported key consisting of the purported first n^(3c) good u’s for A.

    M verifies that they are indeed good by querying A on (u_i, e), for all given u_i and e a fixed easy ‘yes’ instance of the PSPACE language. If these all come back 1’s, the u_i’s must be good, and they fuel the computation described.

    Otherwise, M has no obligation to continue, and says ‘don’t know’.

  6. D. Sivakumar Says:

    Scott, your class is related to the notion of “helping” defined, IIRC, by Schoning. Also IIRC, P_help(C) is defined to be the class of languages L for which there’s a TM M such that L = L(M^A) for any A, and there is some B in C such that M^B runs in polynomial time.

    This was a precursor to interactive proofs without probability, but with verifiers that are robust nonetheless.

    It seems to me that P_help(SPARSE) is contained in your class, and all the decidable languages in your class are in P_help(SPARSE).

    Siva

  7. Scott Says:

    OK everyone, I’ve been told by Lance himself that my class is actually P^{Tally-(NP intersect coNP)}. So the subclass of AvgQMA/poly that I’m interested in is

    AvgBQP^{Tally-Psi(QMA intersect coQMA)}/poly

    where the Psi means you return the quantum witness itself, and the oracle is a quantum oracle.

    That’s nice to know, but personally I like the name that Ryan Williams gave my class: YC (“Your Class”). (SC and AC are already taken, and AaC is a little too presumptuous…)

    Actually, I should call the classical version YP, and the quantum version YBQP. I could then state my result as follows: AvgBQP/qpoly is contained in AvgYBQP/poly.

  8. Scott Says:

    Also: Thanks very much, Andy, for the oracle where NP=YP but still P!=NP. Lance also mentioned such an oracle.

    Also, Ryan Williams sent me a proof that any self-reducible language in NP intersect coNP intersect P/poly is also in YP.
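
    One natural way such an argument can go (a sketch of the standard trick, not necessarily Ryan’s proof): the YP advice is the P/poly circuit, and the machine uses self-reducibility to pull a candidate witness out of the circuit and then checks it, answering “don’t know” if the check fails. In hypothetical pseudocode, where self_reduce, verify_yes, and verify_no stand in for the language’s downward self-reduction and its NP and coNP verifiers:

        def yp_from_circuit(x, circuit, self_reduce, verify_yes, verify_no):
            # Sketch: decide x using an untrusted circuit for a self-reducible
            # language in NP intersect coNP.  self_reduce(x, circuit, claimed)
            # walks the self-reduction, treating the circuit as an oracle, and
            # returns a candidate NP- or coNP-witness; verify_yes / verify_no
            # are the honest polytime verifiers.
            claimed = circuit(x)                  # the circuit's untrusted verdict
            w = self_reduce(x, circuit, claimed)  # candidate witness extracted from it
            if claimed and verify_yes(x, w):
                return True       # sound: a valid yes-witness really exists
            if not claimed and verify_no(x, w):
                return False      # sound: a valid no-witness really exists
            return None           # the advice didn't pan out: "don't know"

    With the correct circuit the extraction always succeeds, and with any circuit the verifiers never let a wrong answer through.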

  9. Eldar Says:

    For maximum generalization, instead of dealing with the letters I would deal with the slash sign. Say, double it, so your class would be called “P//Poly” (in LaTeX I would reduce the spacing between the slashes).

  10. Greg Kuperberg Says:

    I need to think through this some more, but off the cuff I realized that the Zoo says something about the question. (Should I call it “the abandoned Zoo”?) Namely, it seems that the only listed class that contains P and is contained in both NP intersect coNP and P/poly (for all oracles, that is) is ZPP.

    The part that I have not thought through is whether YP contains ZPP by the same argument that P/poly contains BPP.

    If you are not content with the YP acronym, then I note that with the usual metaphors about Merlin and Arthur, YP is “advice from Merlin” rather than a proof from Merlin, or advice from an angel. The P part is a deterministic counterpart to Arthur – could he be called Percival?

  11. Scott Says:

    Should I call it “the abandoned Zoo”?

    No, you should call it “the municipal Zoo.” And I did edit it a few times lately, as can you. (Incidentally, I found that it’s easier to edit the wiki than to go through the whole uploading process every time I want to change something.)

    The P part is a deterministic counterpart to Arthur – could he be called Percival?

    Sure, why not? And the Y stands for Yaroslav. So if you don’t like “Your Polynomial-Time” (too much like an advertising slogan?), then YP stands for Yaroslav-Percival.

    The part that I have not thought through is whether YP contains ZPP by the same argument that P/poly contains BPP.

    Yes, ZPP is in YP. Yaroslav sends an r, then Percival runs the amplified ZPP algorithm using r as his random string. By a counting argument, there’s a single r that works for all 2^n inputs. And any r’!=r will lead at worst to “I don’t know.”
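
    In more detail: amplify the ZPP algorithm so that for every x of size n, Pr_r[ output = “don’t know” ] <= 2^{-(n+1)}. By the union bound, Pr_r[ some x in {0,1}^n gets “don’t know” ] <= 2^n * 2^{-(n+1)} = 1/2 < 1, so some single r decides all 2^n inputs at once; that r is Yaroslav’s message. And since the algorithm is zero-error, any other r’ never yields a wrong answer, only possibly “don’t know.”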

  12. Greg Kuperberg Says:

    I did edit it a few times lately, as can you. (Incidentally, I found that it’s easier to edit the wiki than to go through the whole uploading process every time I want to change something.)

    Really? Then I guess I see why you wanted to change it to something else, because the wiki interface isn’t very good.

    Well, okay, I could make changes. And I will. However, I hope to be able to make a whole lot of changes when the time comes, enough so that the municipal model won’t be quite the right one for a little while. Then afterwards it might be reasonable to go back to some municipal wiki-like structure.

  13. Cheshire Cat Says:

    The // notation is already taken, for a notion that’s probably more interesting: advice for randomized algorithms where the advice depends on the random string

  14. Greg Kuperberg Says:

    I also notice that the municipal Zoo has had some trouble with graffiti.

  15. Greg Kuperberg Says:

    By the way, if you want a Y name, Arthur’s mother was Ygerna. Maternal advice is perhaps an apt metaphor for YP. That is at least better than the incongruous Yaroslav.

    One question then is whether there is a “YA”, in which Ygerna speaks to a randomized Arthur rather than to the deterministic Percival.

    You could of course apply a “Y” operator to a three-valued BPP, but that might have the same trouble as the difference between ExistsBPP and MA.

  16. Scott Says:

    The // notation is already taken, for a notion that’s probably more interesting: advice for randomized algorithms where the advice depends on the random string

    Yes, I know. But in terms of interestingness, I’ll put Yaroslav-Percival up against // any day. The trouble with nonuniformity has always been its “absurd” power — a power able to put P/1 outside of EEEEEEEXP. YP tries to remedy that power, whereas // exacerbates it. Indeed, even if // weren’t taken, I didn’t want any name involving /, since to me / immediately suggests nonuniformity (and hence uncomputability and uncountability).

  17. Greg Kuperberg Says:

    Scott: I certainly agree with you about //. But why Yaroslav, who has nothing to do with the Arthurian legends?

  18. Scott Says:

    Because he emailed me a problem (see the post above this one), and was the first Y available when I needed one. Had Arthur’s mother Ygerna emailed me, I would have used her name instead.

  19. Greg Kuperberg Says:

    The more I think about this class YP, the more I like it. It is neat that it contains ZPP by a derandomization argument. And I think that I have the right generalization to YA.

    Namely, Arthur is a yes-no-dunno-valued randomized machine. If he gets good advice (from his mother, naturally), then he reports the correct answer with probability 2/3 for all inputs. If he gets bad advice, then either the probability of dunno is at least 1/2 (say), or the correct answer is more than twice as likely as the incorrect one. I am not completely sure, but as I go over it in my mind, this is not the same as applying a “Y” operator to 3-valued BPP, for the same reason that ExistsBPP is different from MA.
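
    In symbols: with good advice a_n, Pr[ M(x, a_n) = L(x) ] >= 2/3 for every x in {0,1}^n; with bad advice a, for every x either Pr[ M(x, a) = dunno ] >= 1/2 or Pr[ M(x, a) = L(x) ] > 2 * Pr[ M(x, a) = 1 - L(x) ].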

    Meanwhile I note that if you want to know that there is an oracle such that YP = NP but P != NP, it is enough to know that ZPP subset YP subset NP for all oracles, and to read the Zoo entry for ZPP. The Zoo entry says that there is an oracle such that ZPP = EXP. In that world, ZPP != P, because it equals EXP instead, and everything in between ZPP and EXP also equals EXP.
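
    Written out: relative to that oracle A we have ZPP^A subset YP^A subset NP^A subset EXP^A = ZPP^A, so all four classes coincide; in particular YP^A = NP^A, while P^A != EXP^A = NP^A by the (relativizing) time hierarchy theorem.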

  20. Scott Says:

    Meanwhile I note that if you want to know that there is an oracle such that YP = NP but P != NP, it is enough to know that ZPP subset YP subset NP for all oracles, and to read the Zoo entry for ZPP.

    Duhhhh, thanks! That’s the Book proof I was looking for…

  21. Anonymous Says:

    Regarding names, although those mentioned seem to have stuck, I’d just call it an NP-uniform circuit. Given the circuit, you can verify its validity, no?
    Gilad.

  22. Lance Says:

    Based on the comments and the fact that ZPP is in YP, I misunderstood the definition. It’s not that there is some advice string that you can verify outright; rather, there is a single advice string such that, for any given input, you can check whether that advice lets you decide the input.

    So P^{Tally-(NP intersect coNP)} is in YP, but possibly not the other way around.

  23. Scott Says:

    Thanks, Lance! That’s a subtle distinction that flew right past me. (Though if you’d said my class was AWPP^NEXP, I probably would’ve taken you at your word…)

  24. Greg Kuperberg Says:

    I confess that I have been a bit naughty this morning in my volunteer work for the municipal Zoo. I did catch a mistake, though.

    Take a look.

    PS: It takes absolutely painstaking care to change the Complexity Zoo wiki page without messing it up. If you have worked on it for a long time, you may be used to it, but you can’t expect volunteers to get it right. Wikipedia is supported by a variety of “bots” that fix small problems.

    It doesn’t help that it takes 20 seconds for the page to come back after you make changes.

  25. Cheshire Cat Says:

    Scott, I agree completely with your point about the notation. And as far as interestingness goes, I do believe that YP is a more interesting class than I gave it credit for initially…

    This being a blog and all, you may find this vaguely disturbing. But that’s real life for you. Sometimes people happen to agree with you; you’ve got to grin and bear it.

  26. Scott Says:

    Sometimes people happen to agree with you; you’ve got to grin and bear it.

    I try, I really do, but my impatience wears thin…

  27. Scott Says:

    Greg: I finally had some time to look over your vandalism. 🙂 A few comments:

    (1) I’m writing the first paper involving this class, and I’m gonna call it Yaroslav. (How do you even pronounce “Ygerna” anyway?)

    (2) I don’t understand your claim that YQP=BQP. Could you sketch a proof for me? Maybe then I’ll see the ambiguity in the definition, if there is one.

  28. Venkat Says:

    Your class is related to another class that my friend and I have been thinking about. This class, which we call ONP, is similar to your class except that we consider NP in place of (NP cap coNP). ONP is a restricted version of NP: for every length n, there should exist a witness w_n such that w_n is a witness for every string of length n in the language.

    Thus, in ONP, the witness depends only on input length and not on the input itself. In other words, the witnesses are oblivious of the input. So, we call this class Oblivious NP, or ONP for short.

    Then your class YP = ONP cap co-ONP.

    Regarding your question of whether
    “NP subset YP ==> NP = P”: this is an interesting question and I have no comments. But we can make some easy remarks:
    (1) NP subset ONP <==> NP subset P/poly.
    (2) NP = (ONP cap coONP) ==> NEXP = coNEXP.
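
    In symbols: L is in ONP iff there is a polytime M and a polynomial p such that for every n there is a single w_n in {0,1}^{p(n)} with M(x, w_n) = 1 for every x in L of length n, while M(x, w) = 0 for every x of length n outside L and every w. (In particular, ONP is contained in both NP and P/poly.)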

  29. Greg Kuperberg Says:

    (1) I’m writing the first paper involving this class, and I’m gonna call it Yaroslav. (How do you even pronounce “Ygerna” anyway?)

    If you could accept that Ygerna is also a well-motivated metaphor for advice that needs to be checked, then I promise not to delete any instance of “Yaroslav” in the Zoo. (Well, you may be entitled to that promise regardless, but still.)

    See, I have the whole Arthurian legend, or at least an adaptation to complexity theory, figured out. Arthur is a protagonist. Arthur may get a proof from Merlin, who knows everything but is manipulative and cannot be trusted for that reason. Or Arthur may get advice from an angel. The angel also knows everything, but only gives advice, not answers. Or Arthur may be replaced by Percival, who is faithful but lacks the imagination to make random choices.

    Or, in this case, Arthur may remember advice from his mother. In contrast to Merlin, his mother’s advice is impartial. In contrast to angelic advice, Ygerna’s advice is fallible. (Isn’t that the way it is for most of us?)

    Anyway, I doubt that you will go far wrong if you pronounce Ygerna like Yvonne.

  30. Greg Kuperberg Says:

    (2) I don’t understand your claim that YQP=BQP. Could you sketch a proof for me? Maybe then I’ll see the ambiguity in the definition, if there is one.

    You said in the definition of YQP that the probability of the Maybe output for any fixed input is only allowed to lie in [0,1/3] union [2/3,1]. And you also demanded that there exists an advice state for which it is 0. Since the quantum advice space is a connected manifold, the interval [2/3,1] is unreachable.

    After all, we’ve been here before. You were explaining to me the difference between MA and ExistsBPP. If you made Arthur in YQP classical, your class would be the Ygerna operator applied to BPP, much like ExistsBPP is the Merlin operator applied to BPP. It would not be YA. As it turns out, the distinction becomes even more important in the quantum case. ExistsBQP, if anyone cared to define that, would equal BQP by this continuity argument. So QMA depends even more strongly on the subtlety in MA. QYA and YA are similar.

    (Which is why I made their names parallel to MA and QMA. At one point I put in the Zoo the even cheekier remark that, given that the definition of YQP is bad, it’s a good thing that the name is bad too. 🙂 )

    (You could also have YCQA, of course.)

  31. Scott Says:

    You said in the definition of YQP that the probability of the Maybe output for any fixed input is only allowed to lie in [0,1/3] union [2/3,1].

    No, I said no such thing. When I said “BQP machine,” I thought it was obvious that I meant “syntactic BQP machine.” I’ll change the entry to say that explicitly.

  32. Scott Says:

    The problem with YA and QYA is just that they don’t “scan.” If you had never seen them before, you wouldn’t even guess they were complexity classes. And if you did figure out that A was for Arthur, you’d guess that they were “sort of like” MA and QMA respectively.

    But I don’t think they are. For one thing, unlike the MA classes they’re closed under complement. They’re also in P/poly and BQP/qpoly respectively, which to me makes them much “closer” to P and BQP than to MA and QMA. With “YQP”, I’m trying to evoke a new variant on BQP (which is basically how I see it). Likewise, I think the class you call YA should be YPP.

    In short, I’m trying to speak Theorese like a native, not like a logical and conscientious foreigner. 🙂

  33. Greg Kuperberg Says:

    No, I said no such thing. When I said “BQP machine,” I thought it was obvious that I meant “syntactic BQP machine.” I’ll change the entry to say that explicitly.

    No, not there. You said that Arthur (who you do not call Arthur) “outputs either the correct answer or ‘I don’t know’ with probability at least 2/3”. That implies that P[maybe] is not in the interval (1/3,2/3). That’s where the mistake is.

    The problem with YA and QYA is just that they don’t “scan.”

    Because they don’t end in P. I’m sure that the same was said about MA, at first. However, the quantum class that you want is clearly Arthurian.

    And it is not my intuition that YP is all that close to P, or that YQP is close to BQP. I suspect that if YP = P, there is some unlikely collapse somewhere else. As Lance said, YP contains P^{Tally-(NP intersect coNP)}.

  34. Greg Kuperberg Says:

    Likewise, I think the class you call YA should be YPP.

    So what do you want to call QCYA? (I put it in the wrong order before.) After all, we have a paper on these issues.

  35. Scott Says:

    So what do you want to call QCYA?

    CYQP?

  36. Scott Says:

    You said that Arthur (who you do not call Arthur) “outputs either the correct answer or ‘I don’t know’ with probability at least 2/3”. That implies that P[maybe] is not in the interval (1/3,2/3). That’s where the mistake is.

    No, I meant (either the correct answer or ‘I don’t know’) with probability at least 2/3.

  37. Greg Kuperberg Says:

    CYQP?

    Blech. I maintain that YQP has important features in common with QMA, in particular that it is unlikely to equal BQP.

    No, I meant (either the correct answer or ‘I don’t know’) with probability at least 2/3.

    Even if you want to call it YPP (which would be unfortunate, given its unlikelihood of equalling BPP, but hey, it’s your class), you really should have this class in the Zoo, and you really should explain the probability bounds there and not in the quantum case.

    While you define the classical randomized one, it is worth pointing out that the unintended reading of your probabilities (which is still plausible even with your amended phrasing, I think) leads to a class very much like ExistsBPP. I.e., it would be the deterministic Y operator applied to BPP.

    I think that it would be clearer to say it the way that I do for YA: Either the probability of Maybe is at least 2/3, or the correct answer is at least twice as likely as the incorrect one. Of course the thresholds in this phrasing are not exactly the same, but it is equivalent modulo amplification.

    Or if you like your thresholds better, you could say more explicitly that the sum of the probabilities of Maybe and the correct answer is at least 2/3.

  38. Greg Kuperberg Says:

    Well, I guess you don’t quite say that you think that YP = P either, only that it is similar to P in the sense that P/poly is similar to P. So I guess you have your own sort of parallelism in mind.

    Alas, the Zoo is already hopelessly inconsistent, and there is more than one mode of parallelism, which leads to conflicts. So it goes.

  39. Scott Says:

    Venkat: Thanks for your comment!

    If you’re willing to say, what led you to think about ONP? To me it seemed like ONP intersect coONP is the “more basic” class (since if there’s just one witness per input length anyway, then it seems strange to worry about accepting vs. rejecting). But I’d love to be convinced otherwise.

    Here’s a further observation along your lines: if P=YP then E=NE. I haven’t been able to prove the converse. I can, however, prove that if E^NP^NP = NE^NP^NP then P=YP.

  40. Andy D Says:

    Ryan’s result, relayed by Scott at 7:36, begs to be strengthened, but probably can’t be:
    here’s an oracle relative to which there’s a language L in coNP^A, NP^A, and P^A/poly simultaneously, without being in YP^A (I won’t prove the latter, but will try to make it seem reasonable; current messy proof available upon request).

    Make the following constraints:

    i) L is sparse (in fact, |L intersect {0,1}^n|

  41. Andy D Says:

    Greg said:

    I suspect that if YP = P, there is some unlikely collapse somewhere else.

    Maybe, but bear in mind: YP is in NP intersect coNP;
    in ECCC TR06-024, Buhrman, Fortnow, et al. showed an oracle where the PH is infinite yet P = NP intersect coNP (among other features).

    Which occasions a big-picture question I’ve wanted to ask complexity theorists:

    The hypothesis ‘P != NP’ historically seems to have been eclipsed by the more fertile hypothesis ‘PH is infinite’ as a guide to research. But is the latter now exhausting its fertility as a source of interesting, provable implications? If so, what are attractive candidates for the next big hypotheses?

  42. Scott Says:

    Thanks, Andy!

    As I said before, one can show that if P=YP then E=NE and EXP=NEXP.

    Regarding your “big-picture” question, right now we have more fertile hardness assumptions than we know what to do with! Here are just a few of them:

    * E!=NE (even EE!=NEE and EEE!=NEEE have been used…).
    * NP is hard on average.
    * One-way functions exist.
    * The Unique Games conjecture.
    * BPP!=BQP.
    * E requires exponential-size circuits.
    * FPT!=W[1] (the fixed-parameter tractability hypothesis).
    * Permanent does not have poly-size arithmetic circuits.

    I’m sure people can suggest others that I forgot…

  43. Andy D Says:

    Thanks, Scott. I’m aware of these, and some of their uses, which obviously represent an advance. I just want to say that none of them orients the field as comprehensively as ‘PH infinite’ seems to have done for a time, and that their number is now bewilderingly great, as you say (although there are interconnections). But maybe this is the inevitable progression.

  44. Venkat Says:

    Regarding ONP: Yes, I too agree that YP is more basic than ONP. We came up with ONP while trying to improve the Karp-Lipton theorem. Assuming NP subset P/poly, we want a class C such that PH = C, and we would like to make C as weak as possible. We failed in this pursuit when we considered various known complexity classes as choices for C. So, we concocted ONP and (via a simple proof) showed that PH = ZPP^ONP. And this is the best we could do.

    We got more interested in ONP after realizing that ONP forms a hierarchy similar to PH. Namely, P, ONP, coONP, ONP^ONP, co-ONP^ONP, etc. This whole hierarchy is contained in O_2^p (the oblivious analog of S_2^p). Thus, O_2^p is like PSPACE and the above ONP hierarchy is like PH. And we found it curious to see such a situation existing inside P/poly.

    One question: I don’t see why
    YP = P ==> NE = E.
    I was only able to show that
    ONP = P ==> NE = E.
    I am sure I am missing something (and also I did not read all the comments to this post carefully).

  45. Scott Says:

    Oops! I meant that if P=YP, then E=(NE intersect coNE). Proof: Use the YP algorithm to guess all 2^n NE and coNE witnesses for length-n inputs.
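
    In a bit more detail: given L in NE intersect coNE with witnesses checkable in time 2^{cn}, pad each x of length n out to length N = 2^n. The padded language is in NP intersect coNP, and in fact in YP: the advice for length N is the list of NE and coNE witnesses for all 2^n strings of length n (total size 2^{O(n)} = poly(N)), and the verifier just looks up the claimed witness for the given x and checks it, saying “don’t know” if the check fails. If P = YP, the padded language is in P, and unpadding puts L in E.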

    Thanks to Andy D. and Venkat for catching my oversight.

  46. Scott Says:

    Oops again! Lance has pointed out that for the other direction, I meant that if E = NE^NP^NP then P=YP. Here’s a proof:

    Suppose E = NE^NP^NP; then we can simulate a YP machine as follows. Given an input of size n, for each k from 1 up to poly(n), feed <n,k> to an E machine. The E machine answers the following question:

    Does there exist a witness w of size poly(n), such that
    (1) For all inputs x of size n, the YP machine produces an answer given x and w.
    (2) For all w’ that lexicographically precede w, there exists an x of size n such that the YP machine produces no answer given x and w’.
    (3) The k^th bit of w is a 1.

    From the answers to these questions, we can read off the lexicographically first valid YP witness, then use that witness to decide any input of size n we want.
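
    In pseudocode, with bit_oracle(n, k) a hypothetical stand-in for the assumed E machine answering the question above, and yp_machine(x, w) for the YP machine:

        def simulate_yp_in_p(x, yp_machine, bit_oracle, advice_len):
            # Recover the lexicographically first good advice string bit by bit
            # via the poly(n)-many queries <n, k> described above, then run the
            # YP machine on x with that advice.
            n = len(x)
            advice = ''.join('1' if bit_oracle(n, k) else '0'
                             for k in range(1, advice_len + 1))
            return yp_machine(x, advice)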

  47. Venkat Says:

    E = NE^NP^NP ==> YP = P.
    That’s a nice downward collapse.
    But I could not understand the proof. I guess I am missing something in the above proof.

    How does the P machine (that tries to simulate the YP computation) run the E machine in polynomial time?

    Moreover, where is the assumption E=NE^NP^NP used? It seems the specified computation on (n,k) can be carried out by an E machine (without any assumptions).

    Did you mean that if E = NE^NP^NP then we get some collapse involving YE?

  48. Scott Says:

    No. <n,k> is an input consisting of O(log n) bits, and the question I described is an NE^NP^NP question about that input. So if E = NE^NP^NP, that question can be answered by an E machine, which on an O(log n)-bit input runs in 2^{O(log n)} = poly(n) time; that is how the whole search for the witness runs in polynomial time.

  49. Anonymous Says:

    If I’m not mistaken, that complexity class could be named BV for obvious reasons.

  50. venkat Says:

    Oops, sorry. Got your proof. 🙂