How large would the key have to be to make the NFS precomputation infeasible, even for a state entity with lots of resources, given whatever hardware scale-up rates we foresee for, say, the next decade?
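One rough way to put numbers on this is the standard GNFS heuristic running time L_N[1/3, (64/9)^(1/3)]. The sketch below drops the o(1) term, so the absolute figures are only indicative, but the ratios show how steeply the precomputation cost grows with key size:

```python
import math

def gnfs_cost(bits):
    """Heuristic GNFS work factor L_N[1/3, (64/9)^(1/3)] for an N of the
    given bit length. The o(1) term is dropped, so treat absolute values
    as ballpark estimates only."""
    ln_n = bits * math.log(2)
    c = (64 / 9) ** (1 / 3)  # ~1.923
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

base = gnfs_cost(1024)
for bits in (1024, 2048, 3072, 4096):
    cost = gnfs_cost(bits)
    print(f"{bits}-bit: ~2^{math.log2(cost):.0f} ops "
          f"({cost / base:.1e}x the 1024-bit cost)")
```

Under this heuristic a 1024-bit modulus comes out under 2^90 operations, which is roughly why 1024-bit precomputation is considered within reach of a state actor, while 2048 bits and up push the cost past 2^110.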

While NP being in P/poly doesn’t imply that P = NP, it would be almost as shocking as finding out that P = NP. If we found out that NP was in P/poly, we’d have to reevaluate our entire approach so fundamentally that any degree of confidence that P != NP would have to be put aside until we had very carefully examined the proof that NP is in P/poly and understood exactly where it came from and what the limitations of the technique were.

Standard primes have the big advantage that they can be specifically studied for weaknesses and implementation tricks. There are even standards for what they should be: https://tools.ietf.org/html/rfc3526
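One property that makes the RFC 3526 MODP primes easy to audit is that they are safe primes, p = 2q + 1 with q also prime, which pins down the group’s subgroup structure. A minimal sketch of checking that property with a from-scratch Miller-Rabin test (toy-sized inputs here; the real groups are 1536 bits and up):

```python
import random

def is_probable_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def is_safe_prime(p):
    """True if p = 2q + 1 with both p and q prime, the form the
    RFC 3526 MODP primes all share."""
    return is_probable_prime(p) and is_probable_prime((p - 1) // 2)

# 23 = 2*11 + 1 is a toy safe prime; 29 = 2*14 + 1 is not safe.
print(is_safe_prime(23), is_safe_prime(29))  # True False
```

The same `is_safe_prime` check runs fine on the full hex values published in RFC 3526, since Python handles big integers natively.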

As others who beat me to these points have already noted, the one and only thing really needed here is increasing the key sizes.

Arbitrary OWFs being efficiently invertible is something I’d regard as *much* more surprising, almost as surprising as P = NP.

As a final remark, I doubt there’s a great deal of difference between the uniform and nonuniform cases (i.e., between P and P/poly) for any of these questions.

In any case, do you think integer factoring or DLog being in P/poly is a reasonable possibility? Or is this also a ‘when pigs fly’ scenario? Either integer factoring or DLog being in P/poly does not seem to affect other complexity classes. More generally, is every OWF being invertible in P/poly a reasonable possibility?
