The SSL Certificate of Damocles

Ever since I “upgraded” this website to use SSL, it’s become completely inaccessible once every three months, because the SSL certificate expires. Several years in, I’ve been unable to find any way to prevent this from happening, and Bluehost technical support was unable to suggest any solution. The fundamental problem is that, as long as the site remains up, the Bluehost control panel tells me that there’s nothing to do, since there is a current certificate. Meanwhile, though, I start getting menacing emails saying that my SSL certificate is about to expire and “you must take action to secure the site”—never, of course, specifying what action to take. The only thing to do seems to be to wait for the whole site to go down, then frantically take random certificate-related actions until somehow the site goes back up. Those actions vary each time and are not repeatable.

Does anyone know a simple solution to this ridiculous problem?

(The deeper problem, of course, is that a PhD in theoretical computer science left me utterly unqualified for the job of webmaster. And webmasters, as it turns out, need to do a lot just to prevent anything from changing. And since childhood, I’ve been accustomed to countless tasks that are trivial for most people being difficult for me—if that ever stopped being the case, I’d no longer feel like myself.)

37 Responses to “The SSL Certificate of Damocles”

  1. Noah Lidell Says:

    https://letsencrypt.org/getting-started/

    Let’s Encrypt is the way to go. It auto-renews the cert for you, and it’s generally easy to use and manage. You should tell your hosting support people about this, as they are dropping the ball if they are not using Let’s Encrypt and their users have to manually renew their own certs all the time.

    If you have shell access to the box it is easy to do yourself.

  2. Scott Says:

    Noah #1: Thanks! But I tried Let’s Encrypt, and had exactly the same issue of the certificate expiring over and over. Given many experiences over the past 13 years, my guess is that the problem lies with Bluehost. Maybe I’ll try again…

  3. Hal Says:

    The useful thing about Let’s Encrypt is that there are plugins that manage SSL certificate renewals. There’s one for cPanel, which I believe is what you’re using to manage your account with Bluehost. If so, have you enabled:
    (WHM >> Home >> SSL/TLS >> Manage AutoSSL)

    Otherwise, it’s just a Bluehost problem. Or quantum interference. Sorry.

  4. Danijel Kecman Says:

    Yeah, Bluehost sucks. I hosted with them once and ran away in horror. A great thing I’ve found is static site generators for websites and blogs, hosted on GitHub. Hugo is a great engine, and there’s a WordPress-to-Hugo exporter (or you can use any other static site generator). Write in Markdown, and you can render LaTeX with an engine like KaTeX: https://katex.org/

  5. Carl Says:

    For what it’s worth, I made a command line utility that will tell you when a certificate expires. If you run it in a cron job you can have it yell at you if the expiration date is too soon. https://github.com/carlmjohnson/certinfo
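    A minimal version of such a check can be written with Python’s standard library alone; the following is a sketch (the 14-day `warn_days` threshold is an arbitrary choice), not a substitute for Carl’s tool:

```python
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(not_after: str) -> int:
    """Parse a certificate's 'notAfter' field (e.g. 'Jun 15 12:00:00 2030 GMT')
    and return the number of whole days until it expires (negative if expired)."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return (expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).days

def check_cert(host: str, warn_days: int = 14) -> int:
    """Fetch the certificate that `host` serves on port 443 and return the days
    until it expires, printing a warning if that is less than `warn_days`."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    days = days_until_expiry(cert["notAfter"])
    if days < warn_days:
        print(f"WARNING: certificate for {host} expires in {days} days")
    return days

# Example (requires network access): check_cert("example.com")
```

    Dropped into a daily cron job, this complains well before the three-month deadline rather than after it.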

  6. Job Says:

    Meanwhile, though, I start getting menacing emails saying that my SSL certificate is about to expire

    Are these emails from BlueHost, or another service? Seems weird that your host can’t help you with this, and kind of suggests it’s something else.

    Do you use any type of reverse-proxy stuff for caching and performance?

    What error do your visitors see when this happens? Is it a “495 Certificate Error”? I don’t remember seeing cert errors on your site, but I have seen what looked like caching-related problems from time to time.

    Your site is using NginX. Did you do any type of setup related to NginX or is it managed by BlueHost?

    Basically, how custom is your hosting setup?

  7. theotherplanb Says:

    I migrated https://lvaas.org from Bluehost to DreamHost last year precisely because Bluehost had poor support for SSL. Dreamhost supports Let’s Encrypt and they just take care of it for me, automatically, and so far flawlessly. I am super-happy with DreamHost and highly recommend them.

  8. Scott Says:

    Danijel #4 and theotherplanb #7: Thanks. For probably a decade, I have … err … dreamed of migrating to Dreamhost, just as soon as I have a spare month to deal with everything on my site that will inevitably break when I migrate. That spare month has never arrived.

  9. ArshadM Says:

    Scott #8: It is a pain to switch over; probably the easiest approach is to register a temporary domain, get everything copied over and working, and then redirect the existing domain.

    If you need any help, do reach out.

  10. James Hiew Says:

    Cloudflare (https://cloudflare.com/) seems to handle all the SSL certificates for you automatically indefinitely, once you’ve put the legwork in to switch over to their DNS servers.

  11. Sandro Says:

    @Scott #2, Bluehost is probably using Let’s Encrypt as well, because the certificates for Let’s Encrypt are specifically designed to expire every three months. It sounds like the automated renewal scripts aren’t running properly.

    I’m not sure what kind of control panel Bluehost has or how much control you have, but there are many good clients that should explain how to automate the renewal process (or do it for you in some cases).

  12. asrp Says:

    Even if tech support says it’s not possible, that doesn’t mean it really is. But you probably have to do it yourself. I would:

    1. Try Hal #3’s AutoSSL suggestion.
    2. If that doesn’t work, try Noah #1’s suggestion. It looks like Bluehost does have SSH access (https://my.bluehost.com/cgi/help/180), you just need to enable it. (If you want, you can disable SSH access once you have a periodic automatic renewal script set up.)

    For 2, I use acme-tiny (https://github.com/diafygi/acme-tiny) but there are plenty of renewal scripts for Let’s Encrypt around. They are all slightly complex in terms of the number of steps. I don’t know whether there are as many ready-made tools for Sectigo; I’ve never used them.

    Three months is a reasonable lifetime for an SSL certificate, but being unable to renew it beforehand is unreasonable. If this really happens that often, you should have enough time banked in your Ski Rental Problem to explore these solutions or try to switch hosts.

  13. James Cross Says:

    Make your life easy. Move to WordPress.

    I am not a paid endorser, nor do I benefit in any way from the endorsement.

  14. Oliver Friedmann Says:

    Hi Scott,

    Use https://www.cloudflare.com – you can configure it as sort of SSL proxy in front of your non-SSL site. CloudFlare will take care of automatically issuing certificates and renewing them.

    Cheers,
    Oliver

  15. Andrew B. Says:

    CloudFlare is the correct answer. I’ve done this a million times and happy to help. Three steps. (a) Sign up at CloudFlare and add your domain. They’ll automatically port over your DNS settings. (b) Sign in to your domain registrar, and switch the domain’s DNS servers over to CloudFlare. (c) You now support https! If you want *all* requests to be https, there’s a toggle for that under the ‘Crypto’ tab of your CloudFlare settings. No more periodic tasks—no more managing your own certs—it’s just done!

  16. Scott Says:

    Hi everyone, I’m now at Google’s annual quantum computing conference, on a farm outside Santa Barbara.

    Dave Bacon—“quantum computing’s elder clown,” a major early influence on this blog, and now a full-time software engineer at Google—was telling me this morning about how much trouble he’s been having with renewal of SSL certificates every three months. And he hadn’t even seen this post.

    I can barely express how much better this made me feel.

  17. Scott Says:

    Thanks, everyone—it sounds like Cloudflare might be the answer! I have a good friend in Austin who works at Cloudflare, so I’ll ask him if I need help.

  18. Tim McCormack Says:

    While Cloudflare can solve the problem of visitors seeing cert expirations, it introduces a number of new problems:

    – Cloudflare’s default settings, last I heard, include making it difficult for Tor users (and random other people using privacy-conscious browsing configurations) to view your site
    – It also obscures anything that looks like an email address, which is irritating to anyone not running arbitrary Javascript from websites
    – The connection from Cloudflare back to *your* server is not secured
    – And finally, it means they become a man-in-the-middle for a still larger fraction of the web, capable of centralized surveillance or censorship. Cloudflare has direct read/write access to a massive portion of the web’s traffic.

    (The first and last of those are of the most concern to me, personally.)

    If BlueHost is failing to automate Let’s Encrypt cert renewals, it sounds like they’re a pretty bad host—this is an *extremely easy* thing for a web host to do! (With the service I use, it’s not push-button precisely, but you just SSH in and run a single command, once.) I’ll echo what some others have said and encourage you to take the route of getting cert renewal working.

  19. Diego Says:

    Tim #18, your first item isn’t correct. Please see https://support.cloudflare.com/hc/en-us/articles/203306930-Does-Cloudflare-block-Tor- and also https://blog.cloudflare.com/cloudflare-onion-service/

  20. Uncle Brad Says:

    I have a completely off topic question.

    If the current state of the universe is determined by the initial conditions, does that mean that the people who don’t believe in evolution are right? That would really burn my biscuits.

  21. Brock Says:

    Only three months before certificate expiration? That’s a really short period. You don’t have a one-year option?

  22. Douh Says:

    Until the singularity hits or quantum computing renders SSL useless, try buying certificates with a duration of at least one year. You can be trouble-free for up to 825 days (about 27 months): https://www.ssl.com/blogs/ssl-certificate-maximum-duration-825-days/

  23. Scott Says:

    Uncle Brad #20:

      If the current state of the universe is determined by the initial conditions, does that mean that the people who don’t believe in evolution are right?

    I have difficulty understanding why this is even a question. Why couldn’t it be simultaneously true that
    (1) the initial state of the universe determines its entire future evolution, and
    (2) the evolution of the universe includes … well, evolution? 🙂 I.e., the process that Darwin described?

    Note that, in a world where (1) and (2) both hold, the initial state of the universe wouldn’t need to encode any knowledge at all about living organisms—any more than a listing of the axioms of set theory “encodes the knowledge” of why Fermat’s Last Theorem is true. Life could arise later. That’s the whole point of Darwinism.

  24. ppnl Says:

    Scott #23

    What definition of “encoded” are you using?

    With a powerful enough computer, I could derive all the biological information about all species that ever evolved. In that sense, it is encoded. It is just encrypted. Evolution is the method of decryption.

    The only difference between a deterministic universe and a quantum probability universe is one effectively uses a pseudorandom number generator and the other uses real randomness.
    Maybe.

  25. Scott Says:

    ppnl #24: We could have a long and interesting debate about the semantics of “encoded,” but it wouldn’t have any bearing at all on the question I was asked, for the following reason. If you believe that the whole history of life was “encoded” into the simple, low-information initial state of the universe, in the sense that the entire subsequent history of the universe (including the evolution of life, which unfolded exactly as Darwin said it did) could be seen as a “decoding” of the initial state … well, that doesn’t make you a creationist. It just makes you an evolutionist who uses words in a weird and nonstandard way.

  26. ppnl Says:

    Scott #25,

    I don’t think we disagree on the answer to Uncle Brad’s question. It is just that the way you worded it left me confused about some issues.

    First, if the universe is deterministic, then the information content of the early universe must have been the same as it is today. Probably not low. A running deterministic program cannot increase its information content; ditto a deterministic universe. Think Kolmogorov complexity.

    If an all-powerful deity created a deterministic causal universe and set the initial conditions such that we and our entire history evolved then I would say it encoded us. It would be like it wrote a program to create us. If the same universe happened without the intent of a supreme being it would still be identical. I would still say we were encoded in the initial conditions. In both cases, the information content would remain constant.

    And neither has any bearing on our understanding of evolution so I guess it is a change of subject. Sorry.

  27. JimV Says:

    It seems to me, not that I’m an expert, that the Uncertainty Principle would rule out a precise enough setting of initial conditions to produce all the circumstances (including an asteroid wiping out the dominant life forms to leave room for mammals to develop) which over 13 billion years later produced humans. So according to QM as we know it, that initial setting itself should be considered a random event.

    Of course if you believe in a magic universe-creator, that won’t stop you, but you are left with said U-C setting things in motion over 13 billion years ago (in our reference frame) to create a universe of which roughly 0% is habitable by humans, so that in one infinitesimal bit of it, some humans could develop and have a civilization until they plunder all that bit’s resources and pollute it. So I think the standard creationist theory is that everything was created in much its current state 10,000 years or so ago and will end within another 1000 years or so, all by magic, and they don’t need no stinking science. Or if they do try to incorporate some science they just don’t think it through.

    (As a dedicated user of this site, what happened a couple days ago is that I tried to connect to it and my virus-protection software wouldn’t let me because it wasn’t certified.)

  28. Scott Says:

    ppnl #26: There are different notions of “information content” relevant here. If you want to talk about Kolmogorov complexity, then it actually CAN increase with time—albeit, only logarithmically. However, probably more useful is some notion of time-bounded Kolmogorov complexity, which can shoot up much more rapidly with time as complicated structures evolve (cf my paper with Sean Carroll, Lauren Ouellette, et al).

    But the easiest way to dispel confusions—easier by far than arguing about abstract definitions of information content—is to consider a concrete example, like Conway’s Game of Life. If you started a life board in some simple initial state (like only 50 or 100 alive cells) and let it evolve for long enough, on a large enough board, it’s plausible that you’d eventually see complex structures evolve that were subject to Darwinian evolution. But were these creatures “encoded” in the initial state? Only in the sense, perhaps, that the proof of Fermat’s Last Theorem is “encoded” in the axioms of set theory, or only in the sense that your whole identity was “encoded” in your DNA at the moment of conception. I.e., only in a—dare I use the word?—reductive sense, one that short-circuits the entire story by which the thing we’re trying to explain came into being, after that story’s first few lines.

    And yes, JimV #27, of course it’s true that in our quantum-mechanical universe, the evolution of what we see around us is fundamentally indeterministic—it’s only the evolution of the entire wavefunction that’s deterministic. By talking about what would still be true even if the world had been classical, I was trying to make the conclusion only stronger.

  29. ppnl Says:

    Scott #26

    There are different notions of “information content” relevant here. If you want to talk about Kolmogorov complexity, then it actually CAN increase with time—albeit, only logarithmically.

    Really? I take your word for it as you no doubt know this stuff far better than I. But I can’t for the life of me figure where my intuition is failing me. A trillion digits of pi, for example, cannot have a higher Kolmogorov complexity than the relatively small program that produced it. This follows simply from the definition of Kolmogorov complexity. This should be true of any program at all that produces any output at all.

    In a deeper sense, Kolmogorov complexity should be strongly related to Shannon information. After all, taking a large string and deriving a shorter program to produce it is just data compression right? So where am I going wrong?
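    The compression analogy is easy to check empirically, with gzip as a crude stand-in for ideal Kolmogorov compression (a sketch; the data sizes are arbitrary choices):

```python
import gzip
import os

# Highly regular data: describable by a tiny program ("repeat this 10-byte
# pattern 10,000 times"), and gzip finds most of that redundancy.
structured = b"0123456789" * 10_000

# Random data: there is no description much shorter than the data itself,
# so it is essentially incompressible.
random_data = os.urandom(100_000)

print(len(gzip.compress(structured)))   # a small fraction of 100,000 bytes
print(len(gzip.compress(random_data)))  # roughly 100,000 bytes
```

    The regular string compresses to a small fraction of its length while the random one does not, which is the Kolmogorov/Shannon connection described above, restricted to the patterns gzip can find.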

    But the easiest way to dispel confusions—easier by far than arguing about abstract definitions of information content—is to consider a concrete example, like Conway’s Game of Life. If you started a life board in some simple initial state (like only 50 or 100 alive cells) and let it evolve for long enough, on a large enough board, it’s plausible that you’d eventually see complex structures evolve that were subject to Darwinian evolution. But were these creatures “encoded” in the initial state?

    Well yes, obviously. (Actually, I think it is inconceivable that that small amount of information could produce that apparent complexity. But if it did… ) For example, if I start with the large state and derive a smaller state that creates it, then that is just data compression. I have encoded the large state into a smaller state. I can only do this if the large state is not random, and again we have the connection between Kolmogorov complexity and Shannon information. Both the Kolmogorov complexity and the Shannon information of the large state cannot exceed that of the smaller state. Again, what am I doing wrong?

    Only in the sense, perhaps, that the proof of Fermat’s Last Theorem is “encoded” in the axioms of set theory,…

    Again I would say yes, obviously. The choice of axioms is the seed that produces the vast chaotic and computationally intractable set of all that is true in that system as a vast fractal of truth. As an analogy, you can think about how a tiny program can produce the vast chaos of the Mandelbrot set.

    Now the Mandelbrot set has been described as the most complex object in math. Yeah, well… anyway, I think it was Chaitin who pointed out that the study of chaos is the study of simplicity rather than complexity. It is the study of how objects like the Mandelbrot set that look very complex can have very simple structures. He points out that Kolmogorov complexity is the study of complex looking objects that are incompressible and so are really complex. That is the real study of complexity.

    But my intuition seems to be leading me wrong somewhere.

  30. ppnl Says:

    Scott #28,

    I forgot to comment on one thing and I’m back because I think it is important.

    …or only in the sense that your whole identity was “encoded” in your DNA at the moment of conception.

    No, absolutely not. I am not a closed system. For example, a computer program connected to the internet can produce output with arbitrarily high Kolmogorov complexity simply by printing out random web pages. I can also decrease my entropy for the same reason.

  31. Scott Says:

    ppnl: The Kolmogorov complexity of the state of a cellular automaton after t steps will in general grow like log(t), because to specify the state you need to specify t.

    Reducing a person to their DNA is an imperfect analogy for reducing everything that happens in a given universe to its laws and initial state, since as you rightly point out, a person is not a closed system. But it fits the theme of trying to reduce a large unfolding process to a single originating seed. Sometimes that makes sense, but not when aspects of the process are explicitly what interests you (as is the case with evolution).
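    The log(t) bound is easy to see concretely with an elementary cellular automaton: the state after t steps may look complicated (and compress poorly), but it is fully specified by the seed, the rule number, and the roughly log2(t) bits needed to write down t itself. A sketch (the rule number and board width are arbitrary choices):

```python
import gzip

RULE = 30     # an arbitrary elementary CA rule (Wolfram numbering)
WIDTH = 2048  # arbitrary board width, with wrap-around edges

def step(row):
    """One step of the elementary cellular automaton RULE."""
    n = len(row)
    return tuple(
        (RULE >> (row[(i - 1) % n] * 4 + row[i] * 2 + row[(i + 1) % n])) & 1
        for i in range(n)
    )

def packed(row):
    """Pack a 0/1 row into bytes so gzip sees the raw bits."""
    return bytes(
        sum(bit << k for k, bit in enumerate(row[i:i + 8]))
        for i in range(0, len(row), 8)
    )

seed = tuple(1 if i == WIDTH // 2 else 0 for i in range(WIDTH))

state, t = seed, 0
for checkpoint in (10, 100, 1000):
    while t < checkpoint:
        state = step(state)
        t += 1
    # Compare: the gzip length of the raw state (which grows as the pattern
    # spreads) versus t.bit_length(), the only growing part of the short
    # description (seed, RULE, t), which grows like log t.
    print(t, t.bit_length(), len(gzip.compress(packed(state))))
```

    The compressed size of the raw state climbs as the pattern spreads across the board, while the honest description length grows only with t.bit_length().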

  32. ppnl Says:

    Scott #31,

    Yeah. I read your paper last night and noticed that you used gzip to estimate the complexity of the coffee. The first thing that occurred to me is that you don’t need to estimate the complexity since you have a program that produced the output. It can act as an upper bound for complexity. All you have to do is specify the number of generations… yeah, log(t). So simple. I really should have clicked to this. I just wasn’t thinking about how small log(t) is.

  33. AdamT Says:

    Scott #28: “…it’s only the evolution of the entire wavefunction that’s deterministic.”

    That assumes there is such a thing as a universal wavefunction. Rovelli’s relational QM interpretation says no such universal wavefunction exists. That is simply to say there is no such thing as the universal “state of the universe” at any time T. Another way of saying there is no such thing as a universally privileged observer.

  34. Scott Says:

    ppnl #32: No, in the case of our coffee paper we didn’t have the log(t)+O(1) upper bound—firstly because the evolution rule was random, and secondly because we cared about time-bounded Kolmogorov complexity and versions of time-bounded sophistication. It holds if you’re looking at traditional Kolmogorov complexity under a deterministic evolution.

  35. Scott Says:

    AdamT #33: Right, Copenhagenists also deny the existence of a universal wavefunction (indeed, I don’t entirely understand how Rovelli’s view is different).

    Personally, while you can deny that a universal wavefunction is useful or needed, I don’t understand on what grounds you could prevent someone else from considering it as a consistent mathematical construct, as long as you accept the universal validity of QM.

    But that’s beside the point. For what I was saying above, it would suffice to retreat to the weaker claim: if there’s any sense in which the evolution of the universe is still deterministic, even in QM, then it’s that of unitary evolution of the universe’s wavefunction.

  36. ppnl Says:

    AdamT #33,

    I looked at the wiki page on this.

    How is the relational QM interpretation different from decoherence? I think it has long been known that QM has a subjective element to it in that different observers can have different views of the world – the cat in the box for example. I think that is one of the motivations behind many-worlds. Decoherence allows us to analyze this in terms of the thermodynamic contact or lack of it between observers. This creates a kind of informational frame of reference where people in different informational frames can disagree on elements of reality. This seems just like what the relational interpretation is saying but with different words.

    I’m not sure why he makes a point of rejecting the universal wave function. There may be practical difficulties with isolating yourself from the entire universe but if you could you would have to deal with the universal wave function. There is simply no mileage in worrying about it at all.

  37. ppnl Says:

    Scott #34,

    Will have to look at the paper again to see what I’m missing but it is a little above my pay grade. This is just a hobby.

Leave a Reply

Comment Policy: All comments are placed in moderation and reviewed prior to appearing. Comments can be left in moderation for any reason, but in particular, for ad-hominem attacks, hatred of groups of people, or snide and patronizing tone. Also: comments that link to a paper or article and, in effect, challenge me to respond to it are at severe risk of being left in moderation, as such comments place demands on my time that I can no longer meet. You'll have a much better chance of a response from me if you formulate your own argument here, rather than outsourcing the job to someone else. I sometimes accidentally miss perfectly reasonable comments in the moderation queue, or they get caught in the spam filter. If you feel this may have been the case with your comment, shoot me an email.