don't encrypt all the things
A while back, I observed that https is a sign of serious business. Google recently decided something similar. At the time, it was mostly a curiosity. “Hey, you got your not serious lolcats in my serious dogecoins!” After a few recent developments, I’ve been thinking about it a bit more.
Long ago, SMTP relay traffic was unencrypted. Then came the great NSA freak-out. People in submarines were tapping undersea cables and reading my email. So I did what any sensible lemming would do. I created some certs and turned on TLS. Then came Heartbleed. Suddenly the set of people who could read my email went from “people in submarines” to “people who can access GitHub”. Not strictly an improvement.
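For the record, “turning on TLS” for relay traffic mostly means opportunistic STARTTLS. Here’s a minimal sketch in Python for illustration, with placeholder hostnames and addresses, not my actual setup. Note that nothing verifies the other side’s certificate, which is typical for relay traffic and part of why the protection is thinner than it looks.

    # Opportunistic STARTTLS for outbound relay: encrypt if offered,
    # otherwise send in the clear anyway. Hostnames are placeholders.
    import smtplib
    import ssl

    def relay(msg: bytes, mail_from: str, rcpt_to: str, mx_host: str) -> None:
        ctx = ssl.create_default_context()
        # Relay TLS mostly runs against self-signed certs, so in practice
        # verification is off; any cert (and any man in the middle) will do.
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        with smtplib.SMTP(mx_host, 25, timeout=30) as smtp:
            smtp.ehlo()
            if smtp.has_extn("starttls"):
                smtp.starttls(context=ctx)
                smtp.ehlo()
            # If STARTTLS wasn't offered, the mail still goes out unencrypted.
            smtp.sendmail(mail_from, rcpt_to, msg)

    relay(b"Subject: hi\r\n\r\nhello", "me@example.org", "you@example.net", "mx.example.net")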
Heartbleed was certainly an anomaly, but the fact remains that the two crypto standards people turn to, TLS and PGP, are insanely complex. There are bugs in the standards and bugs in the implementations. Enabling them exposes a great deal more attack surface to the internet at large. Worst case scenario: net security negative.
More recently, Genkin et al. are back with a power-analysis attack, building on their previous acoustic attack. (The acoustic attack definitely triggered an “Inconceivable!” response from me.) One thing to note is that the attack scenario depends on having GnuPG configured to automatically decrypt email. What? Why would you do this?
Apparently, these guys are living in the future. All their email is encrypted. Even their spam is encrypted. Therefore, in order to apply spam filtering to it, it has to be decrypted. It would be very burdensome indeed to do so by hand; instead everything is decrypted automatically. Setting them up for the attack. A few unrelated thoughts on encrypted spam filtering.
“Enigmail automatically decrypts incoming emails by passing them to GnuPG, which uses Libgcrypt as its cryptographic engine.” From “A Microarchitectural Side Channel Attack on Several Real-World Applications of Curve25519.”
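In case it isn’t obvious what that looks like in practice, here’s a hypothetical sketch of the pattern, not Enigmail’s actual code: every incoming message gets piped through gpg with no human in the loop, which is exactly the precondition the attack needs.

    # Hypothetical mail-filter hook, not Enigmail's code: every message,
    # spam included, gets the private key applied to it automatically.
    import subprocess

    def decrypt_for_filtering(raw_message: bytes) -> bytes:
        # --batch means never prompt; the agent supplies the key material,
        # so decryption happens silently for whatever shows up.
        result = subprocess.run(
            ["gpg", "--batch", "--quiet", "--decrypt"],
            input=raw_message,
            capture_output=True,
        )
        # If it wasn't encrypted (or decryption failed), filter the raw bytes.
        return result.stdout if result.returncode == 0 else raw_message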
On the other hand, if I receive only a very limited amount of encrypted email, I’ll probably deal with it by hand. Meaning I’m probably going to notice when a barrage of encrypted junk mail arrives. I may even decide that since encrypted email is serious business, I’m not going to read it right here and now in the coffee shop while some weirdo keeps petting my laptop.
The more encryption there is, the less human interaction there can be. The computer runs the show. Computers, however, are not very good at identifying anomalous patterns. Automation increases what I’ll call temporal attack surface. The microphone/ground-wire attack only works if the attacker can induce your computer to perform certain operations within a window of opportunity. In theory, our private key is sensitive information, but when we use it to automatically decrypt all manner of nonsense being thrown at us, it is desensitized.
Maybe you could decrypt only email that’s signed by somebody you trust. Or by somebody in your web of trust. I have little confidence in that idea. Let’s call a web of trust what it is: a social network. Oh yeah, lots of trust.
Consider the related issue of code signing infrastructure. For one example, Adobe lost control of their signing cert not because the signing machine was directly compromised, but because, in the interest of making things easier than physically carrying CDs across the room to sign them, they built a system where a compromised build server could request arbitrary badness to be signed. They never lost the private key itself, but they had to revoke the cert anyway. We generally think of the private key’s bits as the important part, but really it’s “signing capability” that needs protection. Adobe’s private key was not exposed to the internet, but their signing capability was. Game over.
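The anti-pattern, reduced to a sketch with made-up names (nothing like Adobe’s actual infrastructure, but the same shape): the key bits never leave the signing box, yet anything that can reach it gets a signature.

    # A signing service that signs whatever it's asked to. The private key
    # stays on this machine; the signing capability is one POST away.
    # Assumes an RSA key in signing_key.pem.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    with open("signing_key.pem", "rb") as f:   # key never leaves this box
        KEY = serialization.load_pem_private_key(f.read(), password=None)

    class SignAnything(BaseHTTPRequestHandler):
        def do_POST(self):
            payload = self.rfile.read(int(self.headers["Content-Length"]))
            # No review, no allow-list: a compromised build server can have
            # arbitrary badness signed without ever touching the key.
            sig = KEY.sign(payload, padding.PKCS1v15(), hashes.SHA256())
            self.send_response(200)
            self.end_headers()
            self.wfile.write(sig)

    HTTPServer(("0.0.0.0", 8000), SignAnything).serve_forever()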
Same thing happened to HP. I have a hard time understanding how HP’s signing infrastructure could be “100 percent intact” when there is malware with HP’s signature on it. How was it possible for a developer to accidentally sign malware, except by making that operation too easy? Effortless process, careless results.
Turning now to “What’s the matter with PGP?”, just published. PGP/GnuPG are terrible in many ways. A big part of the problem is key management and making things easy. I’ve been wondering for some time now, though, whether we should be making things easy.
Making things easy means making them transparent. It means pushing the crypto away from the user. That’s how we end up with our lolcats and dogecoins all mixed up together.
And here’s how that happens. UI transparency led to the unintentional disclosure of sensitive information. If sensitive messages had been sent as encrypted attachments, not handled transparently by the mail client, the user would have known the original had been encrypted, and would have been aware, before clicking send, that they were about to transmit in the clear.
Here’s a doozy. Outlook sent plaintext versions of encrypted emails.
Maybe some things should be hard, to remind us that they are important.