Planning defeat

Today’s Crypto-Gram newsletter by Bruce Schneier contains a very interesting piece on digital eavesdropping capabilities and how governments use new laws to pressure manufacturers into integrating surveillance systems into end-user products.

Planning a back door into your product is a bit like putting a self-destruct mechanism on a plane. Now, not only do you have to make sure normal operations are secure, but you also have to be damn sure that this new system is as well. In short, three times the amount of work (the main system, the back door, and the interaction between the two) for, ideally, the same result.

Recently, a friend of mine locked her door nice and tight but forgot to close the window. Everybody laughs at that. So why don’t they laugh at putting a back door in a piece of software? Common sense says it’s exactly the same problem: it gives two ways to access the contents, each with a different security system. Granted, both security systems could be good ones, but they are by nature different. That means the weak link in the security chain (the human) is burdened twice, and that a potential bad guy has an extra means of access. It does seem like a lose-lose situation, doesn’t it?
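To make the “two ways in, two different security systems” point concrete, here is a minimal Python sketch. Every name and value in it is invented for illustration; it is not taken from any real product. The front door does a proper salted-hash check, while the back door is a hard-coded maintenance key, and the overall security collapses to whichever path is weaker:

```python
import hashlib

# Hypothetical example: a service with two independent ways in.
FRONT_DOOR_SALT = b"s3kr1t-salt"
FRONT_DOOR_HASH = hashlib.sha256(
    FRONT_DOOR_SALT + b"correct horse battery staple"
).hexdigest()

# The "window left open": a hard-coded back-door key.
BACKDOOR_KEY = "maintenance-2003"

def check_access(secret: str) -> bool:
    # Path 1: the advertised, carefully audited check.
    attempt = hashlib.sha256(FRONT_DOOR_SALT + secret.encode()).hexdigest()
    if attempt == FRONT_DOOR_HASH:
        return True
    # Path 2: the back door. The system is now only as strong
    # as this single string comparison.
    return secret == BACKDOOR_KEY
```

An attacker who learns (or guesses, or finds in a binary dump) the back-door key never needs to touch the front door at all: `check_access("maintenance-2003")` succeeds just as well as the real passphrase.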

So why do people keep doing it?

Well, there are two main reasons, and a few corollary ones:

• It’s reassuring.

In case the main door is unavailable (for instance, when attackers have got inside your castle and control the main access points), you still have a way in. People have been doing this for millennia. All the legends and stories talk about a “secret entrance” that the bad guys don’t know about. Sometimes it’s a secret exit, but the principle is the same. Its security, however, depends entirely on secrecy. You can’t tell anyone about it. The very guards who swore to protect your castle can’t know about it, because even if they are loyal, their watching over this secret entrance will draw attention to it. But it also means you won’t know if anyone else gets privy to the information. And so, if anyone else learns about your secret entrance/exit, you are defenseless against them should they decide to use it for “bad” ends.

On the one hand, a backup is always good: you have something to fall back on in case everything goes pear-shaped. On the other hand, a backup is very often incompatible with security in the sense of the privacy/secrecy of the contents.

But I can hear you say, “I could put a very good security system on my backup plan, so that no one other than me can access it!” Yes, you could. Again, remember Ali Baba. The cave is protected by a password, in addition to being secret. But you would have to make the password/key/etc. completely secure, which means you wouldn’t be able to open it easily. Which means, given human nature, that after using it three or four times you would revert to a somewhat simpler security scheme, and rely almost exclusively on the secrecy to protect it. That’s how the forty thieves got robbed.

• It’s a power trip.

You and you alone know a secret access to a place that may or may not be yours. Like the old myth of invisibility, you have a way to control and/or check what others think of your whereabouts.
Everyone might assume you are safely locked in the dungeon while you are actually having fun at the nearest pub. Or the people inside the dungeon may think they can keep you out, when in reality you could surprise them inside any time you like.

This is the main argument behind the laws mentioned in Crypto-Gram: the bad guys don’t know that you are listening in on everything they say. Therefore they will chat quite freely about their mischievous plans to take over the world, and the arm of the law will catch them with their pants down.

That’s all well and good in theory, but don’t they think the bad guys might know this already? After all, when there is a war going on, even the “good guys” (i.e. always “us” as opposed to “them”) talk in code, so that even if a conversation is intercepted, the “bad guys” (the same, but reversed) won’t understand it. So if the bad guys are aware that their calls might be monitored (and I don’t see how they would not be), they will take steps to secure them. Which means that only people who are unaware they could be monitored (i.e. people who don’t think they are doing anything wrong enough to warrant the police’s attention) will be caught. Are these really the kind of bad guys we (as a society) want to catch at all costs? Even if it means giving the real bad guys a potential way into a very sensitive place?

[Fast forward]

In software, the back door is even more elusive, because very little leaves a trace that can’t be erased. So if a bad guy has the secret key, there’s a good chance they can perpetrate a lot of bad deeds without anyone ever knowing. Can anyone risk that just for an ego trip?

