Security expert Brian Spector looks at how easily General Petraeus' confidential information was stolen and the problems with the current way we store information online.

Authorities had no problem accessing confidential emails belonging to Gen. Petraeus because of the way they were 'secured'

I have been watching the Petraeus affair closely over the past few weeks and, I have to say, it has all the ingredients of a first-class thriller.

The chiselled military hero, the hi-tech espionage, the secret service infighting, the wanton peccadillo; it's basically Skyfall served up in a sauce américaine. And, of course, it actually happened, which is a lot more than you can say about Skyfall.

But what is particularly striking for me is that, many thousands of years into the evolution of civilisation, the Petraeus affair shows quite clearly that we still haven't learnt one fundamental lesson:

"If you can open a window to get in to your house for legitimate reasons, a burglar can do exactly the same thing for non-legitimate reasons."

Translation: if the thing that requires opening can be opened by anyone apart from you, it can, perforce, be opened by people who should not wield that power.

What's this got to do with Petraeus?

This isn't just to do with Petraeus; it is to do with the global senior business, defence, political, cultural, technical and scientific communities of which Petraeus and many others are a part.

These figures prize confidentiality and privacy extremely highly - for professional reasons, certainly, but perhaps for other reasons too - so they ensure that they use various methods of "secure" communication.

But that confidence is largely misplaced. Almost all forms of secure email and file transfer still rely on a stored encryption key.

This is ostensibly designed to keep data private, but in practice it puts the ability to decrypt and read that data squarely in the hands of a third party (namely, the vendor that supplied the security or encryption software or service and that stores the keys).

I've written about this more extensively elsewhere, but to summarise, government agencies can force the vendors to hand over the stored encryption keys so that they can decrypt the messages themselves.

As simple as that, General - they can get in through your window and look at your stuff just like you can.
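To see why a stored key matters so much, here is a minimal sketch in Python (it assumes the third-party cryptography package is installed; the email address and the escrow step are purely illustrative, not any vendor's real workflow). The sender encrypts a message, the vendor keeps a copy of the key, and from that point on anyone who can obtain that stored copy - the vendor, or an agency that compels the vendor - can read the message just as easily as the intended recipient.

```python
# Illustrative sketch only: symmetric encryption where the key is escrowed
# with a vendor. Requires the third-party "cryptography" package.
from cryptography.fernet import Fernet

# The sender generates a key and encrypts a message with it.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"Meet me at the usual place.")

# The vendor's service stores a copy of the key (hypothetical escrow step).
vendor_key_store = {"user@example.com": key}

# Anyone who obtains the stored key - the vendor, or an agency that
# compels the vendor to hand it over - can read the message.
recovered_key = vendor_key_store["user@example.com"]
print(Fernet(recovered_key).decrypt(ciphertext))  # b'Meet me at the usual place.'
```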

Storing up a storm...

Now there's a line from spoof spy movies that I love, not only for its camp pomposity, but also because it reveals a really important underlying point: "I could tell you, but I'd have to kill you."

The point is this: anybody or anything that stores sensitive information is a risk.

I tell you my secret, you store it up in your head. Unless I then "neutralise" you, you can betray that information, whether willingly or no.

You become both a liability to me and a target for somebody else. Security vendors, as we have hinted above, routinely store keys (in fact, their systems can't work any other way). So they're not helping.

Now consider this.

A system of encryption where nothing relating to the user's identity is stored; where the keys are created within the browser, without any use of an external security vendor (in the form of a Trusted Authority, keystore or similar); where the master secret is split across servers, so that it can never be reassembled; and where the encryption keys are created using a calculation that works in one direction but not the other, so that they can't be reverse-engineered to get at the original message or file.

And immediately, the landscape changes. In this scenario, only sender and recipient can ever open and read the messages and files that are sent. Nobody else. And that includes the security vendor themselves, hackers, whistleblowers, aggrieved ex-employees, your boss, and Uncle Tom Cobbley and all. Oh, and the FBI and their international buddies.
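To make the idea of a split secret and a one-way derivation a little more concrete, here is a deliberately simplified sketch in Python (standard library only). It is not a description of any particular vendor's product: real split-key schemes use proper cryptographic protocols rather than a bare XOR and hash, but the principle it illustrates is the same - neither server's stored share is of any use on its own, and the derivation only runs in one direction.

```python
# Deliberately simplified sketch of two ideas: a master secret split into
# shares held by different servers, and an encryption key derived from it
# with a one-way function. Standard library only.
import os
import hashlib

# In the browser: generate a random master secret for this user.
master_secret = os.urandom(32)

# Split it into two shares with XOR. Each share on its own is
# indistinguishable from random noise.
share_a = os.urandom(32)
share_b = bytes(x ^ y for x, y in zip(master_secret, share_a))
# share_a is stored on server A, share_b on server B; the master secret
# itself is never stored anywhere.

# Back in the browser: fetch both shares, recombine them, and derive an
# encryption key with a one-way hash. Neither the key nor either share
# lets anyone run the calculation backwards.
recombined = bytes(x ^ y for x, y in zip(share_a, share_b))
assert recombined == master_secret
encryption_key = hashlib.sha256(recombined + b"message-encryption").digest()
print(encryption_key.hex())
```

Neither server, subpoenaed on its own, has anything worth handing over.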

Redefining privacy

But is privacy really so precious that technology should be able to protect it at any cost?

It's a thorny question, but you have to see it against the backdrop of the way the online community now communicates. Opinions and conversations that were previously private have gone public.

People talk about their ingrowing toenails on Facebook. They tweet when they have been let down by their train company or served bad food in a restaurant.

Truly private discourse has ever-dwindling currency in the online world. This, in turn, makes it a more precious item, requiring more radical measures to protect it.

Alas, whether that discourse is ultimately for good or ill is not ours to influence. But we at least owe it to people and businesses to make sure they understand the difference between protecting their communications and inviting a third party to step in through an open window and join the conversation when instructed.

A difference that a battle-hardened General really should have understood.

Brian Spector is CEO of online security company Certivox.