If it's that easy for the government to get into your private data, would you still feel safe using that company's device or service? (Reuters)

Technology, internet and social media companies such as Google, Facebook and Apple are to be banned from offering encryption so advanced that law enforcement agencies cannot access users' communications, as part of new laws to be unveiled under the Investigatory Powers Bill on Wednesday 4 November.

The Investigatory Powers Bill, also known as the "Snooper's Charter", is a hotly contested piece of legislation that many feel will endanger the privacy of UK citizens. The UK government has been pushing for the bill to be approved since May, as it believes that without access to people's phone and internet communications, intelligence agencies will be unable to prevent imminent terrorist attacks.

On 30 October, it was reported that the new bill would give the police the power to access the web browsing history of anyone in the UK; however, after a great deal of criticism from the media and outrage from civil liberties groups, on 1 November Home Secretary Theresa May announced that the UK government was backtracking.

Instead, May said that the UK government would only grant law enforcement the power to view internet connection records rather than exact browser history, and that this power was being granted with the intent of catching paedophiles and child abusers, rather than spying on innocent citizens.

What exactly is a "backdoor" and why should I care?

The UK government also claims that it has no plans to restrict tech companies from encrypting material on the internet, but this is not entirely accurate: the Telegraph reports it has been told that companies will be required, under warrant, to provide unencrypted communications to the police.

Essentially, the UK government is saying that while we are entitled to security that protects our data and our financial information, that encryption must still be breakable by law enforcement agencies if they suspect you might have done something wrong.

The concept of giving a government agency independent access to users' private communications in a system is known in the computing world as a "backdoor". Confusingly, however, the UK government has said that it doesn't want a backdoor – it just wants your security to be easy to decrypt whenever it wants to look.
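To see why the label matters less than the mechanism, consider a deliberately simplified sketch of what "easy to decrypt on demand" implies in practice. Everything here is hypothetical – a toy XOR stream cipher and an imagined key-escrow database, not real cryptography or any real company's design:

```python
import hashlib


def keystream_xor(data: bytes, key: bytes) -> bytes:
    """Toy XOR stream cipher -- for illustration only, NOT real cryptography."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        # Derive keystream blocks from the key and a counter.
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))


# A user encrypts a message with a key only they should hold...
user_key = b"alice-secret-key"
ciphertext = keystream_xor(b"meet at noon", user_key)

# ...but a backdoored design escrows a copy of every key with the provider.
escrow_database = {"alice": user_key}  # accessible under warrant -- or to a thief

# Whoever obtains the escrow database can read the message:
recovered = keystream_xor(ciphertext, escrow_database["alice"])
print(recovered)  # b'meet at noon'
```

The point of the sketch is that the escrow database is a backdoor by any other name: anyone who obtains it – a police force, a rogue employee or a hacker – can read every message.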

Less encryption just means c**p encryption

In 2014, during the Oscar Pistorius trial, South African prosecutors were unable to get into his iPhone to access his WhatsApp messages, and had to fly out to New York to beg Apple to unlock the phone. The new UK laws would mean that in such a situation, Apple couldn't say no – it would have to provide the police with access to the suspect's data immediately.

During Oscar Pistorius' trial in 2014, South African police had to fly to New York to beg Apple to unlock his iPhone so that they could gain access to his WhatsApp messages (Reuters)

NSA whistleblower Edward Snowden argues that providing government agencies with access to private citizens' communications still equates to a backdoor, no matter what you call it. What the Telegraph is describing, however, is actually far worse than a backdoor.

On its website, Apple states that it has "no way to decrypt iMessage and FaceTime data when it's in transit between devices" and that "unlike other companies' messaging services, Apple doesn't scan your communications, and we wouldn't be able to comply with a wiretap order even if we wanted to".
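Apple is describing an end-to-end design, in which the two endpoints agree on a key that the relaying server never learns. A minimal sketch of the underlying idea, using toy Diffie-Hellman key agreement over a deliberately tiny group (real deployments use groups of 2,048 bits or more, and iMessage's actual protocol differs, but the principle is the same):

```python
import secrets

# Toy Diffie-Hellman parameters: a small public prime and generator.
p = 4294967291  # a 32-bit prime -- far too small for real use
g = 2

# Each user keeps a private value and publishes only g^x mod p.
alice_private = secrets.randbelow(p - 2) + 1
bob_private = secrets.randbelow(p - 2) + 1
alice_public = pow(g, alice_private, p)
bob_public = pow(g, bob_private, p)

# The server relays the public values, but the shared secret never crosses it.
alice_shared = pow(bob_public, alice_private, p)
bob_shared = pow(alice_public, bob_private, p)
assert alice_shared == bob_shared  # both ends derive the same key
```

Because the server only ever sees the public values, handing over its records reveals nothing about the key the messages are encrypted under – which is why a provider built this way genuinely cannot comply with a wiretap order.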

So if the UK government doesn't want independent access to Apple's iCloud, and Apple cannot access users' data by virtue of how its system is designed, how will the UK government get access to a suspect's data?

There is no magic wand Apple can wave to fulfil the UK government's demands, and there is no such thing as "so-so encryption". Encryption is either good, and it keeps other people out, or it's bad, and possible to break.
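A toy demonstration of why deliberately weakened encryption is breakable by anyone, not just the police. Here a hypothetical "law-enforcement-friendly" service caps keys at 16 bits, so an attacker can simply try every possible key (again using a toy XOR cipher, not real cryptography):

```python
import hashlib


def keystream_xor(data: bytes, key: bytes) -> bytes:
    """Toy XOR stream cipher -- for illustration only, NOT real cryptography."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))


# Suppose the service only allows 16-bit keys, so decryption is always feasible.
weak_key = (4242).to_bytes(2, "big")
ciphertext = keystream_xor(b"my bank password", weak_key)

# An attacker needs no warrant -- just a loop over all 65,536 possible keys.
found = None
for guess in range(2 ** 16):
    if keystream_xor(ciphertext, guess.to_bytes(2, "big")) == b"my bank password":
        found = guess  # a real attacker would test for readable text instead
        break
print(f"key recovered after {found + 1} guesses")
```

On an ordinary laptop a search like this finishes in well under a second; the "lawful access" property and the "trivially hackable" property are the same property.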

This means that the next time a criminal trial demands it, Apple would simply have to build another system offering really c**p encryption, so that a user's data is easy to access whenever the UK police come calling.

If the encryption is c**p, why would you use the service?

But if tech companies have to build lousy encryption into their systems so that the UK government can gain access, why should we keep using their services at all?

If the encryption has to be easy enough for Apple to instantly allow the UK government access, then that means it's no good. So then what's to stop anyone from hacking your account or your iPhone to gain access to your data?

And if your data isn't actually secure on your iPhone, Facebook account or Android phone, would terrorists really continue to use these services to plan their attacks, or would they just move to more secure encrypted communications? While some bad actors in the world are stupid, like the alleged state hackers caught on the web this year, real criminal masterminds and evildoers aren't chatting about their plans willy-nilly on Facebook and over iMessage (read: Dissidents turn to bitcoin-like cryptocurrency to communicate free from state surveillance).

If the leaked information about the Investigatory Powers Bill turns out to be true on 4 November, then the UK government really needs to wake up. If tech companies make services with bad security, people won't use them. So then what's the point in such a law?