Facebook-owned messaging service WhatsApp reportedly rejected a demand from the British government earlier this year for it to create a backdoor into its secure network.

Such a system would allow UK security services to access messages sent through the service, which protects its users' communications with end-to-end (E2E) encryption.

According to Sky News, British officials this summer demanded that WhatsApp come up with "technical solutions" to hand over messages believed to be linked to criminal or terrorist investigations – investigations they argue have been hampered by the software.

But WhatsApp, citing the basic principles of encryption, said that building in special access would only weaken the protection provided to all of its users.

The move, it said, would put its security at risk.

E2E encryption works by scrambling messages so that their content cannot be intercepted in readable form, and can be unscrambled only by the sender and the recipient.
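The principle can be sketched in a few lines of code. This is a deliberately simplified illustration, not WhatsApp's actual implementation (which uses the Signal protocol) and not a secure cipher: it only shows why a relay server that never holds the key cannot read the messages it forwards.

```python
# Toy sketch of the end-to-end principle: the relay server only ever
# handles ciphertext, while the key lives exclusively on the endpoints.
# NOT a secure protocol – real E2E apps use the Signal protocol
# (X3DH key agreement plus the Double Ratchet).
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256-derived keystream (toy symmetric cipher).
    Applying it twice with the same key recovers the original bytes."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

# The sender and recipient share a key the server never sees
# (in practice it would be negotiated with Diffie-Hellman).
shared_key = secrets.token_bytes(32)

message = b"meet at noon"
ciphertext = keystream_xor(shared_key, message)

# The relay server stores and forwards only this opaque blob:
assert ciphertext != message

# Only the recipient, holding the key, can invert the scrambling:
recovered = keystream_xor(shared_key, ciphertext)
assert recovered == message
```

Because decryption requires the shared key and the server never possesses it, there is nothing meaningful the operator could hand over – which is the crux of WhatsApp's argument.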

Tech companies that use strong encryption do not have access to messages – an approach that became increasingly popular following the Snowden revelations of 2013.

His leaks revealed how British spies were scooping up and retaining text and phone call metadata of the population, including British citizens not suspected of committing crimes.

WhatsApp, Sky News reported, complies with metadata requests, giving access to some account information such as account names, message dates and associated email addresses.

But it says it cannot hand over what it doesn't store – the content of communications.

In an FAQ on its website, the company states: "WhatsApp has no ability to see the content of messages or listen to calls on WhatsApp. That's because the encryption and decryption of messages sent on WhatsApp occurs entirely on your device.

"Naturally, people have asked what end-to-end encryption means for the work of law enforcement.

"WhatsApp appreciates the work that law enforcement agencies do to keep people safe around the world. We carefully review, validate, and respond to law enforcement requests based on applicable law and policy, and we prioritise responses to emergency requests."

The rejection comes as UK prime minister Theresa May – and other government officials such as home secretary Amber Rudd – are becoming increasingly opposed to strong encryption, a stance only bolstered following a number of terrorist attacks on British soil this year.

Critics of such a stance previously branded it "technologically illiterate rubbish."

In New York this week, during the UN General Assembly, May will demand that global technology companies bend the knee to the will of governments around the world.

The PM, the government revealed before the meeting, will claim that technology companies now need to do more to help combat the presence of terrorist content on their platforms.

Officials from Facebook, Microsoft, Twitter and Google will be listening intently.

May will ask them to "develop new technological solutions to prevent such content being uploaded in the first place." It was not confirmed if this would extend to encryption.

She will say: "We need a fundamental shift in the scale and nature of our response – both from industry and governments – if we are to match the evolving nature of terrorists' use of the internet.

"This is a global problem that transcends national interests.

"Governments must work with and support the efforts of industry and civil society if we are to achieve real and continuing progress and prevent the spread of extremism [in] cyberspace."

After the snippet of the speech was released, UK human rights defenders hit back.

"We need to recognise the limitations of relying on automated takedowns," said Jim Killock, executive director of the Open Rights Group, in a statement. "Mistakes will inevitably be made – by removing the wrong content and by missing extremist material."

Killock continued: "Given the global reach of these companies, automated takedowns will have a wide-reaching effect on the content we see, although not necessarily on the spread of extremist ideas as terrorists will switch to using other platforms.

"This move by the British, French and Italian governments could also be used to justify the actions of authoritarian regimes, such as China, Saudi Arabia and Iran, who want companies to remove content that they find disagreeable."

On Twitter, the Electronic Frontier Foundation (EFF) wrote: "UK government wants secret backdoor from WhatsApp, but can't even keep its own demands from leaking."