Social media giants Facebook, Twitter and Google were grilled by MPs for failing to take a tougher stance on abusive content and hate speech on their platforms. Executives from the three companies were criticised by the Commons Home Affairs Committee in a hearing on Tuesday (14 March) over how they deal with online abuse and hateful content.

Yvette Cooper, chairwoman of the committee, questioned why Google's YouTube refused to take down a video by former Ku Klux Klan leader David Duke called "Jewish People Admit Organising White Genocide." She expressed disbelief when Google's vice president Peter Barron said it did not "breach its guidelines".

The nearly 15-minute YouTube video accuses "Zionists" of having "ethnically cleansed the Palestinians" and planning to do the same to Europeans and Americans.

"You allow David Duke to upload an entire video which is all about malicious and hateful comments about Jewish people. How on earth is that not a breach of your own guidelines?" Cooper told Barron, the Telegraph reports. "I think most people would be appalled by that video and think it goes against all standards of public decency in this country."

She also noted that Google took down two other videos, including one posted by the neo-Nazi group National Action, which is banned in Britain, only after the committee raised the issue.

Labour MP David Winnick even accused the company executives of "commercial prostitution" and repeatedly questioned whether they felt any shame.

Barron said YouTube relies on "notification and self-policing" from its community of over one billion people to detect and flag harmful content. Although he admitted that the Duke video was "anti-Semitic, deeply offensive and shocking," he said it does not "meet the test for removing under our guidelines".

"We are in favour of free speech and access to information," Barron said.

The social media executives defended their companies, noting that the immense volume of content on their platforms made it impossible to proactively search for and detect such material. However, they said they were rolling out new tools and measures to address these issues and quickly remove illegal content.

"To suggest we are in some way negligent or not caring about this issue is simply not true," Facebook's Simon Milner said.

Yvette Cooper blasted the social media giants over their failure to act on reports of hate speech, offensive and abusive content online Reuters/Darren Staples

Twitter's head of public policy and government Nick Pickles said: "The positive benefits our platforms bring — and technology brings — come with serious challenges. Yes, it brings out some of the worst in society, but it brings to light things that we would all rather did not happen. But the idea that you can pre-emptively detect things and remove them before they are posted — we're never going to get to that point."

Cooper said she has personally reported a Twitter user who posted a "series of racist, vile and violent attacks" against political leaders including German Chancellor Angela Merkel, London Mayor Sadiq Khan and Gina Miller.

Pickles apologised to Cooper for the reports not having been looked into and promised they would be reviewed by the end of the day. He also acknowledged that Twitter is "not doing a good enough job" at responding to user reports.

The hearing comes as Germany proposes new laws that could see technology companies fined up to €50m (£44m, $53m) if they fail to swiftly delete illegal content such as hate speech and fake news from their sites.

Cooper said she did not find the executives' responses to her questions "particularly convincing".

"Don't you feel any sense of responsibility as a multi-billion pound organisation to at least check that you are not distributing material from proscribed organisations?" she asked the executives. "We understand the challenges you face and technology changes very fast, but you all have millions of users in the US and you make billions of pounds from these users.

"You all have a terrible reputation among users for dealing swiftly with problems in content even against your own community standards. Surely when you manage to have such a good reputation with advertisers for targeting content and for doing all kinds of sophisticated things with your platforms, surely you should be able to do a better job in order to be able to keep your users safe online and deal with this kind of hate speech."