Islamophobes are targeting Muslim women in online hate campaigns, according to a new study.
A Birmingham City University study examined hundreds of Facebook pages, posts and comments, including those associated with the far-right groups Britain First and the English Defence League, as part of an extensive survey of the spread of anti-Islam hate speech online.
The researchers found 500 instances of Islamophobic abuse in which Muslims were branded terrorists and rapists, alleged to be waging "war" on non-Muslims, and in which calls were made for Muslims to be deported, as part of a campaign to "incite violence and prejudicial action." Women wearing Islamic dress were branded a "security threat."
There is evidence of the hatred spilling into attacks and real life abuse, with a 326% surge in Islamophobic incidents recorded last year, and more than half of the victims women.
Researcher Imran Awan said that the recent murder of MP Jo Cox and the surge of racist attacks in the wake of the Brexit vote showed the urgency of tackling online hate speech. "What it has shown is that the far right and those with links and sympathies with the far right were using Facebook and social media to in effect portray Muslims in a very bad and negative fashion," Awan said.
"After Brexit people have felt much more empowered and confident to come and target Muslims and others in racist hate attacks. This was all playing on social media but no one looked at it. If Facebook had been monitoring this racism, then I'm not saying they could have stopped the racist attacks, but it certainly could have given them an insight into the racist people using their platforms."
The study found that 80% of the abuse was carried out by men, who singled out Muslim women for attacks, with 76 posts portraying women wearing the niqab or hijab as a "security threat." The next most frequent form of abuse called for Muslims to be deported, with 62 instances recorded.
It identifies five kinds of online Islamophobe, from the 'producers' and 'distributors' seeking to create "a climate of fear, anti-Muslim hate and online hostility," to the 'opportunists' who spread anti-Muslim hate speech in response to a specific incident, such as atrocities committed by terrorist group Isis.
Also responsible are 'deceptives', who will concoct rumours and false stories to whip up Islamophobic hatred, such as the rumour Muslims wanted to ban cartoon character Peppa Pig, and 'fantasists', who fantasise about Islamophobic violence and make direct threats against Muslim communities.
On Tuesday, Home Secretary Amber Rudd announced the launch of a campaign to combat hate crime in the UK, with Her Majesty's Inspectorate of Constabulary to review the way hate crimes are reported and investigated by police in England and Wales.
It comes with more than 6,000 hate crimes recorded by police in the wake of the 23 June EU Referendum. The Muslim Council of Britain recorded 100 crimes in the weekend after the referendum.
Islamophobia monitoring group Tell MAMA found a 326% increase in Islamophobic incidents last year, with Muslim women "disproportionately targeted by cowardly hatemongers."
"We have known that visible Muslim women are the ones targeted at a street level, but what we also have seen in Tell MAMA, is the way that Muslim women who are using social media platforms, are targeted for misogynistic and anti-Muslim speech. In particular, there is a mix of sexualisation and anti-Muslim abuse that is intertwined which also hints at perceptions and attitudes towards women in our society," said Tell MAMA director Fiyaz Mughal.
"We are also aware from our work in Tell MAMA, that the perpetrators age range has dropped significantly from 15-35 to 13-18 showing that anti-Muslim hate in particular is drawing in and building a younger audience which is daunting for the future. We need to redouble our efforts if we are to have social cohesion in our society and we also need to ensure that women feel protected and confident enough to report in such hate incidents."
Facebook needs to do more to tackle race hatred
Facebook recently signed up to a new European Union code of conduct obliging it to remove hate speech from its European sites within 24 hours. Awan said that UK authorities and Facebook needed to do more to combat the problem.
"I think police have a really tough job in the sense that in my understanding it is like finding a needle in a virtual haystack, and they are not clued up enough. I don't think they have enough training to look at social media posts, police need to be trained on what to look at," he said.
A College of Policing spokesman said: "We are working with the Crown Prosecution Service, partners and police forces to raise awareness and improve the policing response to hate crime. This will ensure offenders can be brought to justice and evidence of their hostility can be used to support enhanced sentencing.
"The College has developed training for police forces to issue to officers and staff and published Authorised Professional Practice, which is national guidance, for those responding to hate crime.
"In addition, more than £500,000 has been awarded to the University of Sussex and the Metropolitan Police through the Police Knowledge Fund to pilot a study that will examine the relationship between discussions of hate crime on social media and data relating to hate crime that has been recorded by police. The fund allows officers to develop their skills, build their knowledge and expertise about what works in policing and crime reduction, and put it into practice."
Facebook says that it will not tolerate content that directly attacks others based on race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition, and that its policies try to strike the right balance between giving people the freedom to express themselves and maintaining a safe and trusted environment.
It said it has rules and tools people can use to report content that they find offensive.
IBTimes UK has contacted Facebook for comment.