Activists in Myanmar have accused Facebook of removing their posts documenting the ethnic cleansing of Rohingya Muslims and of suspending their accounts. Rohingya activists told The Daily Beast that their Facebook accounts were often taken down or suspended after they posted videos and photos depicting the persecution and violence the ethnic minority group faces at the hands of the country's military.

The latest spate of violence began in Myanmar's Rakhine state in August, after armed attacks on police posts and an army camp were blamed on Rohingya insurgents. The deadly attacks led to a massive crackdown by Myanmar's security forces, prompting condemnation from human rights monitors. A UN official called the campaign "a textbook example of ethnic cleansing".

At least 430,000 Rohingya Muslims have since fled to Bangladesh to escape the violence and persecution, recounting horrific stories of their villages being burned and of mass killings at the hands of the military.

A number of activists within and outside Myanmar who spoke with The Daily Beast said their posts about the violence were frequently taken down by Facebook.

One activist, Aung Tin, said his account had been shut down more than 10 times and was frozen for a month after he wrote a critical post about Myanmar's home minister. In another post, he wrote that Myanmar's government had "poisoned the whole country".

He said he was banned for one month for that comment.

A Facebook spokesperson said the company was looking into the issue and "carefully reviewing content" against its Community Standards.

"We want Facebook to be a place where people can share responsibly, and we work hard to strike the right balance between enabling expression while providing a safe and respectful experience," Facebook spokesperson Ruchika Budhraja told The Daily Beast. "That's why we have Community Standards, which outline what type of sharing is allowed on Facebook and what type of content may be reported to us and removed.

"Anyone can report content to us if they think it violates our standards. In response to the situation in Myanmar, we are carefully reviewing content against our Community Standards."

Facebook has previously faced criticism over its moderation policies as it struggles to strike a balance between free speech and censorship. Over the past year, the social media giant has come under fire after it deleted various "explicit" images and videos that it initially said violated its standards.

These included the iconic Pulitzer Prize-winning "napalm girl" photograph taken during the Vietnam War, a breast cancer awareness video from the Swedish Cancer Society and a photo of the 16th-century Renaissance statue of the sea god Neptune.

More recently, Facebook has come under intense scrutiny over its ad practices after it revealed that it sold $100,000 (£74,470) worth of politically divisive ads to fake accounts likely linked to Russia during the 2016 US presidential election. It was also reported that Russian operatives used the platform to remotely organise rallies and protests on US soil, including an anti-refugee rally in Idaho.

Last week, it was reported that Facebook had allowed advertisers to target promoted posts at users who fell under offensive categories such as "Jew hater" and "Nazi Party" or who were listed as interested in topics such as "How to burn Jews". The categories were later removed.

Photo caption: Activists in and outside Myanmar have accused Facebook of censoring posts documenting the ongoing Rohingya crisis. Paula Bronstein/Getty Images