Mark Zuckerberg. Some Facebook employees reportedly argued that Republican presidential nominee Donald Trump's posts on the platform should be designated as hate speech and removed. Justin Sullivan/Getty Images

Some Facebook employees reportedly argued that posts by Republican presidential candidate Donald Trump calling for a ban on Muslims entering the US should be deemed hate speech and removed from the platform. Citing "people familiar with the matter," the Wall Street Journal reported on Friday that chief executive Mark Zuckerberg ruled against removing the posts, saying it would be inappropriate to censor a presidential candidate.

The heated internal discussions reportedly began after Trump posted a link to a 7 December campaign statement "on preventing Muslim immigration" that called for a "total and complete shutdown of Muslims entering the United States until our country's representatives can figure out what is going on."

Since then, Trump has appeared to back away from the controversial proposal, saying his national security policies would block immigrants from "countries with great terrorism."

The Facebook post, however, sparked a major backlash, with users flagging it as hate speech. It also apparently drew complaints within the company, with an unspecified number of employees arguing that it violated the platform's community standards. Some employees responsible for reviewing Facebook content reportedly even threatened to quit.

During a weekly "town hall" meeting with staff members, Zuckerberg reportedly defended his decision.

"Mr Zuckerberg acknowledged that Mr Trump's call for a ban [of Muslims] did qualify as hate speech, but said the implications of removing them were too drastic," WSJ reported, citing two people who attended the meeting. The publication reported that many employees supported Zuckerberg's decision with one saying, "banning a US presidential candidate is not something you do lightly."

"When we review reports of content that may violate our policies, we take context into consideration. That context can include the value of political discourse," a Facebook spokeswoman told WSJ. "Many people are voicing opinions about this particular content and it has become an important part of the conversation around who the next US president will be."

On Friday, Facebook announced changes to its editorial policies, saying it would allow more explicit posts if they are "newsworthy, significant or important to the public interest."

"Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them," Joel Kaplan, vice president of global public policy, and Justin Osofsky, vice president of global operations and media partnerships, wrote in a blog post.

The news follows a series of controversies over deleted content and widespread criticism of Facebook over censorship.

This week, the company apologised after removing a breast cancer awareness video from the Swedish Cancer Society that showed simple animations of the female body. Facebook had previously deemed the video "offensive" under its policies.

In September, the platform's censorship of the iconic Pulitzer Prize-winning 'napalm girl' photo triggered widespread criticism from news organisations, Norwegian politicians and experts. Espen Egil Hansen, editor-in-chief of Norway's largest newspaper, Aftenposten, accused Zuckerberg of "abusing his power" and "limiting freedom" as "the world's most powerful editor."

The social media giant later reversed its decision.

"We try to find the right balance between enabling people to express themselves while maintaining a respectful experience for our global community," a Facebook spokesman told the Guardian at the time. "Our solutions won't always be perfect, but we will continue to try to improve our policies and the ways in which we apply them."