Mark Zuckerberg has rubbished the notion that fake news on Facebook helped Donald Trump secure victory in the US presidential race. Calling such reports a "pretty crazy idea", the tech giant's founder said the final say rests with American voters.
Speaking at the Techonomy conference in California, Zuckerberg said the "small amount" of fake news on the social media platform did not influence the outcome of the election.
The CEO of Facebook said: "Personally, I think the idea that fake news on Facebook – it's a very small amount of the content – to think it influenced the election in any way is a pretty crazy idea." Zuckerberg added that any fake news items that were shared would quite possibly have concerned both rival parties.
His comments came two days after Trump won the White House race, upsetting pollsters and pundits who had widely predicted that Hillary Clinton would clinch the election.
"Voters make decisions based on their lived experience. There is a profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw fake news. If you believe that, then I don't think you internalised the message that Trump voters are trying to send in this election," he said during the interview with David Kirkpatrick, author of The Facebook Effect.
However, when the interviewer interjected to ask what that message was, the Facebook founder sidestepped the question.
The 32-year-old went on to say that Facebook comprises a multitude of users with different political leanings. He added: "Even if 90% of your friends are Democrats, probably 10% are Republicans. Even if you live in some state or country you will know some people in another state, another country. That means that the information you are getting through the social system is going to be inherently more diverse than you would have gotten through news stations."
Facebook, which has about 1.79 billion users, is increasingly becoming a platform from which many adults get their political news, studies suggest.
Fake news has become an ongoing problem for the social media giant over the past few months.
Back in August, Facebook decided to fire its team of human "news curators" and replace them with an algorithm-based process to determine which stories are included in its Trending News box. Still, multiple fake news stories and conspiracy theories managed to make their way through. In response, Facebook created a new "review team" to deal with the issue.
Both Twitter and Facebook announced in September that they would join a network of more than 30 tech and news companies to help tackle fake news and improve the quality of information on their platforms.
Just before Election Day, President Barack Obama weighed in on the issue as well, saying people are beginning to believe "outright lies" just because they see them on social media, adding that it "creates this dust cloud of nonsense".
Responding to widespread post-election criticism that it had allowed fake news stories to circulate in its News Feed over the course of the campaign, the company said: "There's so much more we need to do."
"We take misinformation on Facebook very seriously," Facebook VP of product management Adam Mosseri told TechCrunch in a statement. "In News Feed we use various signals based on community feedback to determine which posts are likely to contain inaccurate information, and reduce their distribution.
"In Trending we look at a variety of signals to help make sure the topics being shown are reflective of real-world events, and take additional steps to prevent false or misleading content from appearing. Despite these efforts we understand there's so much more we need to do, and that is why it's important that we keep improving our ability to detect misinformation."