Facebook has announced that it will no longer place red "Disputed Flags" next to fake news articles and posts in its battle against hoaxes and misinformation on the platform. In response to political and public scrutiny over the spread of fake news stories on the site, Facebook rolled out the tool a year ago to make it easier for users to identify dubious articles in their News Feeds.
However, academic research has found that the red Disputed Flag may have actually conveyed the wrong message to curious users, Facebook said.
"Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs - the opposite effect to what we intended," Facebook product manager Tessa Lyons said in a statement.
According to Facebook product designer Jeff Smith, the disputed flag alerted users to potentially false information, but "it wasn't easy for people to understand exactly what was false."
"It required too many clicks, and those additional clicks made it harder for people to see what the fact-checkers had said about the disputed story," Smith explained in a Medium post. He also noted that "dispelling misinformation is challenging" and these flags could even backfire.
"Just because something is marked as 'false' or 'disputed' doesn't necessarily mean we will be able to change someone's opinion about its accuracy," Smith said. "In fact, some research suggests that strong language or visualizations (like a bright red flag) can backfire and further entrench someone's beliefs."
Instead, the company will now surface a "Related Articles" section next to false news stories to give people more context.
"Our research has shown this is a more effective way to help people get to the facts," Lyons said. "We've found that when we show Related Articles next to a false news story, it leads to fewer shares than when the Disputed Flag is shown."
Lyons said demoting false news stories causes them to lose 80% of their traffic, thereby slashing the economic incentives for spammers and troll farms.
Facebook is also starting a new initiative to understand how people decide whether information is accurate "based on the news sources they depend upon," engage with and follow on Facebook.
"This will not directly impact News Feed in the near term," the company said. "However, it may help us better measure our success in improving the quality of information on Facebook over time."
The changes come as social media giants like Facebook, Twitter and Google face intense political criticism for letting misinformation and fake news stories spread on their platforms, particularly in the months leading up to elections. Lawmakers in the US and UK are currently investigating the role their technology played in Russian operatives' misinformation campaigns to sway political opinion and influence elections.
Facebook has admitted it sold hundreds of thousands of dollars' worth of politically divisive ads covering a wide range of hot-button topics to inauthentic accounts created by the Kremlin-linked Internet Research Agency.
"False news undermines the unique value that Facebook offers: the ability for you to connect with family and friends in meaningful ways," Lyons said. "It's why we're investing in better technology and more people to help prevent the spread of misinformation. Overall, we're making progress."