NEW YORK — Facebook says it is expanding its fact-checking program to include photos and videos as it fights fake news and misinformation on its service.
Malicious groups seeking to sow political discord in the U.S. and elsewhere have been embracing images and video to spread misinformation.
The company has been testing the image fact-checks since the spring, beginning with France and the news agency AFP. Now, it will send all of its 27 third-party fact-checkers disputed photos and videos to verify. Fact-checkers can also find them on their own.
Facebook will label photos and videos found to be false or misleading as such.
Facebook says the fact-checkers use visual verification techniques such as reverse image searching and analyzing image metadata to check the veracity of photos and videos.
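Reverse image search works by comparing compact "perceptual hashes" of images rather than raw pixels, so a re-encoded or slightly altered copy of a photo still matches the original. The following is a minimal sketch of one common such technique, average hashing; the tiny grayscale grids stand in for real downscaled images, and none of this reflects Facebook's or its fact-checkers' actual tooling.

```python
# Minimal average-hash sketch: reverse image search services compare
# perceptual hashes, so resized or re-encoded copies still match.
# The 4x4 grayscale grids below are placeholder data, not real images.

def average_hash(pixels):
    """Hash a grayscale grid: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming(a, b):
    """Count differing bits; a small distance means visually similar."""
    return bin(a ^ b).count("1")

original = [
    [200, 200, 40, 40],
    [200, 200, 40, 40],
    [40, 40, 200, 200],
    [40, 40, 200, 200],
]
# A re-encoded copy: pixel values shifted slightly, structure intact.
recompressed = [[p + 5 for p in row] for row in original]
# An unrelated image with a different light/dark pattern.
unrelated = [[(r * 4 + c) * 16 for c in range(4)] for r in range(4)]

print(hamming(average_hash(original), average_hash(recompressed)))  # 0
print(hamming(average_hash(original), average_hash(unrelated)))     # 8
```

The recompressed copy hashes identically despite its shifted pixel values, while the unrelated image lands far away, which is what lets a fact-checker trace a viral photo back to earlier appearances of the same picture.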
Last month, the social network removed 652 pages, groups, and accounts linked to Russia and, unexpectedly, Iran, for "coordinated inauthentic behavior" that included the sharing of political material.
Facebook has significantly stepped up policing of its platform since last year, when it acknowledged that Russian agents successfully ran political influence operations on the service aimed at swaying the 2016 U.S. presidential election. Other social media networks have done likewise, and continue to turn up fresh evidence of political disinformation campaigns.