Facebook is rolling out new tools designed to crack down on the spread of revenge porn on its platform, including new photo-matching software to prevent intimate images from being reshared. In a blog post on Wednesday (5 April), the social media giant said its new photo-matching system will prevent people from sharing an image again after it has been reported and removed.

The new artificial intelligence technology will be used to prevent the same image from being posted on Facebook, Messenger and Instagram. If someone tries to share an already-flagged image, the site will warn the user that it violates the company's policies and will block them from sharing it.
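Facebook has not published the details of its matching system, but photo matching of this kind is commonly built on perceptual hashing: a reported image is reduced to a compact fingerprint, and re-uploads are compared by how many bits differ, so near-identical copies can be blocked even after minor edits. The sketch below is purely illustrative of that general technique (the function names and the 8x8 average-hash approach are assumptions, not Facebook's actual implementation):

```python
# Illustrative sketch of perceptual-hash image matching (an assumed
# technique; NOT Facebook's actual system). An 8x8 grayscale thumbnail
# is hashed to 64 bits; uploads within a small Hamming distance of any
# flagged hash are blocked.

def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grid of grayscale values.

    Each bit is 1 if the pixel is brighter than the image's mean value.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Count the bit positions where two hashes differ."""
    return bin(a ^ b).count("1")

def is_blocked(upload_hash, flagged_hashes, threshold=5):
    """Block an upload whose hash is within `threshold` bits of a flagged hash."""
    return any(hamming_distance(upload_hash, h) <= threshold
               for h in flagged_hashes)
```

In practice, a real system would hash full-size images down to small thumbnails first and tune the distance threshold to balance missed matches against false positives; the 8x8 grid and threshold of 5 here are arbitrary choices for the sketch.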

Users will be able to flag any intimate images that appear to have been shared without permission on the site using the Report tool. These flagged images will then be reviewed by specially trained representatives from Facebook's Community Operations team, who will remove them if they violate the company's Community Standards.

Facebook added that "in most cases" it will disable an account that shares intimate images without permission, but will offer an appeals process "if someone believes an image was taken down in error".

"These tools, developed in partnership with safety experts, are one example of the potential technology has to help keep people safe," Antigone Davis, Facebook's Head of Global Safety wrote in a blog post.

The company worked with the National Network to End Domestic Violence, Center for Social Research, the Revenge Porn Helpline in the UK and the Cyber Civil Rights Initiative to develop the tools. It also consulted over 150 other online safety organisations and experts across the globe over the past year for feedback on the system.

"This new process will provide reassurance for many victims of image based sexual abuse, and dramatically reduce the amount of harmful content on the platform," Laura Higgins, founder of the Revenge Porn Helpline UK, said in a statement. "We hope this will inspire other social media companies to take similar action and that together we can make the online environment hostile to abuse."

Facebook's new measures come a month after it was revealed that active-duty and retired US Marines were sharing nude and private pictures of thousands of female service members in a private Facebook group. The high-profile incident attracted severe criticism and prompted a Congressional hearing and a Defense Department investigation.

Last year, a judge ruled against dismissing a lawsuit filed against Facebook that involved a 14-year-old girl from Northern Ireland, whose naked photos were repeatedly posted on Facebook. The teen's lawyers claimed the photo was extorted from her and was published on a "shame page" over and over again in an act of revenge. The plaintiff argued Facebook did nothing to permanently block the image.

Meanwhile, CEO Mark Zuckerberg said Facebook is focusing on "building a community that keeps people safe" which means "building technology and AI tools to prevent harm".

"Revenge porn is any intimate photo shared without permission," Zuckerberg wrote in a Facebook post. "It's wrong, it's hurtful, and if you report it to us, we will now use AI and image recognition to prevent it from being shared across all of our platforms."

According to a study by the Cyber Civil Rights Initiative, 93% of US victims of non-consensual intimate images report significant emotional distress, while 82% report significant impairment in social, occupational or other important areas of functioning in their lives.

"We look forward to building on these tools and working with other companies to explore how they could be used across the industry," Davis wrote.