Facebook is to hire an additional 3,000 people over the next year to review reports of graphic, disturbing and inappropriate videos uploaded to the social network and broadcast using Facebook Live.
Chief executive Mark Zuckerberg made the announcement on his own Facebook page, and explained that the reviewers will improve the social network's ability to provide help to users who need it, and "get better at removing things we don't allow... like hate speech and child exploitation."
The 3,000 new hires will join a team of 4,500 people Facebook already employs to remove content that breaches the company's guidelines.
Zuckerberg said the reviewers will be used not only to remove inappropriate content quickly but also to get help to those who need it. The announcement comes just days after 49-year-old James Jeffrey shot and killed himself during a Facebook Live broadcast to his friends. The video was viewed around 1,000 times in the two hours before it was removed.
Also broadcast live on Facebook in 2017 were the murder of 74-year-old pensioner Robert Godwin on 16 April, and the murder of an 11-month-old baby girl by her father, Wuttisan Wongtalay, who later killed himself during the broadcast.
Acknowledging these incidents, Zuckerberg said: "Over the last few weeks, we've seen people hurting themselves and others on Facebook – either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community.
"If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action soon – whether that's responding quickly when someone needs help or taking a post down."
Facebook chief operating officer Sheryl Sandberg replied to Zuckerberg's post: "Keeping people safe is our top priority. We won't stop until we get it right."
The reviewers will also help Facebook work "with local community groups and law enforcement who are in the best position to help someone if they need it – either because they're about to harm themselves, or because they're in danger from someone else."
Zuckerberg concluded: "This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren't so fortunate."