Facebook and Twitter could be asked to increase their site moderation under new guidelines about online abuse, which will follow a consultation process led by the Director of Public Prosecutions.
Keir Starmer QC will consult with lawyers, journalists and police on the potential new guidelines, following a sharp increase in arrests under section 127 of the Communications Act 2003, which makes it an offence to send grossly offensive messages online.
Matthew Woods was this week sentenced to 12 weeks in prison for posting "grossly offensive" jokes on his Facebook page about missing five-year-old April Jones. The following day, 20-year-old Azhar Ahmed was fined £300 and sentenced to 240 hours of community service for a Facebook post which said all soldiers should "die and go to hell", following the deaths of six British soldiers in Afghanistan.
Last month, a man from Liverpool was arrested after setting up a Facebook page which praised the alleged killer of two police officers in Greater Manchester. On that occasion, Facebook worked quickly to take down the offending page as it violated the company's statement of rights and responsibilities.
The decision to create new guidelines was first announced after the Crown Prosecution Service (CPS) decided not to take action against Daniel Thomas, a semi-professional footballer who sent homophobic tweets regarding Olympic divers Tom Daley and Peter Waterfield.
In a statement announcing the decision, Starmer said the CPS had decided to refrain from action against Thomas because he intended the message to be humorous, he did not intend it to be seen as widely as it was, and he later expressed remorse for the tweet.
Starmer also said: "Social media is a new and emerging phenomenon raising difficult issues of principle, which have to be confronted not only by prosecutors but also by others including the police, the courts and service providers.
"The fact that offensive remarks may not warrant a full criminal prosecution does not necessarily mean that no action should be taken.
"In my view, the time has come for an informed debate about the boundaries of free speech in an age of social media."
Following the announcement of potential new guidelines, a spokesman for Facebook said: "Facebook has a comprehensive system of reporting tools so people can report offensive content to Facebook - something that doesn't exist across the wider web.
"Once received by Facebook's User Operations team these reports are prioritised, with the most urgent dealt with inside 24 hours - faster still in many cases."