Facebook has bowed to an outcry over content promoting violence against women after advertisers pulled ads in protest.
The company said on Tuesday it would update its policies on hate speech, increase accountability of content creators and train staff to be more responsive to complaints, marking a victory for women's rights activists. "We need to do better – and we will," it said in a statement.
The climbdown followed a week-long campaign by Women, Action and the Media, the Everyday Sexism Project and the activist Soraya Chemaly to remove supposedly humorous content endorsing rape and domestic violence.
Examples included a photograph of the singer Rihanna's bloodied and beaten face, captioned with "Chris Brown's Greatest Hits", a reference to the assault by her ex-boyfriend.
A photograph of a woman in a pool of blood had the caption "I like her for her brains".
Another photograph, of a man holding a rag over a woman's mouth, was captioned "Does this smell like chloroform to you?".
More than 100 advocacy groups joined the protest and demanded Facebook recognise such content as hate speech and train moderators to remove it.
Facebook, which is based in Menlo Park, California, initially rebuffed the complaints, citing freedom of speech. A spokesman told Huffington Post UK: "As you may expect in any diverse community of more than a billion people, we occasionally see people post distasteful or disturbing content, or make crude attempts at humour. While it may be vulgar and offensive, distasteful content on its own does not violate our policies."
The campaign gathered momentum, however, when tens of thousands of tweets and emails using the hashtag #Fbrape were sent to the social network's advertisers.
At least 15 pulled their ads, Women, Action and the Media said, including Nissan UK, Nationwide UK, J Street and WestHost.
Facebook bowed to the pressure in a lengthy statement that stressed its effort to balance free speech with its policy of banning hate speech.
"We prohibit content deemed to be directly harmful, but allow content that is offensive or controversial. We define harmful content as anything organising real world violence, theft, or property destruction, or that directly inflicts emotional distress on a specific private individual (eg bullying)."
It said it had miscalculated the balance. "In recent days, it has become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate. In some cases, content is not being removed as quickly as we want. In other cases, content that should be removed has not been or has been evaluated using outdated criteria. We have been working over the past several months to improve our systems to respond to reports of violations, but the guidelines used by these systems have failed to capture all the content that violates our standards."
Facebook promised to review and update guidelines, improve moderators' training, establish more formal lines of communication with advocacy groups and increase accountability of the creators of content which is cruel or insensitive but does not qualify as hate speech. One recent innovation that obliges such creators to supply their authentic identity has already created a "better environment" and will continue to be developed, it said.
Jaclyn Friedman, executive director of Women, Action and the Media, praised Facebook's response, calling the company's actions admirable. "We hope that this effort stands as a testament to the power of collaborative action."
guardian.co.uk © Guardian News and Media Limited 2010