Facebook has begun placing warning messages over videos and photos which it deems to contain graphic images of violence that could “shock, offend and upset”.
The warning, which has been rolling out since December, was confirmed by Facebook as a way to prevent potentially distressing videos from being viewed automatically. Previously such content would have auto-played on the site.
The social network is appending the warnings only to content that is violent in nature, meaning that other explicit videos that may also offend are not covered by the warning.
The warning is only placed on content that has been flagged by users and which, after review by Facebook, is not deemed in breach of the site’s terms and conditions. The site’s terms specifically stipulate that content “shared for sadistic pleasure or to celebrate or glorify violence” is banned. In contrast, violent content shared by users who are condemning or reporting on it is permitted.
‘Share it responsibly’
“When people share things on Facebook, we expect that they will share it responsibly, including choosing who will see that content,” said a Facebook spokesperson. “We also ask that people warn their audience about what they are about to see if it includes graphic violence. In instances when people report graphic content to us that should include warnings or is not appropriate for people under the age of 18, we may add a warning for adults and prevent young people from viewing the content.”
The move follows pressure from both Facebook’s internal and external safety advisors after graphic videos posted by terror groups showing beheadings of hostages and shocking propaganda were shown to users under 18 years old.
The social network permits users to sign up for accounts from the age of 13, but younger users can circumvent age verification to create accounts and view content.
Click to play
The graphic video warnings require the user to click on the video to confirm, after reading the advice, that they still want to see it.
Facebook has faced continual criticism for allowing violent and graphic content to remain on the site because it deems it to be of public interest.
The social network permits videos and images from news reports and documentaries depicting abuse, murders and terrorist activities for instance.
David Cameron called Facebook “irresponsible” for allowing beheading clips to remain available, forcing the social network to place limited captions warning about the content in front of videos depicting a killing in Mexico in 2013. The videos were later removed.
Critics will claim that this is the latest move to blur the line between publisher and service provider: Facebook, which professes to be a neutral platform that is not responsible for user-published content, is beginning to monitor and filter that content.
One of the first videos to be affected contains footage of murdered Paris policeman Ahmed Merabet, who was shot in last week’s Charlie Hebdo attacks.
Facebook recently announced that it had passed 1bn video views per day on its site from its 1.3bn users, as it attempts to challenge Google’s YouTube to become the dominant video platform.
This article was written by Samuel Gibbs for theguardian.com on Tuesday 13 January 2015, 17.16 Europe/London. © Guardian News and Media Limited 2010