In Brief

Facebook leak reveals rules for controversial content

Moderator training manual unveils fine line deciding what can be shown on the social network


Videos depicting gory violence and even deaths may be allowed to remain on Facebook under some circumstances, according to leaked training manuals for the site's moderators.

In an exclusive report, The Guardian reveals the guidelines explicitly detailing the site's response to content containing sex, nudity, violence or death - a balancing act between protecting users from graphic material and maintaining Facebook's status as an open platform which can play a vital part in spreading awareness of issues.

Among them is a proviso stating "videos of abortion only violate if they contain nudity", while another says adult nudity is permissible in content showing Nazi concentration camps.

According to the manual on graphic violence, "sadism" is the key factor in deciding whether violent footage should be removed.

If images or videos show the wounding or death of a person or animal with a title or caption suggesting enjoyment or celebration, the content should be deleted, says Facebook.

However, while videos of violence or death are "disturbing", moderators are told to be aware that they may also be a means of "documenting atrocities" and can "help create awareness" of issues such as animal cruelty or non-sexual child abuse.

Videos in this category are marked as "disturbing" and do not auto-play. They also contain a warning screen and are not accessible to users under the age of 18.

Content related to self-harm must always be flagged up to senior managers, the Guardian reports. A two-week period last summer saw 4,531 such cases, of which 63 prompted Facebook to contact the authorities.

The training manual also says Facebook Live streams showing self-harm or suicide attempts should not be shut down, despite the service being criticised for being used to broadcast sexual assaults, torture, murder and suicide.

Facebook says it is wary of cutting users off from friends who may be able to offer support or intervene to keep them safe in a time of trouble.

"We don't want to censor or punish people in distress who are attempting suicide," says a memo sent to site moderators.

Another document adds: "Users post self-destructive content as a cry for help, and removing it may prevent that cry for help from getting through."

However, moderators are told the content should be deleted once "there is no longer an opportunity to help the person", in order to minimise the risk of encouraging copycat behaviour - unless it is considered particularly newsworthy, such as footage of people leaping from the Twin Towers on 9/11.

The leak "gives new insights into the uncomfortable role the social media giant now plays as a content regulator," says CNN
