In Brief

Facebook leak reveals rules for controversial content

Moderator training manual reveals the fine line determining what can be shown on the social network


Videos depicting gory violence and even deaths may be allowed to remain on Facebook under some circumstances, according to leaked training manuals for the site's moderators.

In an exclusive report, The Guardian reveals the guidelines explicitly detailing the site's response to content containing sex, nudity, violence or death - a balancing act between protecting users from graphic material and maintaining Facebook's status as an open platform which can play a vital part in spreading awareness of issues.

Among them is a proviso stating "videos of abortion only violate if they contain nudity", while another says adult nudity is permissible in content showing Nazi concentration camps.

According to the manual on graphic violence, "sadism" is the key factor in deciding whether violent footage should be removed.

If images or videos show the wounding or death of a person or animal with a title or caption suggesting enjoyment or celebration, the content should be deleted, says Facebook.

However, while videos of violence or death are "disturbing", moderators are told to be aware that they may also be a means of "documenting atrocities" and can "help create awareness" of issues such as animal cruelty or non-sexual child abuse.

Videos in this category are marked as "disturbing" and do not auto-play. They also contain a warning screen and are not accessible to users under the age of 18.

Content related to self-harm must always be flagged up to senior managers, the Guardian reports. A two-week period last summer saw 4,531 such cases, of which 63 prompted Facebook to contact the authorities.

The training manual also says Facebook Live streams showing self-harm or suicide attempts should not be shut down, despite the service being criticised for being used to broadcast sexual assaults, torture, murder and suicide.

Facebook says it is wary of cutting users off from friends who may be able to offer support or intervene to keep them safe in a time of trouble.

"We don't want to censor or punish people in distress who are attempting suicide," says a memo sent to site moderators.

Another document adds: "Users post self-destructive content as a cry for help, and removing it may prevent that cry for help from getting through."

However, moderators are told the content should be deleted once "there is no longer an opportunity to help the person", to minimise the risk of encouraging copycat behaviour, unless it is considered particularly newsworthy, such as footage of 9/11 showing people leaping from the Twin Towers.

The leak "gives new insights into the uncomfortable role the social media giant now plays as a content regulator," says CNN
