Facebook took two weeks to remove video of 12-year-old girl who livestreamed her suicide

The Next Web reports on the death of a 12-year-old girl who broadcast her suicide using the Live.me streaming service:

While YouTube promptly removed the unnerving clip and its different versions off its platform citing a violation against its "policy on violent or graphic content," The Washington Post reports the footage continued to show up on various Facebook pages for nearly two weeks before the social media behemoth eventually began wiping out different instances of the video on its website.

The disheartening incident took place on December 30, but by the time Facebook had removed the video off its platform, it had been viewed by thousands of people across the globe.

In the 40-minute livestream, the girl claims that, in addition to struggling with depression, she had been sexually abused by a family member. Then she proceeds to say farewell to her close friends and family, and eventually takes her own life. The last 10 minutes of the livestream show her lifeless body hanging from a tree as the sun sets.

How desperately sad. It's simply impossible to imagine what it must feel like to be a friend or relative of that poor young girl, and know that her last tragic moments were being shared on Facebook.

It won't bring this poor young girl back, but Facebook needs to do more to remove the most disturbing and vile content that is so frequently shared publicly on its platform.

Facebook loves to point out that it has more users than the population of the world's largest countries, but it clearly hasn't put adequate resources into providing the equivalent social services and policing to look after the 1.18 billion people who log in every day.



7 Responses

  1. Dhasaba

    January 17, 2017 at 4:33 pm #

    You could argue that she would be happy with it being spread around. Have you seen the media attention? It's bringing to light mental illness in a whole new way.

    • Graham Cluley in reply to Dhasaba.

      January 17, 2017 at 4:42 pm #

      She's dead – she's not capable of feeling happy or sad any more. :(

      And although it's important for mental illness not to be shoved under the carpet, I think it's also important for us to be very careful about sharing imagery and details which might encourage other vulnerable people to take their own lives.

  2. Bob

    January 17, 2017 at 5:14 pm #

    In the UK it's a criminal offence to distribute an Obscene Publication. If somebody makes a complaint to the police then Facebook can be prosecuted as a body corporate.

  3. Tim

    January 17, 2017 at 9:50 pm #

    I stopped reading right after you listed "The Washington Post" as a source.

    • Chris in reply to Tim.

      January 18, 2017 at 10:15 am #

      But still took the time to scroll down and leave a comment. Good for you Tim.

  4. Lisa B.

    January 18, 2017 at 2:29 pm #

    Facebook is not known for its decency or taste. They routinely ban breastfeeding photos as obscene and, most recently, they censored a photograph of the fountain statue of Neptune, which stands in the Piazza del Nettuno in Bologna (it was 'explicitly sexual' in nature).

    It is sad that it took two weeks for someone in FB to finally realize that this video was explicit and needed to be removed.

    Clearly, there are not enough people with common sense working for FB.

  5. Jason Sheldon

    January 19, 2017 at 2:43 pm #

This isn't anywhere near the disturbing scale of that tragedy, but Facebook needs to be regulated.. it really does. They flout their own guidelines, and are a faceless body against which it is impossible to appeal or enforce complaints.
    I recently saw someone advertising an airgun, on a local sale page (Midlands, UK). I reported it.. as not only did it look like a real hand gun, it's also listed specifically as an item that should not be sold or made available on Facebook. 6 times I reported it, and 6 times, they said it does not go against their community guidelines.. even though it's there in black and white that it clearly does.

    Then recently, I saw an advert appear for "Penis Enlargement" stuff.. No, I don't browse for stuff like that.. looking at "Why am I seeing this?", they were targeting males aged 35-50 in the UK area… but there was a graphic showing two penises – not what I want to see while I'm sat in the window of Costa Coffee…

    I took a screenshot of the advert as an experiment, posted it on my wall and complained that Facebook will ban any photo of a breastfeeding mother, or any tasteful nude photography, but if they can make money from it – they'll happily serve you graphic pictures of engorged cocks.

    Facebook removed my post.. apparently, it goes against their community guidelines…

    THEY SERVED THE ADVERT TO ME IN THE FIRST PLACE!!!

    I had to click a button to agree not to share sexually explicit imagery again.. but they still take the advertising revenue and push this stuff out that goes against their OWN rules.

    People may say it's a free service, don't use it.. Well, that's sensible in theory – but it is so embedded in internet activity these days.. from signing into other websites etc.. I'm locked in by virtue of using Facebook for many things OTHER than grumpy cats.

    Regulate it, mandatory.
