Coroner issues warning as Instagram videos of ‘most harrowing nature’ are shown at Molly Russell’s inquest

Videos shown at the inquest into the death of schoolgirl Molly Russell were so troubling that the coroner considered editing them and issued the “biggest” warning before they were played.

The 14-year-old, from Harrow in north-west London, ended her life in November 2017 after viewing online content about self-harm, depression and suicide.

An inquest into her death at North London Coroner’s Court was shown 17 clips she liked or saved on Instagram which appeared to “glamorise harm to young people”.

Before the clips were played, Coroner Andrew Walker asked those present to leave if they were likely to be affected by the material.

The court was told lawyers and the coroner had discussed whether they should be redacted beforehand because they were “so unpleasant to watch”.

“But Molly had no such choice, so we would effectively edit the footage for adult viewing when it was available in an unedited form for a child,” Mr Walker said.

Describing footage the court was about to see, the coroner said: “It is of the most disturbing nature and it is almost impossible to watch.

“If you are likely to be influenced by such videos, please do not stay here to watch them.”

The coroner turned to Molly’s family and said, “There’s no reason for any of you to stay.

“In my opinion, this sequence of video footage should be seen [by the court].”


The court was then played the clips, which related to suicide, drugs, alcohol, depression and self-harm.

Molly’s family remained in the courtroom while the videos were played, but the coroner adjourned for a 15-minute break afterwards.

The schoolgirl’s family has been campaigning for better online safety since her death almost five years ago.

Instagram’s guidelines at the time, which were shown to the court, said users were allowed to post content about suicide and self-harm to “facilitate the gathering of support” for other users, but not if it “encourages or promotes” self-harm.

On Friday, the head of health and wellbeing at Instagram’s parent company Meta defended the social media platform’s content policies – saying suicide and self-harm material could have been posted by a user as a “cry for help”.

Elizabeth Lagone, head of health and wellbeing at Instagram’s parent company Meta, defended the social media platform’s content policies

(Beresford Hodge/PA)

Elizabeth Lagone told the court the company considered it important, even in its policies at the time of Molly’s death, to “consider the broad and incredible harm that can be done by silencing (a poster’s) struggles”.

Ms Lagone also denied that Instagram had treated children like Molly as “guinea pigs” when it launched content ranking – a new algorithm-driven system for personalising and sorting content – in 2016.

Molly’s family’s lawyer, Oliver Sanders KC, said: “It’s true, isn’t it, that children, including children suffering from depression like Molly, who were on Instagram in 2016, were just guinea pigs in an experiment?”

She replied: “That is specifically not the way we develop policies and procedures in the company.”

Asked by Mr Sanders whether it was obvious that it was not safe for children to see “graphic suicide images”, the director said: “I don’t know … those are complicated questions.”

Mr Sanders drew the witness’s attention to experts who had advised Meta that it was not safe for children to view the material, before asking: “Had they previously told you otherwise?”

Molly Russell’s father Ian Russell (centre), mother Janet Russell (right) and her sister (left) arrive at Barnet Coroner’s Court on the first day of the inquest into her death

(Kirsty O’Connor/PA)

Ms Lagone replied: “We have ongoing discussions with them, but there are any number of … issues we are talking about with them.”

The court heard Molly created an Instagram account in March 2015, when she was 12, and was recommended 34, “possibly more”, sad or depression-related accounts.

Of the accounts recommended, Mr Sanders said one referred to self-harm, one to concealment, four to suicidal feelings, one to themes of being “unable to go on”, two to mortality and one to burial.

On Thursday, Pinterest’s head of community operations, Judson Hoffman, apologised after admitting the platform was “not safe” when the 14-year-old used it.

Mr Hoffman said he “deeply regrets” posts Molly saw on Pinterest before her death, describing them as material he “would not show my children”.

The inquest, which is expected to last up to two weeks, continues.

If you are experiencing feelings of distress and isolation, or struggling to cope, the Samaritans offer support; you can speak to someone free of charge over the phone, in confidence, on 116 123 (UK and ROI), email jo@samaritans.org or visit the Samaritans website to find information about your nearest branch.

For services local to you, the national mental health database – Hub of Hope – allows you to enter your postcode to search for organisations and charities offering mental health advice and support in your area.

Additional reporting from the Press Association
