A senior executive at Instagram’s parent company has defended the platform’s policies on suicide and self-harming content, telling the inquest into Molly Russell’s death that guidelines had always been drawn up in consultation with experts.
Elizabeth Lagone, head of health and wellness policy at Meta, said the social media group worked “extensively with experts” when writing guidelines that allow users to discuss feelings related to suicide or self-harm.
Molly, 14, from Harrow, north-west London, killed herself in November 2017 after viewing extensive amounts of material on platforms including Instagram relating to suicide, depression, self-harm and anxiety.
Lagone told the North London coroner that at the time of Molly’s death, users were allowed to post content about suicide and self-harm to “facilitate the meeting of support” for other users, but not if it “encouraged or promoted” such acts.
In February 2019, Instagram changed its guidelines to ban “all graphic suicide and self-injurious content”. It still allows users to “talk about their own feelings related to suicide or self-harm”, provided such content is not graphic, promotional or shows methods or materials.
In testimony submitted to the court, Lagone said: “Experts have consistently told us that under the right circumstances, content that touches on suicide and self-harm can be shared in a positive context and can play an important role in destigmatizing mental health difficulties.”
Lagone said suicide and self-harming material could have been posted by a user as a “cry for help”. She also told the court that it was important for the company to “consider the broad and incredible harm that can be done by silencing [a user’s] struggles”. Lagone, who is based in the United States, had been ordered to attend in person by the chief coroner, Andrew Walker.
Oliver Sanders KC, representing the Russell family, asked Lagone whether Instagram had treated young users like Russell as “guinea pigs” when it introduced a system known as content ranking in 2016. Under content ranking, users are sent posts that might be of interest to them, based on factors including what content they like and comment on. Instagram has a minimum age limit of 13 years.
“It’s true, isn’t it, that children, including children suffering from depression like Molly, who was on Instagram in 2016, were just guinea pigs in an experiment?” Sanders asked. Lagone responded: “That’s specifically not the way we develop policies and procedures in the company.”
Addressing Instagram’s policy of banning the glorification of self-harm but allowing users to raise awareness about it, Sanders also asked: “Do you think the average 13-year-old would be able to tell the difference between encouraging and promoting self-harm and raising awareness of self-harm?”
Lagone responded: “I can’t answer that question because we don’t allow content that encourages self-harm.”
The court also heard that Instagram had recommended that Molly follow at least 34 accounts whose handles referred to “sad or depressing” content. Four related to suicidal ideation and two to mortality, with other recommendations relating to self-harm, being close to death and burial.
Earlier on Friday morning, the inquest was shown videos on Instagram “of the most disturbing nature” seen by the teenager before she took her own life.
The court was shown 17 video clips that Molly had saved or liked on Instagram before she died. Walker warned that the footage “seems to glamorise harm to young people” and is “of the most disturbing nature and it is almost impossible to watch”.
The court was then shown a series of graphic video montages depicting people in suicidal situations. The montages were edited to music and some were subtitled with references to suicide. Molly’s family decided to stay in the courtroom as the videos were played, and the coroner chose to take a 15-minute break in proceedings afterwards.
On Thursday, a senior executive at the image-sharing platform Pinterest admitted the platform was “not safe” when Molly Russell used it and apologised for the graphic material the service showed the teenager before her death.
The inquest continues.
In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or by email at pat@papyrus-uk.org, and in the UK and Ireland Samaritans can be contacted on freephone 116 123 or by email at jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is at 800-273-8255, or you can text HOME to 741741 to reach a Crisis Text Line counsellor. In Australia, the crisis support service Lifeline is on 13 11 14. Other international helplines can be found at befrienders.org