Instagram representative says suicidal content can help support social media users during inquest into 14-year-old Molly Russell’s death


An inquest into the death of London teenager Molly Russell has seen an Instagram executive defend the sharing of suicidal content on social media, claiming that it helps people to “share feelings and express themselves”.

Per The Telegraph, Elizabeth Lagone, head of health and wellbeing at Meta – Instagram’s parent company – gave evidence on 23 September, saying that Instagram allows certain content because it is “posted in order to create awareness”, so that people can “come together for support” or “talk about their own experience”.

Representatives from both Pinterest and Meta flew to the UK to give evidence at the inquest, and both issued a formal apology to Molly’s family.

Molly, who was just 14 when she took her own life, had viewed thousands of disturbing posts via social media in the months leading up to her death.

Oliver Sanders KC, representing the Russell family, challenged Lagone repeatedly on whether a child would be able to tell the difference between “content that encourages or raises awareness” of suicide and self-harm, according to The Telegraph.

Lagone replied: “I really can’t answer that question because we don’t allow content that encourages self-injury.”

14-year-old Molly Russell took her own life in 2017 (Image: Russell Family)

She added that it was important for Meta to consider “the broad and unbelievable harm” that could be caused by silencing a poster “when talking about their troubles”.

The court was shown Instagram’s guidelines at the time of Molly’s death, which said that users were allowed to post content about suicide and self-harm to “facilitate the coming together to support” other users but not if it “encouraged or promoted” it.

The inquest also saw some of the disturbing video content that Molly consumed before her death – which depicted incidents of self-harm and suicide – as well as the ‘recommended’ accounts she was encouraged to follow, seven per cent of which were “sad or depressive related”.

A Meta spokesperson told GLAMOUR: “Our deepest sympathies remain with Molly’s family and we will continue to assist the Coroner in this inquest. These are incredibly complex issues. We’ve never allowed content that promotes or glorifies suicide and self harm and, since 2019 alone, we’ve updated our policies, deployed new technology to remove more violating content, shown more expert resources when someone searches for, or posts, content related to suicide or self-harm, and introduced controls designed to limit the types of content teens see.

“We continue to improve the technology we use, and between April and June 2022, we found and took action on 98.4% of suicide or self-harm content identified on Instagram before it was reported to us, up from 93.8% two years ago. We’ll continue to work closely with independent experts, as well as teens and parents, to help ensure our apps offer the best possible experience and support for teens.”
