Coroner Calls Teen’s Death “an act of self-harm whilst suffering from depression and the negative effects of online content”

By B.N. Frank

Social media platforms and other popular apps collect personal information on all users, including children (see 1, 2, 3). Research continues to confirm that kids’ use of social media negatively affects their overall health and safety (see 1, 2, 3, 4) and has even contributed to deaths (see 1, 2). Recently, a UK coroner concluded that Instagram use contributed to a teen’s death.

From Ars Technica:


Coroner lists Instagram algorithm as contributing cause of UK teen’s death [Updated]

Meta called content “safe” that UK judge found “impossible to watch.”

Ashley Belanger

In a London court this week, coroner Andrew Walker had the difficult task of assessing a question that child safety advocates have been asking for years: How responsible is social media for the content algorithms feed to minors? The case before Walker involved a 14-year-old named Molly Russell, who took her life in 2017 after she viewed thousands of posts on platforms like Instagram and Pinterest promoting self-harm. At one point during the inquest, Walker described the content that Russell liked or saved in the days before her death as so disturbing that he found it “almost impossible to watch.”

Today, Walker concluded that Russell’s death couldn’t be ruled a suicide, Bloomberg reports. Instead, he described her cause of death as “an act of self-harm whilst suffering from depression and the negative effects of online content.”

Bloomberg reported that Walker came to this decision based on Russell’s “prolific” use of Instagram—liking, sharing, or saving 16,300 posts in the six months before her death—and Pinterest—5,793 pins over the same period—combined with how the platforms tailored content in ways that contributed to Russell’s depressive state.

“The platforms operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text,” which “romanticized acts of self-harm” and “sought to isolate and discourage discussion with those who may have been able to help,” Walker said.

Following Walker’s ruling, Russell’s family issued a statement provided to Ars, calling it a landmark decision and saying that the court didn’t even review the most disturbing content that Molly encountered.

“This past fortnight has been particularly painful for our family,” the Russell family’s statement reads. “We’re missing Molly more agonizingly than usual, but we hope that the scrutiny this case has received will help prevent similar deaths encouraged by the disturbing content that is still to this day available on social media platforms including those run by Meta.”

Bloomberg reports that the family’s lawyer, Oliver Sanders, has requested that Walker “send instructions on how to prevent this happening again to Pinterest, Meta, the UK government, and the communications regulator.” In their statement, the family pushed UK regulators to quickly pass and enforce the UK Online Safety Bill, which The New York Times reported could institute “new safeguards for younger users worldwide.”

Pinterest and Meta mounted different defenses

During the inquest, Pinterest and Meta took different approaches to defending their policies. Pinterest apologized, saying that at the time it lacked the technology it now has to more effectively moderate the content Molly was exposed to. But Meta’s head of health and well-being, Elizabeth Lagone, frustrated the family by telling the court that the content Molly viewed was considered “safe” by Meta’s standards.

“We have heard a senior Meta executive describe this deadly stream of content the platform’s algorithms pushed to Molly, as ‘SAFE’ and not contravening the platform’s policies,” the Russell family wrote in their statement. “If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive.”

A Meta spokesperson told Bloomberg that the company is “committed to ensuring that Instagram is a positive experience for everyone, particularly teenagers,” promising to “carefully consider the Coroner’s full report when he provides it.”

Molly’s family made it a point to praise Pinterest for its transparency during the inquest, urging other social media companies to look to Pinterest as a model when dealing with anyone challenging content policy decisions.

“For the first time today, tech platforms have been formally held responsible for the death of a child,” the Russells’ statement said. “In the future, we as a family hope that any other social media companies called upon to assist an inquest follow the example of Pinterest, who have taken steps to learn lessons and have engaged sincerely and respectfully with the inquest process.”

Bloomberg reported that Pinterest has said that “Molly’s story has reinforced our commitment to creating a safe and positive space for our pinners.” In response to the ruling, Pinterest said it has “continued to strengthen” its “policies around self-harm content.”

Neither Pinterest nor Meta immediately responded to Ars’ request for comment. [Update: Pinterest told Ars that its thoughts are with the Russell family, saying it has listened carefully to the court and the family throughout the inquest. According to Pinterest, it is “committed to making ongoing improvements to help ensure that the platform is safe for everyone” and internally “the Coroner’s report will be considered with care.” Since Molly’s death, Pinterest said it has taken steps to improve content moderation, including blocking more than 25,000 self-harm related search terms and, since 2019, has combined “human moderation with automated machine learning technologies to reduce policy-violating content on the platform.”]

Meta continues to face criticism

“Why on earth are you doing this?” the Russell family’s lawyer, Sanders, shouted at Meta executive Lagone during the inquest, after she provided her controversial testimony deeming the content Molly viewed “safe.”

During this heated moment, Meta’s lawyer had to prompt Walker to calm Sanders down. The tension persists, though, as the family remains critical of Meta’s stance in court that the company cannot moderate all content promoting self-harm. Lagone told the court that doing so would threaten free speech by preventing vulnerable people from posting about, and seeking help for, issues related to self-harm.

When Sanders directly asked Lagone if such otherwise permissible content was safe for children to view, Lagone replied, “I think it is safe for people to be able to express themselves.” Walker then had to intervene and push Lagone to state whether she thought the content was safe, to which she replied, “Yes, it is safe.”

Walker has since concluded that companies like Meta played a role in Molly’s death by feeding her self-harm-related content that the court deemed unsafe given the teen’s mental state.

In a statement provided to Ars, Merry Varney, a partner at the law firm Leigh Day, which represented the family, said that, unlike Pinterest, Meta was not forthcoming when the family requested some of the most damning evidence in their case. The family said it was only because Walker’s inquest was so robust, requiring companies to share so much information, that they secured this rare ruling.

“The battles bereaved families face when seeking answers from social media companies are immense and even with the Senior Coroner taking a robust approach, it was only in August this year that Meta provided over 1,200 Instagram posts Molly engaged with, less than a month before the inquest started,” Varney said. “This included some of the most distressing videos and posts that Molly engaged with.”

The board of trustees for the Molly Rose Foundation—founded by the family to help other young people in distress—have joined the family in calling for the Online Safety Bill to pass.

“It’s almost five years since Molly died and we are still waiting for the promised government legislation,” the board said in a statement provided to Ars. “We can’t wait any longer; don’t aim for perfection, too many lives are at risk. The regulatory structure can be perfected in the months and years to come.”

However, as Bloomberg previously reported, the Online Safety Bill “could now be at risk of being altered” due to concerns that seem to echo Meta’s worries about moderating all content related to self-harm: “that some clauses risked stifling free speech.”

Until regulations change, courts remain the battleground for families seeking to hold social media companies accountable whenever child safety issues arise. The Russell family thanked Walker for his decision, saying that they “hope the data gathered may prove useful beyond this courtroom and continue to help create a safer web.”


Research has also determined that children’s use of and exposure to digital and wireless technology are harmful to their health and, of course, everybody else’s (see 1, 2, 3, 4, 5, 6, 7, 8, 9). But I digress…

American tech insiders (aka “Silicon Valley Parents”) have for years taken significant measures to limit their own kids’ use of and exposure to screens (see 1, 2), and other parents seem to be catching on.

Activist Post reports regularly about social media and unsafe technology. For more information, visit our archives.
