Lawsuit Filed: Popular Game Platform Allegedly “became the gateway enabling multiple adult users to prey on a 10-year-old girl.”

By B.N. Frank

Research continues to confirm that kids’ use of social media is having negative impacts on their overall health and safety (see 1, 2, 3, 4), even leading to some deaths (see 1, 2, 3). Numerous studies have also determined that children are being exploited while using digital games and other online platforms (see 1, 2, 3). Recently, a lawsuit was filed against Roblox for not offering a safe platform for at least one child. Other platforms are also named as defendants in the lawsuit.

From Ars Technica:


Roblox sued for allegedly enabling young girl’s sexual, financial exploitation

Meta, Snap, and Discord are also defendants in the lawsuit.

Ashley Belanger

Through the pandemic, the user-created game platform that’s so popular with kids, Roblox, expanded its user base and decided to go public. Within two years, its value shot from less than $4 billion to $45 billion. Now it’s being sued—along with Discord, Snap, and Meta—by a parent who alleges that during the pandemic, Roblox became the gateway enabling multiple adult users to prey on a 10-year-old girl.

The lawsuit filed Wednesday in the San Francisco Superior Court shows how sexual predators can exploit multiple social platforms at once to cover their tracks while financially and sexually exploiting children. It alleges that, in 2020, Roblox connected a young girl called S.U. with adult men who abused her for months, manipulating her into sending payments using Roblox currency called Robux and inducing her to share explicit photos on Discord and Snapchat through 2021. As the girl grew increasingly anxious and depressed, the lawsuit alleges that Instagram began recommending self-harm content, and ultimately, S.U. had to withdraw from school after multiple suicide attempts.

Like many similar product liability lawsuits that social platforms have recently faced for allegedly addicting children and causing harms, this new lawsuit seeks to hold platforms accountable for reportedly continuing to promote the use of features that tech companies know can pose severe risks for minor users. And S.U.’s guardian, known as C.U. in the lawsuit, wants platforms to pay for profiting off systems that allegedly recklessly engage child users.

The lawsuit says that platforms neglect to prevent predators from accessing minors, and it suggests cheap, simple fixes that platforms overlook because they could limit profits. These suggestions include warning minors about potential predatory engagement, verifying the age of account holders, restricting adult users from messaging minor users, banning adult users who message minors, and preventing minors from circumventing parental oversight by limiting minors’ access to certain features and their ability to create duplicate accounts.

A Roblox spokesperson told Ars, “While we do not comment on pending litigation, Roblox has a safety-first culture and works tirelessly to maintain a platform that is safe and civil for all.” Roblox also says it has a zero-tolerance policy for users “endangering or sexualizing children in any way” and takes “swift action against anyone found to be acting in breach of our Community Standards.”

Discord told Ars it scans every image on its platform to detect child sexual abuse materials and, echoing Roblox, said it has a zero-tolerance policy and takes immediate action when it becomes aware of child endangerment or sexual exploitation. “This includes proactively investigating and banning users, shutting down servers, and making targeted efforts to detect and disable accounts that violate our Terms of Service and Community Guidelines.”

Snap did not immediately provide comment to Ars.

Meta told Ars that it cannot comment on active litigation but says its “deepest sympathies are with anyone affected by these difficult and complex issues.”

Meta has arguably faced the most criticism on this issue, ever since whistleblower Frances Haugen told the US Senate how Facebook knowingly harmed young users. A Meta spokesperson told Ars that the company has implemented changes on Instagram that are similar to, though seemingly weaker than, the changes the lawsuit seeks.

“Teens automatically have their accounts set to private when they join Instagram, adults can’t message teens that don’t follow them, we don’t show accounts belonging to teens to some adults in places where we suggest content, and we have controls designed to limit the types of content teens see,” a Meta spokesperson told Ars.

As a result of S.U.’s experiences on Roblox, Discord, Snapchat, and Instagram, the girl’s guardian, C.U., has since had to quit her “dream job” with the government, sacrificing benefits and a pension, to attend to S.U.’s escalated care needs. So far, C.U. says in the lawsuit that she has gone $10,000 into debt from healthcare co-pays and that S.U. continues to need care for ongoing health issues.

The lawsuit seeks damages from the social platforms to help recover the costs of S.U.’s medical care, as well as monetary damages for S.U.’s future care, C.U.’s lost income and loss of future earning capacity, punitive damages, and more, all to be determined, as they have demanded, by a jury.

“Things only got worse”

The lawsuit says that before S.U. began using Roblox, she was a happy kid who performed well in school. Today, she suffers from harms that C.U. connects directly to her social media use, including “addiction, sleep deprivation, anxiety, depression, self-harm, suicidal ideation, exploitation and abuse, and attempted suicide.”

According to the lawsuit, S.U. started playing games on Roblox, and C.U.’s parents even rewarded her for good behavior by giving the girl Robux (the platform’s currency).

For Christmas in 2019, S.U. got an iPad, and when the pandemic started in 2020, she began spending more time online. Around this time, an 18-year-old man named Charles connected with her on Roblox and asked her to download Discord. She agreed, and over the next few months, just as S.U. was turning 11 years old, “Charles exploited and abused S.U. for months. He encouraged her to drink and take prescription drugs, and manipulated her, and then introduced her to several of his friends, who also manipulated, exploited, and abused S.U.”

Having discovered how easy it was to create a social media account, S.U. also set up Instagram and Snapchat accounts. And just as Charles began his attack, the lawsuit alleges that Instagram and Snap “started identifying and sending” harmful content to S.U. “in significantly higher volumes.” Around the same time, S.U. began “acting out and engaging in self-harming behavior.”

By July, S.U. and Charles no longer spoke, but she subsequently met a 22-year-old man named Matthew, who claimed that “he was a moderator for the Roblox game she was playing” and that he could get her in-game privileges if she sent him the Robux that C.U.’s parents had given the girl. To get the money, Matthew created items and asked S.U. to pay well over their worth. He also falsely promised that he would send S.U. Robux, too.

Like Charles, Matthew invited S.U. to move things to Discord, and then Snapchat. Of the two, Snapchat proved more dangerous for S.U., the lawsuit alleges, as she began hiding evidence of her abuse in her “My Eyes Only” folder on Snapchat. The lawsuit describes this feature as the most harmful because it kept parents from discovering the photos. S.U. could conceal them in “a special tab within Snapchat itself that requires a passcode,” where “content cannot be recovered–even by Snap–without the correct passcode.” Eventually, these child sexual abuse materials were distributed for payment.

“My Eyes Only has no practical purpose or use, other than to hide potentially harmful content from parents and/or legal owners of the devices used to access Snap,” the lawsuit says. “In other words, it is useful only to minors and bad actors.”

S.U.’s first suicide attempts occurred in July and August 2020. For her 12th birthday in March 2021, S.U. planned her third attempt, and C.U. had to take S.U. out of school, as Child Protective Services investigated. No amount of counseling helped, and the lawsuit says this is the point when “things only got worse.”

In June 2021, S.U. was admitted into an in-hospital treatment program, where the lawsuit says she was sexually assaulted by another resident. C.U. quit her job, withdrew S.U. from the program, and sought help for S.U. through 2022, leading S.U. to receive diagnoses for a range of mental health conditions, including clinical depression, anxiety disorder, post-traumatic stress disorder, and borderline personality disorder.

Regulations versus litigation as a solution to the problem

Because of the extreme circumstances, C.U. and S.U. are suing the defendants for violating multiple laws, including unlawful business practices under California’s Unfair Competition Law for developing platforms that are allegedly addictive and exploitative to children. They also have brought claims of invasion of privacy and unjust enrichment, including alleging that platforms like Roblox, Snapchat, and Discord financially benefited from “knowingly assisting, supporting, and facilitating the sexual solicitation and exploitation of S.U. and similarly situated children.”

In particular, they accuse Roblox—which the lawsuit notes markets its coding training to schools, camps, and Girl Scouts, and has a gaming platform user base that’s mostly kids—of failing to update its features as it experienced “explosive growth,” for fear of sacrificing profits.

C.U. and S.U. are hoping the lawsuit will result in changes on Roblox, including restricting adult users from messaging minors and adding parental controls, such as notifications, including the full transcript, when kids receive direct messages from adult users.

Having access to chat transcripts could help parents like C.U. build the increasingly common cases alleging child abuse against social media platforms. The lawsuit notes that Discord has failed to provide transcripts of S.U.’s chat histories, despite automated messaging from Discord saying that S.U. would receive the data within 30 days.

Earlier this year, California passed an online safety bill designed to force social media companies to consider how their platforms harm children and to build adequate protections. However, the lawyers for C.U. and S.U. include attorneys from the Social Media Victims Law Center (SMVLC). That law firm says on its website that civil litigation, more than federal or state regulations, is needed to force accountability onto social platforms. “Until social media companies are forced to include the cost of victim compensation in their financial model, they will have no incentive to curtail their profit margins by designing safer products,” the law firm’s website says.

SMVLC did not provide Ars with comments before deadline.


Activist Post reports regularly about privacy-invasive and unsafe technology. For more information, visit our archives.
