
By Joel Rosenblatt and Cecilia D’Anastasio | Bloomberg

Roblox Corp. and Discord Inc. are the latest targets in a spate of lawsuits over social media addiction, in a case alleging an 11-year-old girl was exploited by a sexual predator while playing online video games.

The girl met adult men through the direct messaging services of Roblox and Discord, platforms she believed had safeguards to protect her, according to a statement from her attorneys at Seattle’s Social Media Victims Law Center, which has filed numerous other addiction lawsuits.

Wednesday’s complaint, filed in San Francisco state court, also blames Snap Inc. and Meta Platforms Inc. for the girl’s mental health problems and suicide attempts.

“These men exploited her sexually and financially,” the group said. “They also introduced her to the social media platforms Instagram and Snapchat, to which she became addicted.”

A Discord spokesperson declined to comment on pending lawsuits, but said the company has “a zero-tolerance policy for anyone who endangers or sexualizes children.”

“We are working relentlessly to keep this activity out of our service and take immediate action when we become aware of it,” the company said in a statement, adding that it uses a technology called PhotoDNA to find and remove images of child exploitation and engages with government agencies where appropriate.

Meta declined to comment on the suit. Roblox and Snap did not immediately respond to requests for comment.

Meta and Snap have previously said they are working to protect their youngest users, including by providing resources on mental health topics and enhancing safeguards to stop the spread of harmful content.

More than 80 lawsuits have been filed this year against Meta, Snap, ByteDance Inc.’s TikTok and Alphabet Inc.’s Google by adolescents and young adults claiming that they suffered from anxiety, depression, eating disorders and insomnia after becoming addicted to social media. In at least seven cases, the plaintiffs are the parents of children who have died by suicide.

Discord is a gaming chat app with 150 million monthly active users. Popular with young people, Discord was once known online as a kind of wild-west space, though the company has stepped up its moderation efforts over the past two years. In 2022, at least half a dozen cases involving child sexual abuse or child sexual abuse material cited Discord, according to a Bloomberg News search of Justice Department records.

Roblox is a gaming platform with over 203 million monthly active users, many of whom are children. Young players have been introduced to extremists on the platform, who may move conversations to other services, such as Discord or Skype. Roblox has robust moderation efforts, including scanning text chat for inappropriate words and reviewing any virtual image uploaded to the game.

The girl in the lawsuit, identified only by the initials SU, and her family are trying to hold the social media companies financially responsible for the damage they allegedly caused. The family also wants a court order instructing the platforms to make their products more secure, which can be done through existing technologies and at minimal time and cost to the companies, according to the Social Media Victims Law Center.

SU said that shortly after getting an iPad for Christmas at age 10, a man named Charles “befriended” her on Roblox and encouraged her to drink alcohol and take prescription drugs.

Later, encouraged by the men she met on Roblox and Discord, SU opened Instagram and Snapchat accounts, which she initially hid from her mother, according to the complaint.

Although she was not yet 13 — the minimum age for accounts on Instagram and Snapchat according to their terms of service — SU became so addicted to the platforms that she would sneak online in the middle of the night, robbing her of sleep, the complaint said.