‘I was addicted to social media – now I’m suing Big Tech’

Hundreds of families are suing some of the world’s largest technology companies – which, they say, knowingly exposed children to dangerous products. One of the plaintiffs explains why they are taking on the power of Silicon Valley.

“I was completely trapped by addiction at the age of 12. And I didn’t get my life back during my teenage years.”

Taylor Little’s addiction was social media – an addiction that, over the years, led to depression and suicide attempts.

Taylor, who is now 21 and uses “they” pronouns, described tech companies as “big, bad monsters.”

Taylor believes these companies are intentionally putting highly addictive and damaging products into the hands of children.

That’s why Taylor and hundreds of other American families are suing four of the world’s largest technology companies.

Dangerous by design
The lawsuit against Meta – the owner of Facebook and Instagram – plus TikTok, Google and Snap Inc, the owner of Snapchat, is one of the largest ever brought against Silicon Valley.

Plaintiffs include ordinary families and school districts from across America.

They claim that the platforms were deliberately designed to be dangerous.

The families’ lawyers believe that the case of Molly Russell, a 14-year-old schoolgirl in England, is an important example of the potential dangers faced by teenagers.

Last year they monitored the inquest into her death via video link from Washington, looking for any evidence they could use in their US lawsuit.

Molly’s name is mentioned dozens of times in the main complaint filed in court in California.

Last week, the families involved in the case received a major boost when a federal judge ruled that the companies could not use the First Amendment of the US constitution, which protects free speech, to block the action.

Judge Gonzalez Rogers also ruled that Section 230 of the Communications Decency Act, which says platforms are not publishers, does not provide blanket protection to the companies.

The judge also found that some of the issues the families raised – such as the lack of “robust” age verification and weak parental controls – were not matters of free expression.

An attorney for the families called the ruling a “significant victory.”

The companies say the claims are untrue and they intend to vigorously defend themselves.

‘Like withdrawal’
Taylor, who lives in Colorado, said that before getting their first smartphone they were sporty and outgoing, and enjoyed dancing and theater.

“If my phone got taken away, it felt like withdrawal. It was unbearable. Literally, when I say my phone was addictive, I don’t mean it was habit-forming. I mean my body and mind craved it.”

Taylor remembers the first social media notification they clicked on.

It was someone’s self-harm page, which featured pictures of wounds and cuts.

“As an 11-year-old, I clicked on it and that page was shown to me without warning. No, I didn’t look for it. I didn’t ask for it. I can still see it. I’m 21 years old and I can still see it.”

Taylor also struggled with content surrounding body image and eating disorders.

“It was like a cult. It felt like a cult. You were constantly bombarded with photos of bodies you couldn’t have without dying.

“You can’t avoid it.”

Attorneys for Taylor and the other plaintiffs have taken a new approach to the litigation, focusing on the design of the platforms rather than on individual posts, comments or images.

They claim the apps contain design features that cause addiction and harm.

‘Absolutely not true’
Meta said in a statement: “Our thoughts are with the families represented in this complaint.

“We want to assure every parent that we have their best interests at heart in the efforts we make to provide a safe and supportive online experience for teens.”

TikTok declined to comment.

Google told us: “The allegations in this complaint are untrue. Protecting children across our platforms has always been at the core of our work.”

And Snap says its platform is “designed to remove the pressure to be perfect. We vet all content before it can reach a wide audience to prevent the spread of anything that could be harmful.”

Molly Russell
Taylor knows all about the story of Molly Russell, from northwest London, who took her own life after being exposed to negative and distressing content on Instagram.

Research at