Maryland school district sues Meta, Google, and TikTok over ‘mental health crisis’

A Maryland school district is suing Meta, Google, Snap, and TikTok owner ByteDance for allegedly contributing to a “mental health crisis” among students. A lawsuit filed by the Howard County Public School System on Thursday claims the social networks operated by these companies are “addictive and dangerous” products that have “rewired” the way kids “think, feel, and behave.”
The lawsuit cites a laundry list of features on Instagram, Facebook, YouTube, Snapchat, and TikTok that it says harm kids. Those include the allegedly addictive “dopamine-triggering rewards” on each app, such as TikTok’s For You page, which leverages data about user activity to serve an endless stream of suggested content. It also points to Facebook and Instagram’s recommendation algorithms and “features that are designed to create harmful loops of repetitive and excessive product usage.”
Additionally, the school district accuses each platform of encouraging “unhealthy, negative social comparisons, which in turn cause body image issues and related mental and physical disorders” in kids. Other parts of the lawsuit target “defective” parental controls in each app, along with safety gaps that it alleges promote child sexual exploitation.
“Over the past decade, Defendants have relentlessly pursued a strategy of growth-at-all costs, recklessly ignoring the impact of their products on children’s mental and physical health,” the lawsuit states. “In a race to corner the ‘valuable but untapped’ market of tween and teen users, each Defendant designed product features to promote repetitive, uncontrollable use by kids.”
The Howard County Public School System is far from the only school district to take legal action against social media companies recently. In addition to two other Maryland school districts, school systems in Washington state, Florida, California, Pennsylvania, New Jersey, Alabama, Tennessee, and elsewhere have filed similar lawsuits over the negative effects of social media on kids’ mental health.
“We’ve invested in technology that finds and removes content related to suicide, self-injury or eating disorders before anyone reports it to us,” Antigone Davis, Meta’s head of safety, says in an emailed statement to The Verge. “These are complex issues, but we will continue working with parents, experts and regulators such as the state attorneys general to develop new tools, features and policies that meet the needs of teens and their families.”
Google denies the allegations outlined in the lawsuit, with company spokesperson José Castañeda saying in a statement to The Verge, “In collaboration with child development specialists, we have built age-appropriate experiences for kids and families on YouTube, and provide parents with robust controls.” Meanwhile, Snap spokesperson Pete Boogaard says that the company “vet[s] all content before it can reach a large audience, which helps protect against the promotion and discovery of potentially harmful material.” ByteDance didn’t immediately respond to The Verge’s request for comment.
Critics have drawn attention to social media’s potential impact on children and teenagers, particularly after Facebook whistleblower Frances Haugen came forward with a trove of internal documents indicating Meta knew about the potential harm Instagram caused some young users. Last week, US Surgeon General Dr. Vivek Murthy issued a public advisory warning that social media poses a “profound risk of harm to the mental health and well-being of children and adolescents.”
Some states have responded to the safety concerns surrounding social media by enacting laws that restrict kids from signing up for social media sites. Utah will bar children under the age of 18 from using social media without parental consent starting next year, and Arkansas has passed similar legislation preventing underage kids from signing up for social networks. At the same time, a flurry of national online safety bills, some of which could implement some form of online age verification, has made its way to Congress despite warnings from civil liberties and privacy advocates.