In recent months, a series of lawsuits has been filed against major technology companies, alleging that their platforms are contributing to a mental health crisis among children and teenagers. State officials and school districts across the United States have taken legal action against companies like Meta and TikTok, claiming that these platforms are designed to be addictive and expose young users to harmful content.
Arkansas Attorney General Tim Griffin filed a lawsuit against Meta, accusing the company of creating a mental health crisis in the state's communities. The lawsuit alleges that Meta's platforms have addictive features that lead to mental and behavioral issues among users. Similarly, Arkansas filed a separate lawsuit against TikTok, alleging that the company misled users about the age rating of its app and the prevalence of adult content.
These legal actions are part of a broader trend, with nearly 200 school districts joining lawsuits against social media companies like TikTok and Snapchat. The school districts argue that these platforms have harmed students' mental health and distracted them from their studies. The cases have been consolidated in the U.S. District Court in Oakland, California, along with hundreds of lawsuits filed by families alleging harm to their children from social media.
In response to mounting criticism, tech companies have introduced parental control tools to help monitor and limit children's online activities. Meta, for example, has developed features that allow parents to set time limits and monitor their children's use of Instagram. However, reports indicate that these tools are underutilized, with less than 10 percent of teens on Instagram enabling the parent supervision setting by the end of 2022.
The debate over children's online safety has also reached the legislative arena. Senators Richard Blumenthal and Marsha Blackburn reintroduced the Kids Online Safety Act, which aims to impose a duty of care on digital services to prevent certain harms to younger users. The bill has garnered support from over two dozen co-sponsors and is expected to be voted on in the current Senate session.
As the legal and legislative battles continue, the conversation about children's online safety remains a pressing issue. While tech companies emphasize their efforts to protect young users, critics argue that more stringent regulations and oversight are necessary to address the potential harms associated with social media use among children and teenagers.