
Editorial: Congress, It’s Time to Prioritize Youth Safety

For years, digital platforms have prioritized profit while sacrificing the mental health and safety of users. The Kids Online Safety Act, or KOSA, is a necessary step toward holding platforms accountable for protecting their youngest customers and improving online safety. 

If passed, KOSA will require companies to reduce harm by mandating parental controls and default privacy settings for minors. The bill also instructs social media platforms to increase transparency for parents and policymakers while reducing features that contribute to addiction and exposure to harmful content.

The bill would mandate that platforms “exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate harms” like mental health disorders and online harassment. Social media is widely regarded as a leading driver of the youth mental health crisis, and KOSA would be the first federal law to address it.

KOSA was introduced in 2022 to address the U.S. government’s growing concerns about social media’s harm to youth. The bill passed the Senate 91-3 in July 2024 and is now being debated in the House.

Congress must prioritize passing KOSA to curb social networks’ detrimental effects on youth. Congressional testimony in 2021 revealed that Facebook knew its algorithms were linked to rising rates of youth suicide, depression, and anxiety.

Internal research has repeatedly shown that social media companies are aware of the harms of their design features but continue developing them to maximize profits. Leaked Facebook research from 2019, for example, showed that the like button causes “stress and anxiety” among its youngest users while improving ad revenue through increased engagement.

Many popular social media platforms are now facing lawsuits for knowingly profiting from harmful features.

Attorneys general in 13 states are suing TikTok for falsely claiming the app is safe for young users. Internal documents leaked from these suits revealed a multitude of ways that TikTok placed its profits before users’ safety. TikTok’s own research found that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety” and that the product “has baked into it compulsive use.”

TikTok recently rolled out seemingly well-intentioned attempts to reduce teens’ time on the app, like screen time alerts after an hour of daily use and videos recommending they take a break from scrolling. The success of these features, however, was measured by how much they improved “public trust in the TikTok platform via media coverage,” appealing to investors, rather than by whether they lowered users’ screen time or improved their well-being.

Percent of 12- to 17-year-olds in the U.S. with major depression; note that social media use also began growing significantly in 2012. Graph courtesy of Jon Haidt.

Because children and adolescents’ brains are still developing, they struggle to regulate their screen time effectively and are more susceptible to harmful elements. Platforms’ calculated attempts to hook users temporarily activate the brain’s reward center and release dopamine while ultimately hurting users’ self-image and increasing feelings of loneliness.

The features most successful at drawing users in, like temporary stories, have been copied by competing platforms over time. When Instagram found that its chronological feed was less enticing than TikTok’s algorithm, for example, it switched to an algorithmic feed. The adoption of these profitable, alluring, and harmful design elements is a feature, not a bug, of the Big Tech arms race for attention.

KOSA will force these companies to reckon with the effects of these features on their youngest users by holding them liable for failing to exercise reasonable care. The “duty of care” provision of the bill will require tech companies to restrict addictive features, including infinite scrolling, personalized recommendation systems, and appearance-altering filters, for minors.

Another key provision of KOSA is “safeguards for minors,” which mandates that platforms offer tools for minors and guardians to control their communication, privacy settings, and access to features that may encourage dangerous or addictive behavior. This would include the ability for parents to “manage a minor’s privacy and account settings,” “restrict purchases,” and “view metrics of total time spent” on a platform.

In 2023, the FBI gathered reports of online sextortion of minors, 20 of which resulted in suicide, and found that Snapchat was the leading platform used to recruit victims of sex trafficking, followed by Facebook and Instagram. New Mexico Attorney General Raúl Torrez filed a lawsuit against Snapchat over its misleading disappearing images, alleging they allow predators to “easily target children through sextortion schemes and other forms of sexual abuse.” KOSA’s default safeguard settings, like private accounts and limited direct messaging, would force platforms to protect minors from these harms.

Kids Online Safety Act infographic. Celeste Zucker / M-A Chronicle.

Free speech advocacy groups worry that KOSA could “silence important online conversations for all ages” or “curtail sensitive or controversial content,” violating the First Amendment.

KOSA, however, does nothing to restrict what people of any age can say or search. Before passing the Senate, KOSA was amended to further ensure the law couldn’t be used to infringe on users’ speech. It now explicitly states that the government can’t require a platform to prevent a minor from searching for information on “the prevention or mitigation of the harms.”

Most importantly, unrestricted access to social media content is not a constitutional right to begin with—especially for children, who already merit extra legal protection in areas like child labor.

Passing KOSA is an essential next step in holding platforms legally accountable for design features proven to hurt young users. It improves safety through better privacy controls and limits addictive features without infringing upon free speech or access to important online resources. After years of tweaking algorithms to better glue users to their phones, we must confront the fact that much of the advancement of these technologies is not a sign of progress, but of unregulated power.

Rose Chane and Celeste Zucker were the lead authors of this article.

The Editorial Board is made up of Celine Chien, Tessa Ellingson, Ameya Nori, Lindsay Park, Ben Siegel, and Celeste Zucker. It represents the general consensus of the staff.
