Last Updated on January 26, 2026 by Chicago Policy Review Staff
The Kids Online Safety Act (KOSA) recently reemerged as one of Washington’s most closely watched bipartisan tech proposals. Concern on Capitol Hill about children’s and teens’ online safety has grown as research in recent years has increasingly linked social media use to surging rates of adolescent depression, anxiety, eating disorders, and self-harm. In 2021, the Centers for Disease Control and Prevention (CDC) reported that roughly one in five U.S. high school students experienced persistent feelings of sadness or hopelessness, alongside rising emergency-room visits for suspected suicide attempts. While adolescent mental health outcomes are shaped by multiple factors, a 2023 advisory from the U.S. Surgeon General identified social media use as a significant risk factor, citing evidence that increased exposure—particularly to algorithm-driven recommendation feeds—is associated with these harms among young people.
American concern over the problem has mounted in the wake of congressional hearings on social media harms. Whistleblower revelations indicated that platforms’ own internal research had identified links between engagement-driven design choices and harms to young users’ mental health, renewing scrutiny of algorithm-driven recommendation engines that prioritize engagement and can amplify harmful content at scale. Among other requirements, KOSA would require covered platforms to exercise reasonable care in how they design and operate their services to prevent and mitigate a host of harms to minors. Some stakeholders view the legislation as a long-overdue response to Big Tech’s persistent failure to safeguard young users, while others fear the bill could pose new threats to online privacy, free expression, and digital autonomy.
The ongoing debate over KOSA thus reflects a broader dilemma in digital governance: how to protect children from online harms without undermining the open and expressive internet that many young people also need for information, community, and support.
Background
Congress has held numerous hearings over the years on the risks online platforms pose to children. Revelations from Facebook whistleblower Frances Haugen in 2021 regarding Instagram’s own research linking social media to teen depression and eating disorders, for example, sparked a bipartisan “techlash” and a wave of legislative proposals intended to rein in social media harms.
One of the most dramatic moments came during a January 2024 Senate hearing in which senators from both parties grilled Big Tech CEOs over content moderation and addictive design choices. Meta CEO Mark Zuckerberg even issued a public apology. The charged moment reflected rising public frustration and cleared the way for KOSA to resurface in the Senate.
Yet the real challenge is striking the right balance between the moral imperative to protect kids and teens, and the need to preserve their rights to information and autonomy online.
Legislative Journey and Political Support
KOSA is one of the few issues on which privacy-focused progressives and conservatives alarmed by online harms to children find common cause. The bill was originally introduced in 2022 but bogged down in debate over its age verification requirements, which were narrowed and clarified in a mid-2023 compromise version to address concerns about mandatory or intrusive verification methods. By February 2024, 62 senators, including Senate Majority Leader Chuck Schumer, had signed on as co-sponsors, a nearly filibuster-proof majority.
The bill was merged with a modified version of the Children’s Online Privacy Protection Act (COPPA) 2.0, which extends COPPA-like protections to teenagers, to form the Kids Online Safety and Privacy Act (KOSPA). In July 2024, the Senate passed KOSPA by a wide margin. The House had prepared a companion bill, H.R. 7891, but ran out of time before adjournment.
A coalition of parent groups, child psychologists, the NAACP, and even gaming companies such as Nintendo backs the bill, making child protection a rare cause that unifies the left and the right.
Key Provisions of KOSA
KOSA was introduced in 2022 by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN). The bill adopts a “safety-by-design” model, codifying a new “duty of care” that would obligate major social media companies to exercise reasonable care to protect minors from harms such as self-harm, eating disorders, depression, online bullying, and sexual exploitation.
The bill would codify high default privacy settings, ban the sharing of data without clear consent, and enable parents to set screen time limits and block certain content. Teens would also be given simplified privacy settings and the ability to opt out of algorithmic recommendations that can keep them trapped in a spiral of harmful content.
KOSA would not ban particular categories of content or grant the government new censorship authority, as some earlier proposals would have. Rather, the bill focuses on the design and recommendation practices known to make harmful content more likely to spread. Covered platforms would also be required to submit to annual independent audits of their child safety practices and to make anonymized data available to qualified researchers, enabling more academic study of youth and technology.
Advocates of the bill say that these changes would shift responsibility from individual families to the multibillion-dollar corporations whose products are known to affect adolescent behaviors. In effect, the legislation would make Big Tech accountable by design, not just by enforcement.
Concerns Over Privacy and Free Expression
KOSA has been criticized by civil liberties and digital rights organizations. The Electronic Frontier Foundation (EFF) describes it as “a dangerous bill,” arguing that vague language such as “prevent and mitigate harm” will push companies to over-moderate any sensitive content. Critics warn of a chilling effect in which companies block access to health, mental health, and sexual education information and resources to avoid liability. KOSA has been compared to the 2018 Allow States and Victims to Fight Online Sex Trafficking Act and Stop Enabling Sex Traffickers Act (FOSTA-SESTA), which led websites to remove legal content to reduce their legal risk.
The most common examples of over-censorship involve laws or policies that apply to large swaths of the internet and a wide range of online harms, and that are vague in their requirements. In the U.S., for instance, after FOSTA-SESTA was enacted, online platforms deleted or limited legal content about sex education, harm reduction, and LGBTQ+ resources to avoid legal liability. Multiple advocacy groups noted that FOSTA-SESTA disproportionately harmed sex workers and LGBTQ+ youth who relied on online information and communities that had previously been available and legal.
Similar situations have played out elsewhere. In the UK, early enforcement of age-verification and content-filtering requirements under the Online Safety Act led some platforms to block entire categories of content they found difficult to assess, such as mental health forums.
Privacy groups also warn that companies would resort to mass age verification to distinguish adults, who may access mature content, from minors, who must be shielded from it. This could require ID checks, such as government documents or facial scans, that would effectively end user anonymity. The EFF has also warned that such systems would build centralized databases of sensitive information that malicious actors could target. Teenagers themselves have spoken out against the bill, warning that its requirements could limit access to online communities and resources that are often life-saving for vulnerable young people.
These tensions highlight KOSA’s central dilemma: how to ensure young users are protected from online harms without shutting them out from supportive information and spaces.
Revisions and Ongoing Debate
In early 2024, Senators Blumenthal and Blackburn released new amendments in response to many of the criticisms leveled against KOSA. The February 2024 revisions to KOSA removed the provision that would have allowed state attorneys general to enforce the duty-of-care standard and left primary enforcement authority with the Federal Trade Commission. The text also clarified that the duty of care pertains to design features only, rather than content moderation decisions. These adjustments were designed to alleviate concerns that the law could be weaponized for ideological censorship.
The changes eased some opposition. In a joint letter, the LGBTQ+ advocacy groups GLAAD, the Human Rights Campaign, and The Trevor Project announced they would no longer oppose KOSA, saying the February 2024 revisions directly addressed their concerns about potential weaponization against queer youth.
However, groups such as Fight for the Future and the EFF continue to oppose the bill, arguing that without stronger guarantees of “content-neutral” implementation, companies will err on the side of over-filtering content when faced with the risk of regulatory penalties. That KOSA remains controversial even after passing the Senate with broad bipartisan support shows how little cultural consensus exists over the balance between speech and safety online.
Global Context and Future Outlook
A growing number of states, including California, Utah, and Arkansas, have enacted their own online safety laws, while federal initiatives would add to this fragmented patchwork of age-verification and data-privacy regulations. For example, California’s Age-Appropriate Design Code (2022) already mandates privacy-by-default settings for children’s accounts and risk assessments for youth-oriented apps.
Internationally, the United Kingdom’s Online Safety Act (2023) and the European Union’s Digital Services Act (DSA) both impose transparency and risk-mitigation requirements on large platforms. These precedents reflect a shared recognition that laissez-faire digital environments have not adequately protected children, but they also show how easily well-meaning legislation can veer into overreach.
If passed, KOSA would put the United States on track to join this new wave of “child-first” digital governance. But its longevity will hinge on how the legislation is implemented: whether regulators can define “harm” without encroaching on other values, whether companies can balance the costs of compliance with innovation, and whether society can balance the goals of protection and autonomy.
In the end, the Kids Online Safety Act is not just about protecting children; it is about redefining responsibility in the digital era. Congress must decide not only whether but also how to shield the most vulnerable without eroding the open, expressive, and connected internet that young people rely on to thrive.

