Weighing the harms —

The Kids Online Safety Act isn’t all right, critics say

Critics warn KOSA could trigger widespread censorship, privacy concerns.


Debate continues to rage over the federal Kids Online Safety Act (KOSA), which seeks to hold platforms liable for feeding harmful content to minors. KOSA is lawmakers' answer to whistleblower Frances Haugen's shocking revelations to Congress. In 2021, Haugen leaked documents and provided testimony alleging that Facebook knew that its platform was addictive and was harming teens—but blinded by its pursuit of profits, it chose to ignore the harms.

Sen. Richard Blumenthal (D-Conn.), who sponsored KOSA, was among the lawmakers stunned by Haugen's testimony. He said in 2021 that Haugen had showed that "Facebook exploited teens using powerful algorithms that amplified their insecurities." Haugen's testimony, Blumenthal claimed, provided "powerful proof that Facebook knew its products were harming teenagers."

But when Blumenthal introduced KOSA last year, the bill faced immediate and massive blowback from more than 90 organizations—including tech groups, digital rights advocates, legal experts, child safety organizations, and civil rights groups. These critics warned lawmakers of KOSA's many flaws, but they were most concerned that the bill imposed a vague "duty of care" on platforms that was "effectively an instruction to employ broad content filtering to limit minors’ access to certain online content." The fear was that the duty of care provision would likely lead platforms to over-moderate and imprecisely filter content deemed controversial—things like information on LGBTQ+ issues, drug addiction, eating disorders, mental health issues, or escape from abusive situations.

So, lawmakers took a red pen to KOSA, which was reintroduced in May 2023 and amended this July, striking out certain sections and adding new provisions. KOSA supporters claim that the changes adequately address critics' feedback. These supporters, including tech groups that helped draft the bill, told Ars that they're pushing for the amended bill to pass this year.

And they might just get their way. Some former critics seem satisfied with the most recent KOSA amendments. LGBTQ+ groups like GLAAD and the Human Rights Campaign removed their opposition, Vice reported. And in the Senate, the bill gained more bipartisan support, attracting a whopping 43 co-sponsors from both sides of the aisle. Given that momentum, it appears increasingly likely that the bill could pass soon.

But should it?

Not all critics agree that recent changes to the bill go far enough to fix its biggest flaws. In fact, the bill's staunchest critics told Ars that the legislation is incurably flawed—due to the barely changed duty of care provision—and that it still risks creating more harm than good for kids.

These critics also warn that all Internet users could be harmed, as platforms would likely start to censor a wide range of protected speech and limit user privacy by age-gating the Internet.

“Duty of care” fatally flawed?

To address some of the criticisms, the bill was amended to clarify that platforms aren't required to block kids from accessing "resources for the prevention or mitigation of" any harms described under the duty of care provision. That theoretically means that if KOSA passes, kids should still be able to access online resources to deal with:

  • Mental health disorders, including anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors.
  • Patterns of use that indicate or encourage addiction-like behaviors.
  • Physical violence, online bullying, and harassment of the minor.
  • Sexual exploitation and abuse.
  • Promotion and marketing of narcotic drugs, tobacco products, gambling, or alcohol.
  • Predatory, unfair, or deceptive marketing practices, or other financial harms.

Previously, this section describing harmful content was slightly more vague and only covered minors' access to resources that would help them prevent and mitigate "suicidal behaviors, substance use, and other harms."

Irene Ly—who serves as counsel on tech policy for Common Sense Media, a nonprofit that provides age ratings for tech products—helped advise KOSA authors on the bill's amendments. Ly told Ars that these changes to the bill have narrowed the duty of care enough and reduced "the potential for unintended consequences." Because of this, "support for KOSA from LGBTQ+ groups and policymakers, including openly gay members of Congress" was "significantly boosted," Ly said.

But the duty of care section didn't really change that much, KOSA critics told Ars, and Vice reported that many of KOSA's supporters are far-right, anti-LGBTQ organizations seemingly hoping for a chance to censor a broad swath of LGBTQ+ content online.

The only other KOSA change was to strengthen the "knowledge standard" so that platforms can only be held liable for violating KOSA if they don't make "reasonable efforts" to mitigate harms when they know that kids are using their platforms. Previously, platforms could be found liable if a court ruled that platforms "reasonably should know" that there are minors on the platform.

Joe Mullin, a policy analyst for the Electronic Frontier Foundation (EFF)—a leading digital rights nonprofit that opposes KOSA—told Ars that the knowledge standard change is "actually positive" because it's "tightened up a bit." Still, the revised KOSA "doesn't really solve the problem" that many critics still have with the legislation.

"I think the duty of care section is fatally flawed," Mullin told Ars. "They really didn't change it at all."

In a letter to lawmakers last month, legal experts with the think tank TechFreedom similarly described the duty of care section as "incurably flawed" and cautioned that it seemingly violates the First Amendment. The letter urged lawmakers to reconsider KOSA's approach:

The unconstitutionality of KOSA’s duty of care is highlighted by its vague and unmeetable nature. Platforms cannot 'prevent and mitigate' the complex psychological issues that arise from circumstances across an individual’s entire life, which may manifest in their online activity. These circumstances mean that material harmful to one minor may be helpful or even lifesaving to another, particularly when it concerns eating disorders, self-harm, drug use, and bullying. Minors are individuals, with differing needs, emotions, and predispositions. Yet KOSA would require platforms to undertake an unworkable one-size-fits-all approach to deeply personal issues, thus ultimately serving the best interests of no minors.

ACLU's senior policy counsel Cody Venzke agrees with Mullin and TechFreedom that the duty of care section is still the bill's biggest flaw. Like TechFreedom, Venzke remains unconvinced that changes are significant enough to ensure that kids won't be cut off from some online resources if KOSA passes in its current form.

"If you take the broad, vague provisions of the duty of care, platforms are going to end up taking down the bad as well as the good, because content moderation is biased," Venzke told Ars. "And it doesn't do youths any good to know that they can still search for helpful resources, but those helpful resources have been swept up in the broad takedowns of content."


Will KOSA require age verification?

KOSA supporters told Ars that the bill specifically does not require platforms to verify ages of users. Josh Golin, executive director at Fairplay—a nonprofit child advocacy organization that helped draft KOSA—told Ars that "contrary to KOSA's critics' claims, the legislation does not require content takedowns, age verification, or government IDs to use the Internet."

"In response to good-faith criticism of the 2022 version of KOSA, the bill has been considerably improved and, as a result, opposition from civil society has decreased significantly," Golin told Ars. "Perhaps that's why the remaining opponents are so desperate to scare people with doomsday scenarios, rather than engaging with the actual text of the bill."

But critics who feel the text is still too broad said that platforms could be more at risk for lawsuits if they don't verify users' ages.

Mullin told Ars that without age verification, it's still unclear what standard platforms could use to avoid liability. Further, Mullin is concerned that platforms are not being size-gated under KOSA, which means any platform of any size could be liable and thus feel pressured to verify users' ages. Until these points are clarified, Mullin's not sure how KOSA wouldn't lead to a future where the whole Internet is age-gated.

"Is it good enough to say, 'Hey, if you're under 18, don't come here' or 'Click to confirm you're 18 or over?'" Mullin asked. "Will that be good enough?"

In TechFreedom's letter to lawmakers, the think tank predicted that KOSA would "force platforms to age-verify users" because it's the most risk-averse path to compliance.

"While doubtless well-intentioned, these changes merely trade a clear, explicit mandate for a vague, implicit one; the unconstitutional effect on anonymous expression will be the same," TechFreedom's letter said.

As the KOSA debate remains heated, Golin and Ly questioned the motives of some of KOSA's loudest critics like TechFreedom. Golin told Ars that because the think tank receives substantial financial contributions from Google and Meta, he thinks TechFreedom is motivated "to defend the status quo" and defeat KOSA.

TechFreedom's president Berin Szóka and free speech counsel Ari Cohn authored the letter to lawmakers. "It's an old rule of Washington that those who have no substantive response to critiques of legislative text fall back on impugning motives," Szóka told Ars. "That's especially true when bills like KOSA haven't been vetted properly in hearings."

Cohn told Ars that he welcomes "anyone who disagrees with our analyses to engage on the substance of those disagreements." He thinks "that would be a productive conversation well worth having."

"The principles that make KOSA obviously unconstitutional are well-settled law," Cohn said. "Effectively ending anonymous speech online and imposing a duty on platforms to protect us from ideas is not bad because it threatens the 'status quo' of major platforms. It is bad because it threatens the First Amendment rights of every American who speaks or receives information online—and that is a sacrifice that Congress is not authorized to make."

Cohn also balked at criticism that TechFreedom is only upholding the status quo in this debate. He said that TechFreedom's "fidelity is to the law," not to platforms.

"Our principles, legal analysis, and positions are not for sale," Cohn said.

Critics: KOSA promotes censorship, not privacy

Evan Greer, deputy director of the nonprofit digital rights advocacy group Fight for the Future, told Ars that the organization "strongly supports strict regulation of Big Tech companies," but as far as Fight for the Future can tell, KOSA is not the privacy bill that supporters claim that it is.

"If KOSA were actually a privacy bill as its supporters claim, we would be all about it," Greer told Ars. "We support cracking down on tech companies' harvesting of data, we support an end to manipulative business practices like autoplay, infinite scroll, intrusive notifications, and algorithmic recommendations powered by commercial surveillance. What we don't support is a bill that gives state attorneys general the power to dictate what content younger people can see on social media. That's where KOSA goes off the rails and becomes a censorship bill, rather than a privacy bill."

Because KOSA enforcement falls to state attorneys general, many of whom are elected officials, Venzke told Ars that it's easier for the government to target and censor specific viewpoints that clash with their party politics.

Mullin agreed, saying that part of the reason why KOSA has so much bipartisan support is because both Democrats and Republicans are in favor of censoring opposing viewpoints and are "assuming the censorship will go their way."

"It's just totally crazy," Mullin told Ars. "People have very different views on how you can mitigate" harms like "eating disorders, addiction, bullying, sexual exploitation, drug use, alcohol use, gambling, tobacco use, and all predatory or deceptive marketing practices" by "controlling online speech."

"People don't agree about what's harmful on any of these issues," Mullin said. "These are challenging things to deal with, and families do it differently. And I don't think it's gonna be better when the government starts creating rules about it."

Venzke told Ars that "outside of very narrow exceptions," it's "not Congress's role to decide what is good speech and what is bad speech."

And he thinks that question probably shouldn't be up to platforms to answer, either. Venzke warned that KOSA would take decisions about kids' welfare out of the family's hands.

Instead, Venzke said that platforms would be required to "look at content that causes depression or anxiety"—"which is, of course, anything that's out there in the world"—and make decisions that could result in platforms dictating what speech flies online. Beyond cutting kids off from information, that could lead to wide-ranging censorship of the entire Internet, and TechFreedom's Cohn told Ars that's why he's found the bill's widespread support "baffling."

"By requiring platforms to protect us from 'harmful' ideas, it effectively deputizes major platforms to determine what speech is safe for us," Cohn told Ars. "If the government wanted to do that, the uproar would be rightfully furious. It is baffling that people who don't trust the platforms to begin with think it's any less terrible to outsource that work to them."

In addition, many young people are engaged with "politically challenging topics like LGBT rights, gun violence, and climate change," Venzke told Ars. If state attorneys general are "simply telling platforms to question these topics" as "the things that cause anxiety for young people," that is likely to chill young people's "opportunity to participate" in meaningful debates where they have recently been vocal advocates online.

Unsurprisingly, Techdirt reported last month that TikTok influencers had caught wind of how KOSA might be used to censor speech and had begun speaking out against KOSA online. The EFF has also been conducting outreach and urging young people to oppose the law. Mullin estimated that the EFF has helped tens of thousands of concerned citizens send letters to Congress urgently challenging KOSA. And when the EFF's efforts are "combined with allies and petitions from other sources," Mullin estimated that "well over a quarter of a million messages" opposing KOSA have reached lawmakers across the country.