Results of the public comment period on the proposed revision of the "Basic Policy on Special Measures Concerning the Suspension of AM Station Operations," and revision of that basic policy
Results of the public comment period on the draft report of the New Generation Mobile Communications System Committee
Publication of the Frequency Reallocation Action Plan (FY2024 edition)
Notice of the 18th meeting of the Working Group on User Information
Approval of revisions to the Japan Broadcasting Corporation's (NHK) implementation standards for internet utilization services
Approval of NHK's investment in a subsidiary providing core broadcasting station facilities
Notice of the 60th meeting of the Working Group on Review of Competition Rules
Certification of plans to establish specified base stations for the deployment of 5G mobile communications systems in the 4.9 GHz band
Mobile Interconnection Charge Cost Allocation Working Group (5th meeting)
Notice of the first FY2024 meeting of the Telecommunications Law Subcommittee, Study Group on Information and Communications Law
Information and Communications Council, ICT Subcommittee, Technology Strategy Committee (52nd meeting)
Summary of Minister for Internal Affairs and Communications Murakami's post-Cabinet-meeting press conference
4th meeting of the Digital Subcommittee
Handouts from the 3rd meeting of the Study Group on Realizing Two-Way Portability of Fixed-Line Telephone Numbers
X's Last-Minute Update to the Kids Online Safety Act Still Fails to Protect Kids—or Adults—Online
Late last week, the Senate released yet another version of the Kids Online Safety Act, reportedly written with the assistance of X CEO Linda Yaccarino in a flawed attempt to address the critical free speech issues inherent in the bill. This last-minute draft remains, at its core, an unconstitutional censorship bill that threatens the online speech and privacy rights of all internet users.
TELL CONGRESS: VOTE NO ON KOSA
Update Fails to Protect Users from Censorship or Platforms from Liability
The most important update, according to its authors, supposedly minimizes the impact of the bill on free speech. As we’ve said before, KOSA’s “duty of care” section is its biggest problem, as it would force a broad swath of online services to make policy changes based on the content of online speech. Though the bill’s authors inaccurately claim that KOSA regulates only the design of platforms, not speech, the harms it enumerates—eating disorders, substance use disorders, and suicidal behaviors, for example—are not caused by the design of a platform.
KOSA is likely to actually increase the risks to children, because it will prevent them from accessing online resources about topics like addiction, eating disorders, and bullying. It will result in services imposing age verification requirements and content restrictions, and it will stifle minors' ability to find or access their own supportive communities online. For these reasons, we’ve been critical of KOSA since it was introduced in 2022.
This updated bill adds just one sentence to the “duty of care” requirement: “Nothing in this section shall be construed to allow a government entity to enforce subsection a [the duty of care] based upon the viewpoint of users expressed by or through any speech, expression, or information protected by the First Amendment to the Constitution of the United States.” But the viewpoint of users was never impacted by KOSA’s duty of care in the first place. The duty of care is a duty imposed on platforms, not users. Platforms must mitigate the harms listed in the bill, not users, and the platform’s ability to share users’ views is what’s at risk—not the ability of users to express those views. Adding that the bill doesn’t impose liability based on user expression doesn’t change how the bill would be interpreted or enforced. The FTC could still hold a platform liable for the speech it contains.
Let’s say, for example, that a covered platform like reddit hosts a forum created and maintained by users for discussion of overcoming eating disorders. Even though the speech contained in that forum is entirely legal, often helpful, and possibly even life-saving, the FTC could still hold reddit liable for violating the duty of care by allowing young people to view it. The same could be true of a Facebook group about LGBTQ issues, or of a post about drug use that X showed a user through its algorithm. If a platform’s defense were that this information is protected expression, the FTC could simply say that it isn’t enforcing the law based on the expression of any individual viewpoint, but based on the fact that the platform allowed a design feature—a subreddit, Facebook group, or algorithm—to distribute that expression to minors. The new sentence is thus a superfluous carveout: it shields user speech that KOSA never penalized in the first place, while leaving platforms liable for distributing that same speech.
It’s particularly disappointing that those in charge of X—likely a covered platform under the law—had any role in writing this language, as the authors have failed to grasp the world of difference between immunizing individual expression and protecting their own platform from the liability that KOSA would place on it.
Compulsive Usage Doesn’t Narrow KOSA’s Scope
Another of KOSA’s issues has been its vague list of harms, which remains broad enough that platforms have no clear guidance on what is likely to cross the line. This update requires that the harms of “depressive disorders and anxiety disorders” have “objectively verifiable and clinically diagnosable symptoms that are related to compulsive usage.” The latest text’s definition of compulsive usage, however, is equally vague: “a persistent and repetitive use of a covered platform that significantly impacts one or more major life activities, including socializing, sleeping, eating, learning, reading, concentrating, communicating, or working.” This doesn’t narrow the scope of the bill.
It should be noted that there is no clinical definition of “compulsive usage” of online services. As in past versions of KOSA, the update cobbles together a definition that sounds just medical, or just legal, enough to appear legitimate—when in fact the definition is devoid of specific legal meaning, and dangerously vague to boot.
How could the persistent use of social media not significantly impact the way someone socializes or communicates? The bill doesn’t even require that the impact be a negative one. Comments on an Instagram photo from a potential partner may make it hard to sleep for several nights in a row; a lengthy new YouTube video may impact someone’s workday. Opening a Snapchat account might significantly impact how a teenager keeps in touch with her friends, but that doesn’t mean her preference for that over text messages is “compulsive” and therefore necessarily harmful.
Nonetheless, an FTC weaponizing KOSA could still hold platforms liable for showing minors content that it believes results in depression or anxiety, so long as it can claim the anxiety or depression disrupted someone’s sleep, or even just changed how someone socializes or communicates. These so-called “harms” could still encompass a huge swathe of entirely legal (and helpful) content about everything from abortion access and gender-affirming care to drug use, school shootings, and tackle football.
Dangerous Censorship Bills Do Not Belong in Must-Pass Legislation
The latest KOSA draft comes as the incoming nominee for FTC Chair, Andrew Ferguson—who would be empowered to enforce the law, if passed—has reportedly vowed to protect free speech by “fighting back against the trans agenda,” among other things. As we’ve said for years (and about every version of the bill), KOSA would give the FTC under this or any future administration wide latitude to decide what sort of content platforms must prevent young people from seeing. Merely passing KOSA would likely result in platforms taking down protected speech and implementing age verification requirements, even if the law is never enforced; the FTC could simply announce the types of content it believes harm children and use the mere threat of enforcement to force platforms to comply.
No representative should consider shoehorning this controversial and unconstitutional bill into a continuing resolution. A law that forces platforms to censor truthful online content has no place in a last-minute funding bill.
TELL CONGRESS: VOTE NO ON KOSA