[JCJ Statement] JCJ Secretariat strongly protests to Israel over the detention of Greta Thunberg and others
How are women changing networks, and networks changing women? Find out in a new episode of our podcast
June Monthly Economic Report
After networks: How can the future of the internet be imagined beyond social media platforms?
462nd Consumer Commission Plenary Meeting [held June 10]
On-site report: The fragility of the first-aid system at the Osaka-Kansai Expo
ESRI Discussion Paper No.402, "The Effect of Parental Gender Preferences on Educational Investment in Children"
JVN: Multiple vulnerabilities in SinoTrack IOT PC Platform
JVN: Information disclosure vulnerability in multiple Hitachi Energy products caused by observable discrepancies in responses to security-related processing
JVN: Out-of-bounds write vulnerability in MicroDicom DICOM Viewer
JVN: Exposure of sensitive system information to an unauthorized control sphere in Assured Telematics Fleet Management System
Report: 4th LaborNet fieldwork, "Tour of Bronze Statues of Villains at the University of Tokyo"
Security Alert: Alert regarding Microsoft security updates for June 2025 (published)
Security Alert: Alert regarding vulnerabilities in Adobe Acrobat and Reader (APSB25-57) (published)
JVN: Updates for Siemens products (May 2025)
JVN: Updates for Siemens products (April 2021)
Revise the Retrial Law in the current Diet session! Sit-in on the 12th!
Weekly Report: Council of Anti-Phishing Japan revises its "Phishing Countermeasure Guidelines"
Oppose STOP CSAM: Protecting Kids Shouldn’t Mean Breaking the Tools That Keep Us Safe
A Senate bill re-introduced this week threatens security and free speech on the internet. EFF urges Congress to reject the STOP CSAM Act of 2025 (S. 1829), which would undermine services offering end-to-end encryption and force internet companies to take down lawful user content.
Tell Congress Not to Outlaw Encrypted Apps
As in the version introduced last Congress, S. 1829 purports to limit the online spread of child sexual abuse material (CSAM), also known as child pornography. CSAM is already highly illegal. Existing law already requires online service providers who have actual knowledge of “apparent” CSAM on their platforms to report that content to the National Center for Missing and Exploited Children (NCMEC). NCMEC then forwards actionable reports to law enforcement agencies for investigation.
S. 1829 goes much further than current law and threatens to punish any service that works to keep its users secure, including those that do their best to eliminate and report CSAM. The bill applies to “interactive computer services,” which broadly includes private messaging and email apps, social media platforms, cloud storage providers, and many other internet intermediaries and online service providers.
The Bill Threatens End-to-End Encryption
The bill makes it a crime to intentionally “host or store child pornography” or knowingly “promote or facilitate” the sexual exploitation of children. The bill also opens the door for civil lawsuits against providers for the intentional, knowing, or even reckless “promotion or facilitation” of conduct relating to child exploitation, the “hosting or storing of child pornography,” or for “making child pornography available to any person.”
The terms “promote” and “facilitate” are broad, and civil liability may be imposed under a low recklessness state-of-mind standard. This means a court could find an app or website liable for hosting CSAM even if it did not know it was hosting CSAM, for instance because the provider employed end-to-end encryption and could not view the content its users uploaded.
Creating new criminal and civil claims against providers based on broad terms and low standards will undermine digital security for all internet users. Because the law already prohibits the distribution of CSAM, the bill’s broad terms could be interpreted as reaching more passive conduct, like merely providing an encrypted app.
Due to the nature of their services, encrypted communications providers who receive a notice of CSAM may be deemed to have “knowledge” under the criminal law even if they cannot verify and act on that notice. And there is little doubt that plaintiffs’ lawyers will (wrongly) argue that merely providing an encrypted service that can be used to store any image—not necessarily CSAM—recklessly facilitates the sharing of illegal content.
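The point that an end-to-end encrypted provider "cannot verify" a notice follows directly from how such services work: the decryption key exists only on users' devices, so the server stores ciphertext it has no way to inspect. The toy sketch below illustrates that property. It is NOT real cryptography (it uses a homemade HMAC-based stream cipher for brevity; a real service would use a vetted AEAD scheme), and all function names are hypothetical.

```python
# Toy sketch (NOT production crypto): why an end-to-end encrypted service
# cannot inspect the content it stores. All names here are hypothetical.
import hashlib
import hmac
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream with HMAC-SHA256 in counter mode.
    out = b""
    counter = 0
    while len(out) < length:
        block = nonce + counter.to_bytes(8, "big")
        out += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def client_encrypt(key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    # Runs on the user's device; the key never leaves it.
    nonce = secrets.token_bytes(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce, bytes(a ^ b for a, b in zip(plaintext, ks))

def client_decrypt(key: bytes, nonce: bytes, ciphertext: bytes) -> bytes:
    ks = _keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))

# The provider's server only ever sees (nonce, ciphertext): opaque bytes.
key = secrets.token_bytes(32)          # held by users, not the provider
nonce, stored_on_server = client_encrypt(key, b"private message")
assert client_decrypt(key, nonce, stored_on_server) == b"private message"
assert stored_on_server != b"private message"  # server cannot read it
```

Because the server holds only the ciphertext and never the key, there is nothing the provider can "verify" when a third party asserts that some stored blob is illegal content.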
Affirmative Defense Is Expensive and Insufficient
While the bill includes an affirmative defense that a provider can raise if it is “technologically impossible” to remove the CSAM without “compromising encryption,” it is not sufficient to protect our security. Online services that offer encryption shouldn’t have to face the impossible task of proving a negative in order to avoid lawsuits over content they can’t see or control.
First, by making this protection an affirmative defense, providers must still defend against litigation, with significant costs to their business. Not every platform will have the resources to fight these threats in court, especially newcomers that compete with entrenched giants like Meta and Google. Encrypted platforms should not have to rely on prosecutorial discretion or favorable court rulings after protracted litigation. Instead, specific exemptions for encrypted providers should be addressed in the text of the bill.
Second, although technologies like client-side scanning break encryption, members of Congress have misleadingly claimed otherwise. Plaintiffs are likely to argue that providers who do not use these techniques are acting recklessly, leading many apps and websites to scan all of the content on their platforms and remove any content that a state court could find, even wrongfully, is CSAM.
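The reason client-side scanning is widely described as breaking encryption is structural: the scan must run on the plaintext, on the user's device, before encryption ever happens, so the content is inspected outside the end-to-end guarantee. A minimal, hypothetical sketch of the hash-blocklist variant:

```python
# Hypothetical sketch of hash-based client-side scanning. The blocklist
# contents and function names are illustrative, not from any real system.
import hashlib

# A set of hashes of known prohibited files, pushed to the client.
BLOCKLIST = {hashlib.sha256(b"known-bad-file-bytes").hexdigest()}

def client_side_scan(plaintext: bytes) -> bool:
    # Crucially, this runs on the UNENCRYPTED content, on the user's
    # device, before any encryption step, which is why critics say it
    # defeats the point of end-to-end encryption.
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

assert client_side_scan(b"known-bad-file-bytes") is True
assert client_side_scan(b"lawful private photo") is False
```

Exact-hash matching like this also fails against trivially modified files, which is one reason proposals drift toward fuzzier (and more error-prone) perceptual matching that can flag lawful content.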
The Bill Threatens Free Speech by Creating a New Exception to Section 230
The bill allows a new type of lawsuit to be filed against internet platforms, accusing them of “facilitating” child sexual exploitation based on the speech of others. It does this by creating an exception to Section 230, the foundational law of the internet and online speech. Section 230 provides partial immunity to internet intermediaries when sued over content posted by their users. Without that protection, platforms are much more likely to aggressively monitor and censor users.
Section 230 creates the legal breathing room for internet intermediaries to create online spaces for people to freely communicate around the world, with low barriers to entry. However, creating a new exception that exposes providers to more lawsuits will cause them to limit that legal exposure. Online services will censor more and more user content and accounts, with minimal regard as to whether that content is in fact legal. Some platforms may even be forced to shut down or may not even get off the ground in the first place, for fear of being swept up in a flood of litigation and claims around alleged CSAM. On balance, this harms all internet users who rely on intermediaries to connect with their communities and the world at large.