Tell Congress: KOSA Will Censor the Internet But Won't Help Kids

17 hours 42 minutes ago

The Kids Online Safety Act (KOSA) would censor the internet and make government officials the arbiters of what young people can see online. It would likely lead to age-verification mandates, handing more power, and more private data, to third-party identity verification companies like Clear or ID.me.

The government should not have the power to decide which topics are "safe" online for young people, nor to force services to remove and block access to anything that might be considered unsafe for children. This isn’t safety—it’s censorship.

Electronic Frontier Foundation

[JCJ Hiroshima Branch] Study and Exchange Meeting on the Militarization of the Former Nippon Steel Kure Site: Participant Remarks, Part 1 = Editorial Department

19 hours 5 minutes ago
"Never let the military revive," says Yoshiro Mori. The Ministry of Defense plans to purchase outright the roughly 130-hectare former Nippon Steel Kure site, closed at the end of last September, and build a "multifunctional complex defense hub" there. Many in Kure's city council and business community welcome the plan as "bright news for a stagnating Kure." Nippon Steel has endorsed the ministry's intentions as "consistent with company policy" and has announced it will not take part in three-way talks with the prefecture and the city. Yet not a few voices, recalling wartime Kure, express unease. The "Group Considering the Former Nippon Steel Kure Site Issue," launched on April 7, responding to citizens' concerns, said "for the children's..
JCJ

[B] "Evils That Flourish Amid the Chaos" [Latest News from Western Sahara] by Itsuko Hirata

20 hours 59 minutes ago
Will "maybe Trump" become "really Trump"?! Many people would be in trouble if Trump returned to the American presidency. How about you? Gaza war criminal Netanyahu was scheduled to meet U.S. President Biden on the 23rd, address a joint session of both houses of Congress on the 24th, and meet former President Trump on the 26th. But President Biden suddenly announced his withdrawal from the presidential race, throwing those plans into disarray. In France, ahead of the Paris Olympics, dancers and airport unions have announced strikes. The world is in the middle of a run of chaos!
日刊ベリタ

Digital Apartheid in Gaza: Unjust Content Moderation at the Request of Israel’s Cyber Unit

21 hours 14 minutes ago

This is part one of an ongoing series. 

Government involvement in content moderation raises serious human rights concerns in every context. Since October 7, social media platforms have been criticized for unjustified takedowns of pro-Palestinian content—sometimes at the request of the Israeli government—and for a simultaneous failure to remove hate speech directed at Palestinians. More specifically, social media platforms have worked with the Israeli Cyber Unit—a government office set up to issue takedown requests to platforms—to remove content considered incitement to violence and terrorism, as well as any promotion of groups widely designated as terrorist organizations.

Many of these relationships predate the current conflict but have proliferated in the period since. Between October 7 and November 14, Israeli authorities sent a total of 9,500 takedown requests to social media platforms; roughly 60 percent went to Meta, which had a reported 94 percent compliance rate.

This is not new. The Cyber Unit has long boasted that its takedown requests result in high compliance rates of up to 90 percent across all social media platforms. They have unfairly targeted Palestinian rights activists, news organizations, and civil society; one such incident prompted Meta’s Oversight Board to recommend that the company “Formalize a transparent process on how it receives and responds to all government requests for content removal, and ensure that they are included in transparency reporting.”

When a platform edits its content at the behest of government agencies, it can leave the platform inherently biased in favor of that government’s favored positions. That cooperation gives government agencies outsized influence over content moderation systems for their own political goals—to control public dialogue, suppress dissent, silence political opponents, or blunt social movements. And once such systems are established, it is easy for the government to use the systems to coerce and pressure platforms to moderate speech they may not otherwise have chosen to moderate.

Alongside government takedown requests, free expression in Gaza has been further restricted by platforms unjustly removing pro-Palestinian content and accounts—interfering with the dissemination of news and silencing voices expressing concern for Palestinians. At the same time, X has been criticized for failing to remove hate speech and has disabled features that allow users to report certain types of misinformation. TikTok has implemented lackluster strategies to monitor the nature of content on its services. Meta has admitted to suppressing certain comments containing the Palestinian flag in certain “offensive contexts” that violate its rules.

To combat these consequential harms to free expression in Gaza, EFF urges platforms to follow the Santa Clara Principles on Transparency and Accountability in Content Moderation and undertake the following actions:

  1. Bring local and regional stakeholders into the policymaking process to provide greater cultural competence—knowledge and understanding of local language, culture, and contexts—throughout the content moderation system.
  2. Urgently recognize the particular risks to users’ rights that result from state involvement in content moderation processes.
  3. Ensure that state actors do not exploit or manipulate companies’ content moderation systems to censor dissenters, political opponents, social movements, or any person.
  4. Notify users when, how, and why their content has been actioned, and give them the opportunity to appeal.
Everyone Must Have a Seat at the Table

Given the significant evidence of ongoing human rights violations against Palestinians, both before and since October 7, U.S. tech companies have a serious ethical obligation to verify—to themselves, their employees, the American public, and Palestinians—that they are not directly contributing to these abuses. Palestinians must have a seat at the table, just as Israelis do, when it comes to moderating speech in the region, most importantly their own. Anything less risks contributing to a form of digital apartheid.

An Ongoing Issue

This isn’t the first time EFF has raised concerns about censorship in Palestine, including in multiple international forums. Most recently, we wrote to the UN Special Rapporteur on Freedom of Expression expressing concern about the disproportionate impact that restrictions imposed by governments and companies have on expression. In May, we submitted comments to the Oversight Board urging that moderation decisions on the rallying cry “From the river to the sea” be made on an individualized basis rather than through a blanket ban. Along with international and regional allies, EFF also asked Meta to overhaul its content moderation practices and policies that restrict content about Palestine, and issued a set of recommendations for the company to implement.

And back in April 2023, EFF and ECNL submitted comments to the Oversight Board addressing Meta’s over-moderation of the word “shaheed” and other Arabic-language content, particularly through the use of automated content moderation tools. In its response, the Oversight Board found that Meta’s approach disproportionately restricts free expression and is unnecessary, and recommended that the company end its blanket ban on content using the word “shaheed”.

Paige Collings