[79th Atomic Bomb Memorial Day] Political Pressure on Nagasaki's Peace Memorial Ceremony, by Tatsuo Sekiguchi (former Nagasaki Broadcasting reporter)

3 months ago
The Nagasaki Peace Memorial Ceremony on August 9 is a solemn rite mourning those who died in the atomic bombing. This year, in an unprecedented turn, major Western nations put political pressure on the ceremony, drawing protests from hibakusha (atomic bomb survivors). Unhappy that the city of Nagasaki had not invited Israel, which continues its attacks on Gaza, the G7 members other than Japan (the United States, the United Kingdom, France, Germany, Italy, and Canada) and the EU (European Union) declined to have their ambassadors to Japan attend. In Gaza, roughly 40,000 people, including children and the elderly, have been killed, and protests against Israel continue around the world. In light of this situation…
JCJ

Senate Vote Could Give Helping Hand To Patent Trolls

3 months ago

Update 9/26/24: The hearing and scheduled committee vote on PERA and PREVAIL was canceled. Supporters can continue to register their opposition via our action, as these bills may still be scheduled for a vote later in 2024. 

Update 9/20/24: The Senate vote scheduled for Thursday, Sep. 19 has been rescheduled for Thursday, Sep. 26. 

A patent on crowdfunding. A patent on tracking packages. A patent on photo contests. A patent on watching an ad online. A patent on computer bingo. A patent on upselling.

These are just a few of the patents used to harass software developers and small companies in recent years. Fortunately, they were tossed out by U.S. courts thanks to the landmark 2014 Supreme Court decision in Alice v. CLS Bank. The Alice ruling has effectively ended hundreds of lawsuits where defendants were improperly sued for basic computer use.

Take Action

Tell Congress: No New Bills For Patent Trolls

Now, patent trolls and a few huge corporate patent-holders are upset about losing their bogus patents. They are lobbying Congress to change the rules and reverse the Alice decision entirely. Shockingly, they’ve convinced the Senate Judiciary Committee to vote this Thursday on two of the most damaging patent bills we’ve ever seen.

The Patent Eligibility Restoration Act (PERA, S. 2140) would overturn Alice, enabling patent trolls to extort small business owners and even hobbyists, just for using common software systems to express themselves or run their businesses. PERA would also overturn a 2013 Supreme Court case that prevents most kinds of patenting of human genes.

Meanwhile, the PREVAIL Act (S. 2220) seeks to severely limit how the public can challenge bad patents at the patent office. Challenges like these are one of the most effective ways to throw out patents that never should have been granted in the first place. 

This week, we need to show Congress that everyday users and creators won’t stand for laws that actually expand avenues for patent abuse.

The U.S. Senate must not pass new legislation to allow the worst patent scams to expand and flourish. 

Take Action

Tell Congress: No New Bills For Patent Trolls

Joe Mullin

Unveiling Repression in Venezuela: A Legacy of Surveillance and State Control

3 months ago

This post was written by Laura Vidal (PhD), an independent researcher in learning and digital rights.

This is part two of a series. Part one on surveillance and control around the July election is here.

Over the past decade, the government in Venezuela has meticulously constructed a framework of surveillance and repression, which has been repeatedly denounced by civil society and digital rights defenders in the country. This apparatus is built on a foundation of restricted access to information, censorship, harassment of journalists, and the closure of media outlets. The systematic use of surveillance technologies has created an intricate network of control.

Security forces have increasingly relied on digital tools to monitor citizens, frequently stopping people to check the content of their phones and detaining those whose devices contain anti-government material. The country’s digital identification systems, Carnet de la Patria and Sistema Patria—established in 2016 and linked to social welfare programs—have also been weaponized against the population by linking access to essential services with affiliation to the governing party. 

Censorship and internet filtering in Venezuela became omnipresent ahead of the recent election period. The government blocked access to media outlets, human rights organizations, and even VPNs—restricting access to critical information. Social media platforms like X (formerly Twitter) and WhatsApp were also targeted—and are expected to be regulated—with the government accusing these platforms of aiding opposition forces in organizing a “fascist coup d’état” and spreading “hate” while promoting a “civil war.”

The blocking of these platforms not only limits free expression but also serves to isolate Venezuelans from the global community and their networks in the diaspora, a community of around 9 million people. The government's rhetoric, which labels dissent as "cyberfascism" or "terrorism," is part of a broader narrative that seeks to justify these repressive measures while maintaining a constant threat of censorship, further stifling dissent.

Moreover, there is a growing concern that the government’s strategy could escalate to broader shutdowns of social media and communication platforms if street protests become harder to control, highlighting the lengths to which the regime is willing to go to maintain its grip on power.

Fear is another powerful tool that enhances the effectiveness of government control. Actions like mass arrests, often streamed online, and the public display of detainees create a chilling effect that silences dissent and fractures the social fabric. Economic coercion, combined with pervasive surveillance, fosters distrust and isolation—breaking down the networks of communication and trust that help Venezuelans access information and organize.

This deliberate strategy aims not just to suppress opposition but to dismantle the very connections that enable citizens to share information and mobilize for protests. The resulting fear, compounded by the difficulty in perceiving the full extent of digital repression, deepens self-censorship and isolation. This makes it harder to defend human rights and gain international support against the government's authoritarian practices.

Civil Society’s Response

Despite the repressive environment, civil society in Venezuela continues to resist. Initiatives like Noticias Sin Filtro and El Bus TV have emerged as creative ways to bypass censorship and keep the public informed. These efforts, alongside educational campaigns on digital security and the innovative use of artificial intelligence to spread verified information, demonstrate the resilience of Venezuelans in the face of authoritarianism. However, the challenges remain extensive.

The Inter-American Commission on Human Rights (IACHR) and its Special Rapporteur for Freedom of Expression (SRFOE) have condemned the institutional violence occurring in Venezuela, characterizing it as state terrorism. To grasp the full scope of this crisis, it is essential to understand that this repression is not a series of isolated actions but a comprehensive, systematic effort that has been building for over 15 years. It combines degraded infrastructure (essential services kept barely functional), the blocking of independent media, pervasive surveillance, fear-mongering, isolation, and legislative strategies designed to close civic space. With the recent approval of a law aimed at severely restricting the work of non-governmental organizations, civic space in Venezuela faces its greatest challenge yet.

The fact that this repression occurs amid widespread human rights violations suggests that the government's next steps may involve an even harsher crackdown. The digital arm of government propaganda reaches far beyond Venezuela’s borders, attempting to silence voices abroad and isolate the country from the global community. 

The situation in Venezuela is dire, and the use of technology to facilitate political violence represents a significant threat to human rights and democratic norms. As the government continues to tighten its grip, the international community must speak out against these abuses and support efforts to protect digital rights and freedoms. The Venezuelan case is not just a national issue but a global one, illustrating the dangers of unchecked state power in the digital age.

However, this case also serves as a critical learning opportunity for the global community. It highlights the risks of digital authoritarianism and the ways in which governments can influence and reinforce each other's repressive strategies. At the same time, it underscores the importance of an organized and resilient civil society—in spite of so many challenges—as well as the power of a network of engaged actors both inside and outside the country. 

These collective efforts offer opportunities to resist oppression, share knowledge, and build solidarity across borders. The lessons learned from Venezuela should inform global strategies to safeguard human rights and counter the spread of authoritarian practices in the digital era.

An open letter, organized by a group of Venezuelan digital and human rights defenders, calling for an end to technology-enabled political violence in Venezuela, has been published by Access Now and remains open for signatures.

Guest Author

The New U.S. House Version of KOSA Doesn’t Fix Its Biggest Problems

3 months ago

An amended version of the Kids Online Safety Act (KOSA) being considered this week in the U.S. House is still a dangerous online censorship bill that contains many of the same fundamental problems as the version the Senate passed in July. The changes to the House bill do not alter the fact that KOSA will coerce the largest social media platforms into blocking or filtering a variety of entirely legal content, and subject a large portion of users to privacy-invasive age verification. They do bring KOSA closer to becoming law, and put us one step closer to giving government officials dangerous and unconstitutional power over what types of content can be shared and read online.

TAKE ACTION

TELL CONGRESS: OPPOSE THE KIDS ONLINE SAFETY ACT

Reframing the Duty of Care Does Not Change Its Dangerous Outcomes

For years now, digital rights groups, LGBTQ+ organizations, and many others have been critical of KOSA’s “duty of care.” While the language has been modified slightly, this version of KOSA still creates a duty of care and a negligence standard of liability that will allow the Federal Trade Commission to sue apps and websites that don’t take measures to “prevent and mitigate” various harms to minors, defined vaguely enough to chill a significant amount of protected speech.

The biggest shift to the duty of care is in the description of the harms that platforms must prevent and mitigate. Among other harms, the previous version of KOSA included anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors, “consistent with evidence-informed medical information.” The new version drops this section and replaces it with the “promotion of inherently dangerous acts that are likely to cause serious bodily harm, serious emotional disturbance, or death.” The bill defines “serious emotional disturbance” as “the presence of a diagnosable mental, behavioral, or emotional disorder in the past year, which resulted in functional impairment that substantially interferes with or limits the minor’s role or functioning in family, school, or community activities.”

Despite the new language, this provision is still broad and vague enough that no platform will have any clear indication of what it must do regarding any given piece of content. Its updated list of harms could still encompass a huge swathe of entirely legal (and helpful) content about everything from abortion access and gender-affirming care to drug use, school shootings, and tackle football. It is still likely to exacerbate the risks of children being harmed online because it will place barriers on their ability to access lawful speech—and important resources—about topics like addiction, eating disorders, and bullying. And it will stifle minors who are trying to find their own supportive communities online.

Kids will, of course, still be able to find harmful content, but the largest platforms—where the most kids are—will face increased liability for letting any discussion about these topics occur. It will be harder for suicide prevention messages to reach kids experiencing acute crises, harder for young people to find sexual health information and gender identity support, and generally, harder for adults who don’t want to risk the privacy and security invasions of age verification technology to access that content as well.

As in the previous version, enforcement of KOSA is left up to the FTC and, to some extent, state attorneys general around the country. Whether or not you agree with them on what encompasses a “diagnosable mental, behavioral, or emotional disorder,” the fact remains that KOSA’s flaws are as much about the threat of liability as about actual enforcement. As long as these definitions remain vague enough that platforms have no clear guidance on what is likely to cross the line, there will be censorship—even if officials never actually take action.

The previous House version of the bill stated that “A high impact online company shall exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate the following harms to minors.” The new version slightly modifies this to say that such a company “shall create and implement its design features to reasonably prevent and mitigate the following harms to minors.” These language changes are superficial; this section still imposes a standard that requires platforms to filter user-generated content and imposes liability if they fail to do so “reasonably.”

House KOSA Edges Closer to Harmony with Senate Version 

Some of the latest amendments to the House version of KOSA bring it closer in line with the Senate version, which passed a few months ago (not that this improves the bill).

This version of KOSA lowers the bar, set by the previous House version, that determines which companies would be impacted by KOSA’s duty of care. While the Senate version of KOSA does not have such a limitation (and would affect small and large companies alike), the previous House version created a series of tiers for differently-sized companies. This version has the same set of tiers, but lowers the highest bar from companies earning $2.5 billion in annual revenue, or having 150 million annual users, to companies earning $1 billion in annual revenue, or having 100 million annual users.

This House version also includes the “filter bubble” portion of KOSA which was added to the Senate version a year ago. This requires any “public-facing website, online service, online application, or mobile application that predominantly provides a community forum for user-generated content” to provide users with an algorithm that uses a limited set of information, such as search terms and geolocation, but not search history (for example). This section of KOSA is meant to push users towards a chronological feed. As we’ve said before, there’s nothing wrong with online information being presented chronologically for those who want it. But just as we wouldn’t let politicians rearrange a newspaper in a particular order, we shouldn’t let them rearrange blogs or other websites. It’s a heavy-handed move to stifle the editorial independence of web publishers.   
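To make the contrast concrete, here is a minimal, hypothetical sketch in Python (not drawn from the bill's text or any platform's actual code) of the difference between an engagement-ranked feed and the chronological ordering this provision is meant to favor; the Post fields and the engagement score are illustrative assumptions.

from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime
    engagement_score: float  # hypothetical score built from likes, shares, watch time, etc.

def personalized_feed(posts: List[Post]) -> List[Post]:
    # Typical engagement ranking: the platform's model orders posts,
    # often drawing on behavioral signals such as watch or search history.
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def chronological_feed(posts: List[Post]) -> List[Post]:
    # The ordering this section pushes toward: newest first,
    # with no behavioral profile of the user involved.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)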

Lastly, the House authors have added language that would have no actual effect on how platforms or courts interpret the law, but which does point directly to the concerns we’ve raised. It states that “a government entity may not enforce this title or a regulation promulgated under this title based upon a specific viewpoint of any speech, expression, or information protected by the First Amendment to the Constitution that may be made available to a user as a result of the operation of a design feature.” Yet KOSA does just that: the FTC will have the power to force platforms to moderate or block certain types of content based entirely on the views described therein.

TAKE ACTION

TELL CONGRESS: OPPOSE THE KIDS ONLINE SAFETY ACT

KOSA Remains an Unconstitutional Censorship Bill 

KOSA remains woefully underinclusive—for example, Google's search results will not be impacted regardless of what they show young people, but Instagram is on the hook for a broad amount of content—while making it harder for young people in distress to find emotional, mental, and sexual health support. This version does only one important thing—it moves KOSA closer to passing in both houses of Congress, and puts us one step closer to enacting an online censorship regime that will hurt free speech and privacy for everyone.

Jason Kelley

KOSA’s Online Censorship Threatens Abortion Access

3 months ago

For those living in one of the 22 states where abortion is banned or heavily restricted, the internet can be a lifeline. It has essential information on where and how to access care, links to abortion funds, and guidance on ways to navigate potential legal risks. Activists use the internet to organize and build community, and reproductive healthcare organizations rely on it to provide valuable information and connect with people in need.

But both Republicans and Democrats in Congress are now actively pushing for federal legislation that could cut youth off from these vital healthcare resources and stifle online abortion information for adults and kids alike.

This summer, the U.S. Senate passed the Kids Online Safety Act (KOSA), a bill that would grant the federal government and state attorneys general the power to restrict online speech they find objectionable in a misguided and ineffective attempt to protect kids online. A number of organizations have already sounded the alarm on KOSA’s danger to online LGBTQ+ content, but the hazards of the bill don’t stop there.

KOSA puts abortion seekers at risk. It could easily lead to censorship of vital and potentially life-saving information about sexual and reproductive healthcare. And by age-gating the internet, it could result in websites requiring users to submit identification, undermining the ability to remain anonymous while searching for abortion information online.

TAKE ACTION

TELL CONGRESS: OPPOSE THE KIDS ONLINE SAFETY ACT

Abortion Information Censored

As EFF has repeatedly warned, KOSA will stifle online speech. It gives government officials the dangerous and unconstitutional power to decide what types of content can be shared and read online. Under one of its key censorship provisions, KOSA would create what the bill calls a “duty of care.” This provision would require websites, apps, and online platforms to comply with a vague and overbroad mandate to prevent and mitigate “harm to minors” in all their “design features.”

KOSA contains a long list of harms that websites have a duty to protect against, including emotional disturbance, acts that lead to bodily harm, and online harassment, among others. The list of harms is open for interpretation. And many of the harms are so subjective that government officials could claim any number of issues fit the bill.

This opens the door for political weaponization of KOSA—including by anti-abortion officials. KOSA is ambiguous enough to allow officials to easily argue that its mandate includes sexual and reproductive healthcare information. They could, for example, claim that abortion information causes emotional disturbance or death, or could lead to “sexual exploitation and abuse.” This is especially concerning given the anti-abortion movement’s long history of justifying abortion restrictions by claiming that abortions cause mental health issues, including depression and self-harm (despite credible research to the contrary).

As a result, websites could be forced to filter and block such content for minors, despite the fact that minors can get pregnant and are part of the demographic most likely to get their news and information from social media platforms. By blocking this information, KOSA could cut off young people’s access to potentially life-saving sexual and reproductive health resources. So much for protecting kids.

KOSA’s expansive and vague censorship requirements will also affect adults. To avoid liability and the cost and hassle of litigation, websites and platforms are likely to over-censor potentially covered content, even if that content is otherwise legal. This could lead to the removal of important reproductive health information for all internet users, adults included.

A Tool For Anti-Choice Officials

It’s important to remember that KOSA’s “duty of care” provision would be defined and enforced by the presidential administration in charge, including any future administration that is hostile to reproductive rights. The bill grants the Federal Trade Commission, majority-controlled by the President’s party, the power to develop guidelines and to investigate or sue any websites or platforms that don’t comply. It also grants the Executive Branch the power to form a Kids Online Safety Council to further identify “emerging or current risks of harms to minors associated with online platforms.”

Meanwhile, KOSA gives state attorneys general, including those in abortion-restrictive states, the power to sue under its other provisions, many of which intersect with the “duty of care.” As EFF has argued, this gives state officials a back door to target and censor content they don’t like, including abortion information.

It’s also directly foreseeable that anti-abortion officials would use KOSA in this way. One of the bill’s co-sponsors, Senator Marsha Blackburn (R-TN), has touted KOSA as a way to censor online content on social issues, claiming that children are being “indoctrinated” online. The Heritage Foundation, a politically powerful organization that espouses anti-choice views, also has its eyes on KOSA. It has been lobbying lawmakers to pass the bill and suggesting that a future administration could fill the Kids Online Safety Council with “representatives who share pro-life values.”

This all comes at a time when efforts to censor abortion information online are at a fever pitch. In abortion-restrictive states, officials have already been eagerly attempting to erase abortion from the internet. Lawmakers in both South Carolina and Texas have introduced bills to censor online abortion information, though neither effort has succeeded so far. The National Right to Life Committee has also created a model abortion law aimed at restricting abortion rights in a variety of ways, including digital access to information.

KOSA Hurts Anonymity Online

KOSA will also push large and important parts of the internet behind age gates. In order to determine which users are minors, online services will likely impose age verification systems, which require everyone—both adults and minors—to verify their age by providing identifying information, oftentimes including government-issued ID or other personal records.

This is deeply problematic for maintaining access to reproductive care. Age verification undermines our First Amendment right to remain anonymous online by requiring users to confirm their identity before accessing webpages and information. It would chill users who do not wish to share their identity from accessing or sharing online abortion resources, and put others’ identities at increased risk of exposure.

In a post-Roe United States, in which states are increasingly banning, restricting, and prosecuting abortions, the ability to anonymously seek and share abortion information online is more important than ever. For people living in abortion-restrictive states, searching and sharing abortion information online can put you at risk. There have been multiple instances of law enforcement agencies using digital evidence, including internet history, in abortion-related criminal cases. We’ve also seen an increase in online harassment and doxxing of healthcare professionals, even in more abortion-protective states.

Because of this, many organizations, including EFF, have tried to help people take steps to protect privacy and anonymity online. KOSA would undercut those efforts. While it’s true that our online ecosystem is already rich with private surveillance, age verification adds another layer of mass data collection. Online ID checks require adults to upload data-rich, government-issued identifying documents to either the website or a third-party verifier, creating a potentially lasting record of their visit to the website.

For abortion seekers taking steps to protect their anonymity and avoid this pervasive surveillance, this would make things all the more difficult. Using a public computer or creating anonymous profiles on social networks won’t keep you safe if you have to upload ID to access the information you need.

TAKE ACTION

TELL CONGRESS: OPPOSE THE KIDS ONLINE SAFETY ACT

We Can Still Stop KOSA From Passing

KOSA has not yet passed the House, so there’s still time to stop it. But the Senate vote means that the House could bring it up for a vote at any time, and the House has introduced its own similarly flawed version of KOSA. If we want to protect access to abortion information online, we must organize now to stop KOSA from passing.

Lisa Femia