The 212th Statistics Commission
Working Group on the Review of Competition Rules (60th Meeting)
Study Group on Realizing a Safe and Secure Metaverse (10th Meeting)
Information and Communications Council, Telecommunications Business Policy Committee, Universal Service Policy Committee (38th Meeting)
Saving the Internet in Europe: Defending Free Expression
This post is part two in a series of posts about EFF’s work in Europe. Read about how and why we work in Europe here.
EFF’s mission is to ensure that technology supports freedom, justice, and innovation for all people of the world. While our work has taken us to far corners of the globe, in recent years we have worked to expand our efforts in Europe, building up a policy team with key expertise in the region, and bringing our experience in advocacy and technology to the European fight for digital rights.
In this blog post series, we will introduce you to the various players involved in that fight, share how we work in Europe, and how what happens in Europe can affect digital rights across the globe.
EFF’s approach to free speech
The global spread of Internet access and digital services promised a new era of freedom of expression, in which everyone could share and access information, speak out and find an audience without relying on gatekeepers, and make, tinker with, and share creative works.
Everyone should have the right to express themselves and share ideas freely. Various European countries have experienced totalitarian regimes and extensive censorship in the past century, and as a result, many Europeans still place special emphasis on privacy and freedom of expression. These values are enshrined in the European Convention on Human Rights and the Charter of Fundamental Rights of the European Union – essential legal frameworks for the protection of fundamental rights.
Today, as so much of our speech is facilitated by online platforms, there is an expectation that they, too, respect fundamental rights. Through their terms of service, community guidelines, or house rules, platforms get to unilaterally define what speech is permissible on their services. The enforcement of these rules can be arbitrary, opaque, and selective, resulting in the suppression of contentious ideas and minority voices.
That’s why EFF has been fighting government threats to free expression while also working to hold tech companies accountable for grounding their content moderation practices in robust human rights frameworks. That entails setting out clear rules and standards for internal processes, such as notifications and explanations to users when terms of service are enforced or changed. In the European Union, we have worked for decades to ensure that laws governing online platforms respect fundamental rights, advocated against censorship, and spoken up on behalf of human rights defenders.
What’s the Digital Services Act and why do we keep talking about it?
For the past few years, we have been especially busy addressing human rights concerns in the drafting and implementation of the Digital Services Act (DSA), the new law setting out the rules for online services in the European Union. The DSA covers most online services, ranging from online marketplaces like Amazon and search engines like Google to social networks like Meta and app stores. However, not all of its rules apply to all services; instead, the DSA follows a risk-based approach that puts the most obligations on the largest services with the highest impact on users.
All service providers must ensure that their terms of service respect fundamental rights, that users can get in touch with them easily, and that they report on their content moderation activities. Additional rules apply to online platforms: they must give users detailed information about content moderation decisions and the right to appeal, and they face further transparency obligations. They also have to provide some basic transparency into the functioning of their recommender systems and are not allowed to target underage users with personalized ads.
The most stringent obligations apply to the largest online platforms and search engines, those with more than 45 million users in the EU. These companies, which include X, TikTok, Amazon, Google Search and Play, YouTube, and several porn platforms, must proactively assess and mitigate systemic risks related to the design, functioning, and use of their services. These include risks to the exercise of fundamental rights, elections, public safety, civic discourse, the protection of minors, and public health. This novel approach may have merit, but it is also cause for concern: systemic risks are barely defined and could lead to restrictions of lawful speech, and measures to address these risks, such as age verification, have negative consequences of their own, like undermining users’ privacy and access to information.
The DSA is an important piece of legislation to advance users’ rights and hold companies accountable, but it also comes with significant risks. We are concerned about the DSA’s requirement that service providers proactively share user data with law enforcement authorities, and about the powers it gives government agencies to request such data. We caution against the misuse of the DSA’s emergency mechanism and the expansion of the DSA’s systemic risks governance approach as a catch-all tool to crack down on undesired but lawful speech. Similarly, the appointment of trusted flaggers could lead to pressure on platforms to over-remove content, especially as the DSA does not bar government authorities from becoming trusted flaggers.
EFF has been advocating for lawmakers to take a measured approach that doesn’t undermine freedom of expression. Even though we have been successful in warding off some of the most harmful ideas, concerns remain, especially with regard to the politicization of the DSA’s enforcement and potential over-enforcement. That’s why we will keep a close eye on the enforcement of the DSA, ready to use all means at our disposal to push back against over-enforcement and to defend user rights.
European laws often have consequences for users around the globe. To give non-European users a voice in Brussels, we have been facilitating the DSA Human Rights Alliance. The Alliance is built around the conviction that the DSA must take a human rights-based approach to platform governance and consider its global impact. We will continue building on and expanding the Alliance to ensure that the enforcement of the DSA doesn’t lead to unintended negative consequences and respects users’ rights everywhere in the world.
The UK’s Platform Regulation Legislation
In parallel to the Digital Services Act, the UK has passed its own platform regulation, the Online Safety Act (OSA). Seeking to make the UK “the safest place in the world to be online,” the OSA will lead to a more censored, locked-down internet for British users. The Act empowers the UK government to undermine not just the privacy and security of UK residents, but internet users worldwide.
Online platforms will be expected to remove content that the UK government views as inappropriate for children. If they don’t, they’ll face heavy penalties. The problem is, in the UK as in the U.S. and elsewhere, people disagree sharply about what type of content is harmful for kids. Putting that decision in the hands of government regulators will lead to politicized censorship decisions.
The OSA will also lead to harmful age-verification systems. You shouldn’t have to show your ID to get online. Age-gating systems meant to keep out kids invariably lead to adults losing their rights to private and anonymous speech, which are sometimes necessary.
As Ofcom begins to release its regulations and guidelines, we’re watching how the regulator plans to avoid these human rights pitfalls, and we will continue fighting any efforts that fail to adequately protect speech and privacy online.
Media freedom and plurality for everyone
Another issue that we have been championing is media freedom. Similar to the DSA, the EU recently overhauled its rules for media services with the European Media Freedom Act (EMFA). In this context, we pushed back against rules that would have forced online platforms like YouTube, X, or Instagram to carry any content by media outlets. Although intended to bolster media pluralism, forcing platforms to host content has severe consequences: millions of EU users could no longer trust that online platforms will address content violating community standards. Moreover, there is no easy way to differentiate between legitimate media providers and those known for spreading disinformation, such as government-affiliated Russian sites active in the EU. Taking away platforms' ability to restrict or remove such content could undermine rather than foster public discourse.
The final version of the EMFA introduced a number of important safeguards but is still a bad deal for users. We will closely follow its implementation to ensure that the new rules actually foster media freedom and plurality, inspire trust in the media, and limit the use of spyware against journalists.
Exposing censorship and defending those who defend us
Covering regulation is just a small part of what we do. Over the past years, we have again and again revealed how companies’ broad-stroked content moderation practices censor users in the name of fighting terrorism, and restrict the voices of LGBTQ folks, sex workers, and underrepresented groups.
Going into 2025, we will continue to shed light on these restrictions of speech and will pay particular attention to the censorship of Palestinian voices, which has been rampant. We will continue collaborating with our allies in the Digital Intimacy Coalition to share how restrictive speech policies often disproportionately affect sex workers. We will also continue to closely analyze the impact of the increasing and changing use of artificial intelligence in content moderation.
Finally, a crucial part of our work in Europe has been speaking out for those who cannot: human rights defenders facing imprisonment and censorship.
Much work remains to be done. We have put forward comprehensive policy recommendations to European lawmakers and we will continue fighting for an internet where everyone can make their voice heard. In the next posts in this series, you will learn more about how we work in Europe to ensure that digital markets are fair, offer users choice and respect fundamental rights.
Decidim Association on the urgency of service to civil society where freedoms and rights are being rolled back
We're Creating a Better Future for the Internet 🧑🏭
In the early years of the internet, website administrators faced a burdensome and expensive process to deploy SSL certificates. But today, hundreds of thousands of people have used EFF’s free Certbot tool to spread that sweet HTTPS across the web. Now almost all internet traffic is encrypted, and everyone gets a basic level of security. Small actions mean big change when we act together. Will you support important work like this and give EFF a Year-End Challenge boost?
Unlock Bonus Grants Before 2025
Make a donation of ANY SIZE by December 31 and you’ll help us unlock bonus grants! Every supporter gets us closer to a series of seven Year-End Challenge milestones set by EFF’s board of directors. These grants become larger as the number of online rights supporters grows. Everyone counts! See our progress.
🚧 Digital Rights: Under Construction 🚧
Since 1990, EFF has defended your digital privacy and free speech rights in the courts, through activism, and by making open source privacy tools. This team is committed to watching out for the users no matter where technological innovation may take us. And that work is funded entirely by donations.
With help from people like you, EFF has been able to help unravel legal and ethical questions surrounding the rise of AI; push the USPTO to withdraw harmful patent proposals; fight for the public's right to access police drone footage; and show why banning TikTok and passing laws like the Kids Online Safety Act (KOSA) will not achieve internet safety.
As technology’s reach continues to expand, so do everyone’s concerns about harmful side effects. That’s where EFF’s ample experience in tech policy, the law, and human rights shines. You can help us.
Donate to defend digital rights today and you’ll help us unlock bonus grants before the year ends.
Proudly Member-Supported Since 1990
________________________
EFF is a member-supported U.S. 501(c)(3) organization. We’re celebrating ELEVEN YEARS of top ratings from the nonprofit watchdog Charity Navigator! Your donation is tax-deductible as allowed by law.
There’s No Copyright Exception to First Amendment Protections for Anonymous Speech
Some people just can’t take a hint. Today’s perfect example is a group of independent movie distributors that have repeatedly tried, and failed, to force Reddit to give up the IP addresses of several users who posted about downloading movies.
The distributors claim they need this information to support their copyright claims against internet service provider Frontier Communications, because it might be evidence that Frontier wasn’t enforcing its repeat infringer policy and therefore couldn’t claim safe harbor protections under the Digital Millennium Copyright Act. Courts have repeatedly refused to enforce these subpoenas, recognizing that the distributors couldn’t pass the test the First Amendment requires prior to unmasking anonymous speakers.
Here's the twist: after the magistrate judge in this case applied this standard and quashed the subpoena, the movie distributors sought review from the district court judge assigned to the case. The second judge also denied discovery as unduly burdensome but, in a hearing on the matter, also said there was no First Amendment issue because the users were talking about copyright infringement. In their subsequent appeal to the Ninth Circuit, the distributors invite the appellate court to endorse the judge’s statement.
As we explain in an amicus brief supporting Reddit, the court should refuse that invitation. Discussions about illegal activity clearly are protected speech. Indeed, the Supreme Court recently affirmed that even “advocacy of illegal acts” is “within the First Amendment’s core.” In fact, protecting such speech is a central purpose of the First Amendment because it ensures that people can robustly debate civil and criminal laws and advocate for change.
There is no reason to imagine that this bedrock principle doesn’t apply just because the speech concerns copyright infringement – especially where the speakers aren’t even defendants in the case, but independent third parties. And unmasking Does in copyright cases carries particular risks given the long history of copyright claims being used as an excuse to take down lawful as well as infringing content online.
We’re glad to see Reddit fighting back against these improper subpoenas, and proud to stand with the company as it stands up for its users.
[Court] The NHK document disclosure lawsuit ends in a settlement amounting to a complete victory for the plaintiffs; JCJ members were among the plaintiff group (Hoso wo Kataru Kai / Association for Discussing Broadcasting)
South Korea Labor News, First Half of December: Filled with Impeachment after Impeachment of President Yoon Suk-yeol
UK Politicians Join Organizations in Calling for Immediate Release of Alaa Abd El-Fattah
With the UK’s Prime Minister Keir Starmer and Foreign Secretary David Lammy having failed to secure the release of British-Egyptian blogger, coder, and activist Alaa Abd El-Fattah, UK politicians are calling for tougher measures to secure Alaa’s immediate return to the UK.
During a debate on detained British nationals abroad in early December, chairwoman of the Commons Foreign Affairs Committee Emily Thornberry asked the House of Commons why the UK has continued to organize industry delegations to Cairo while “the Egyptian government have one of our citizens—Alaa Abd El-Fattah—wrongfully held in prison without consular access.”
In the same debate, Labour MP John McDonnell urged the introduction of a “moratorium on any new trade agreements with Egypt until Alaa is free,” which was supported by other politicians. Liberal Democrat MP Calum Miller also highlighted words from Alaa, who told his mother during a recent prison visit that he had “hope in David Lammy, but I just can’t believe nothing is happening...Now I think either I will die in here, or if my mother dies I will hold him to account.”
Alaa’s mother, mathematician Laila Soueif, has been on hunger strike for 79 days while she and the rest of his family have worked to engage the British government in securing Alaa’s release. On December 12, she also started protesting daily outside the Foreign Office and has since been joined by numerous MPs.
Support for Alaa has come from many directions. On December 6, 12 Nobel laureates wrote to Keir Starmer urging him to secure Alaa’s immediate release, “Not only because Alaa is a British citizen, but to reanimate the commitment to intellectual sanctuary that made Britain a home for bold thinkers and visionaries for centuries.” The pressure on Labour’s senior politicians has continued throughout the month, with more than 100 MPs and peers writing to David Lammy on December 15 demanding that Alaa be freed.
Alaa should have been released on September 29, after serving his five-year sentence for sharing a Facebook post about a death in police custody, but Egyptian authorities have continued his imprisonment in contravention of the country’s own Criminal Procedure Code. British consular officials are prevented from visiting him in prison because the Egyptian government refuses to recognise Alaa’s British citizenship.
David Lammy met with Alaa’s family in November and promised to take action. But the UK’s Prime Minister failed to raise the case at the G20 Summit in Brazil when he met with Egypt’s President El-Sisi.
If you’re based in the UK, here are some actions you can take to support the calls for Alaa’s release:
- Write to your MP (external link): https://freealaa.net/message-mp
- Join Laila Soueif outside the Foreign Office daily between 10-11am
- Share Alaa’s plight on social media using the hashtag #freealaa
The UK Prime Minister and Foreign Secretary’s inaction is unacceptable. Every second counts, and time is running out. The government must do everything it can to ensure Alaa’s immediate and unconditional release.