Congress Moves Closer to Risky Internet Takedown Law | EFFector 37.4

1 day 2 hours ago

Sorry, EFF doesn't hand out candy like the Easter Bunny, but we are here to keep you updated on the latest digital rights news with our EFFector newsletter!

This edition of EFFector explains how you can help us push back against the TAKE IT DOWN Act, an internet censorship bill; why we oppose the site-blocking legislation found in two upcoming bills; and how to delete your data from 23andMe.

You can read the full newsletter here, and even get future editions delivered directly to your inbox when you subscribe! Additionally, we've got an audio edition of EFFector on the Internet Archive, or you can listen to it by clicking the button below:

LISTEN ON YouTube

EFFECTOR 37.4 - Congress Moves Closer to Risky Internet Takedown Law

Since 1990, EFF has published EFFector to help keep readers on the bleeding edge of their digital rights. We know that the intersection of technology, civil liberties, human rights, and the law can be complicated, so EFFector is a great way to stay on top of things. The newsletter is chock-full of links to updates, announcements, blog posts, and other stories to help keep readers—and listeners—up to date on the movement to protect online privacy and free expression.

Thank you to the supporters around the world who make our work possible! If you're not a member yet, join EFF today to help us fight for a brighter digital future.

Christian Romero

EFF Urges Court to Avoid Fair Use Shortcuts in Kadrey v. Meta Platforms

2 days 1 hour ago

EFF has filed an amicus brief in Kadrey v. Meta, one of the many ongoing copyright lawsuits against AI developers. Most of the AI copyright cases raise an important new issue: whether the copying necessary to train a generative AI model is a non-infringing fair use.

Kadrey, however, attempts to side-step fair use. The plaintiffs—including Sarah Silverman and other authors—sued Meta for allegedly using BitTorrent to download “pirated” copies of their books to train Llama, a large language model. In other words, their legal claims challenge how Meta obtained the training materials, not what it did with them.

But some of the plaintiffs’ arguments, if successful, could harm AI developers’ defenses in other cases, where fair use is directly at issue.

How courts decide this issue will profoundly shape the future of this transformative technology, including its capabilities, its costs, and whether its evolution will be shaped by the democratizing forces of the open market or the whims of an oligopoly.

A question this important deserves careful consideration on a full record—not the hyperbolic cries of “piracy” and the legal shortcuts that the plaintiffs in this case are seeking. As EFF explained to the court, the question of whether fair use applies to training generative AI is far too important to decide based on Kadrey’s back-door challenge.

And, as EFF explained, whether a developer can legally train an AI on a wide variety of creative works shouldn’t turn on which technology they used to obtain those materials. As we wrote in our brief, the “Court should not allow the tail of Meta’s alleged BitTorrent use to wag the dog of the important legal questions this case presents. Nor should it accept Plaintiffs’ invitation to let hyperbole about BitTorrent and 'unmitigated piracy' derail the thoughtful and fact-specific fair use analysis the law requires.”

We also urged the court to reject the plaintiffs’ attempt to create a carve-out in copyright law for copies obtained using “BitTorrent.”

This dangerous argument seeks to categorically foreclose the possibility that even the most transformative, socially beneficial uses—such as AI training—could be fair use.

As EFF explained in its brief, adopting an exemption from the flexible, fact-specific fair use analysis for “BitTorrent,” “internet piracy,” “P2P downloading,” or something else, would defeat the purpose of the fair use doctrine as a safeguard for the application of copyright law to new technologies.

Tori Noble

Privacy on the Map: How States Are Fighting Location Surveillance

2 days 3 hours ago

Your location data isn't just a pin on a map—it's a powerful tool that reveals far more than most people realize. It can expose where you work, where you pray, who you spend time with, and, sometimes dangerously, where you seek healthcare. In today’s world, your most private movements are harvested, aggregated, and sold to anyone with a credit card. For those seeking reproductive or gender-affirming care, or visiting a protest or an immigration law clinic, this data is a ticking time bomb.

Last year, we sounded the alarm, urging lawmakers to protect individuals from the growing threats of location tracking tools—tools that are increasingly being used to target and criminalize people seeking essential reproductive healthcare.

The good news? Lawmakers in California, Massachusetts, Illinois, and elsewhere are stepping up, leading the way to protect privacy and ensure that healthcare access and other exercises of our rights remain safe from invasive surveillance.

The Dangers of Location Data

Imagine this: you leave your home in Alabama, drop your kids off at daycare, and then drive across state lines to visit an abortion clinic in Florida. You spend two hours there before driving back home. Along the way, you use your phone’s GPS app to navigate and a free radio app to listen to the news. Unbeknownst to you, that “free” app tracks your entire route and sells it to a data broker, which then maps your journey and makes it available to anyone who will pay. This is exactly what happened when privacy advocates used a tool called Locate X, developed by Babel Street, to track a person’s device as they traveled from Alabama—where abortion is completely banned—to Florida, where abortion access is severely restricted but still available.

Although the tool is marketed as being solely for law enforcement use, private investigators were able to access it by falsely claiming they would work with law enforcement, revealing a major flaw in our data privacy system. At a time when government surveillance of private personal decisions is on the rise, the fact that law enforcement (and adversaries pretending to be law enforcement) can access these tools puts our personal privacy in serious danger.

The unregulated market for location data enables anyone, from law enforcement to anti-abortion groups, to access and misuse this sensitive information. For example, a data broker called Near Intelligence sold location data of people visiting Planned Parenthood clinics to an anti-abortion group. Likewise, law enforcement in Idaho used cell phone location data to charge a mother and her son with “aiding and abetting” abortion, a clear example of how this information can be weaponized to enforce abortion restrictions against patients and anyone else in their orbit.

States Taking Action

As we’ve seen time and time again, the collection and sale of location data can be weaponized to target many vulnerable groups—immigrants, the LGBTQ+ community, and anyone seeking reproductive healthcare. In response to these growing threats, states like California, Massachusetts, and Illinois are leading the charge by introducing bills aimed at regulating the collection and use of location data. 

These bills are a powerful response to the growing threat. They are grounded in well-established principles of privacy law, including informed consent and data minimization, and they ensure that only essential data is collected and that it’s kept secure. Importantly, they give people—whether they reside in the state or are traveling from another—the confidence to exercise their rights (such as seeking health care) without fear of surveillance or retaliation.

This post outlines some of the key features of these location data privacy laws, to show authors and advocates of legislative proposals how best to protect their communities. Specifically, we recommend: 

  • Strong definitions,
  • Clear rules,
  • Affirmation that all location data is sensitive,
  • Empowerment of consumers through a strong private right of action,
  • Prohibition of “pay-for-privacy” schemes, and
  • Transparency through clear privacy policies.
Strong Definitions

Effective location privacy legislation starts with clear definitions. Without them, courts may interpret key terms too narrowly—weakening the law's intent. And in the absence of clear judicial guidance, regulated entities may exploit ambiguity to sidestep compliance altogether.

The following are some good definitions from the recent bills:

  • In the Massachusetts bill, "consent" must be “freely given, specific, informed, unambiguous, [and] opt-in.” Further, it must be free from dark patterns—ensuring people truly understand what they’re agreeing to. 
  • In the Illinois bill, a “covered entity” includes all manner of private actors, including individuals, corporations, and associations, exempting only individuals acting in noncommercial contexts. 
  • "Location information" must clearly refer to data derived from a device that reveals the past or present location of a person or device. The Massachusetts bill sets a common radius in defining protected location data: 1,850 feet (about one-third of a mile). The California bill goes much bigger: five miles. EFF has supported both radiuses.
  • A “permissible purpose” (which is key to the minimization rule) should be narrowly defined to include only: (1) delivering a product or service that the data subject asked for, (2) fulfilling an order, (3) complying with federal or state law, or (4) responding to an imminent threat to life.
Clear Rules

“Data minimization” is the privacy principle that corporations and other private actors must not process a person’s data except as necessary to give them what they asked for, with narrow exceptions. A virtue of this rule is that a person does not need to do anything in order to enjoy their statutory privacy rights; the burden is on the data processor to process less data. Together, these definitions and rules create a framework that ensures privacy is the default, not the exception.

One key data minimization rule, as in the Massachusetts bill, is: “It shall be unlawful for a covered entity to collect or process an individual’s location data except for a permissible purpose.” Read alongside the definition above, this across-the-board rule means a covered entity can only collect or process someone’s location data to fulfill their request (with exceptions for emergencies and compliance with federal and state law).

Additional data minimization rules, as in the Illinois bill, back this up by restraining particular data practices:

  • Covered entities cannot collect more precise data than strictly necessary, or use location data to make inferences beyond what is needed to provide the service.
  • Data must be deleted once it’s no longer necessary for the permissible purpose. 
  • No selling, renting, trading, or leasing location data – full stop.
  • No disclosure of location data to government, except with a warrant, as required by state or federal law, at the request of the data subject, or in response to an emergency involving a threat of serious bodily injury or death (defined to not include abortion).
  • No other disclosure of location data, except as required for a permissible purpose or when requested by the individual. 

The California bill rests largely on data minimization rules like these. The Illinois and Massachusetts bills place an additional limit: no collection or processing of location data absent opt-in consent from the data subject. Critically, consent in these two bills is not an exception to the minimization rule, but rather an added requirement. EFF has supported both models of data privacy legislation: a minimization requirement alone, and paired minimization and consent requirements.

All Location Data is Sensitive

To best safeguard against invasive location tracking, it’s essential to place legal restrictions on the collection and use of all location data—not just data associated with sensitive places like reproductive health clinics. Narrow protections may offer partial help, but they fall short of full privacy.

Consider the example at the beginning of this post: if someone travels from Alabama to Florida for abortion care, and the law only shields data at sensitive sites, law enforcement in Alabama could still trace their route from home up to near the clinic. Once the person enters a protected “healthcare” zone, their device would vanish from view temporarily, only to reappear shortly after they leave. This gap in the tracking data could make it relatively easy to deduce where they were during that time, essentially revealing their clinic visit.

To avoid this kind of loophole, the most effective approach is to limit the collection and retention of all location data—no exceptions. This is the approach in all three of the bills highlighted in this post: California, Illinois, and Massachusetts.

Empowering Consumers Through a Strong PRA

To truly protect people’s location privacy, legislation must include a strong private right of action (PRA)—giving individuals the power to sue companies that violate their rights. A private right of action ensures companies can’t ignore the law and empowers people to seek justice directly when their sensitive data is misused. This is a top priority for EFF in any data privacy legislation.

The bills in Illinois and Massachusetts offer strong models. They make clear that any violation of the law is an injury and allow individuals to bring civil suits: “A violation of this [law] … regarding an individual’s location information constitutes an injury to that individual. … Any individual alleging a violation of this [law] … may bring a civil action …” Further, these bills provide a baseline amount of damages (sometimes called “liquidated” or “statutory” damages), because an invasion of statutory privacy rights is a real injury, even if it is hard for the injured party to prove out-of-pocket expenses from theft, bodily harm, or the like. Absent this kind of statutory language, some victims of privacy violations will lose their day in court.

These bills also override mandatory arbitration clauses that limit access to court. Corporations should not be able to avoid being sued by forcing their customers to sign lengthy contracts that nobody reads.

Other remedies include actual damages, punitive damages, injunctive relief, and attorney’s fees. These provisions give the law real teeth and ensure accountability can’t be signed away in fine print.

No Pay-for-Privacy Schemes

Strong location data privacy laws must protect everyone equally—and that means rejecting “pay-for-privacy” schemes that allow companies to charge users for basic privacy protections. Privacy is a fundamental right, not a luxury add-on or subscription perk. Allowing companies to offer privacy only to those who can afford to pay creates a two-tiered system where low-income individuals are forced to trade away their sensitive location data in exchange for access to essential services. These schemes also incentivize everyone to abandon privacy.

Legislation should make clear that companies cannot condition privacy protections on payment, loyalty programs, or any other exchange of value. This ensures that everyone—regardless of income—has equal protection from surveillance and data exploitation. Privacy rights shouldn’t come with a price tag.

We commend this language from the Illinois and Massachusetts bills: 

A covered entity may not take adverse action against an individual because the individual exercised or refused to waive any of such individual’s rights under [this law], unless location data is essential to the provision of the good, service, or service feature that the individual requests, and then only to the extent that this data is essential. This prohibition includes, but is not limited to: (1) refusing to provide a good or service to the individual; (2) charging different prices or rates for goods or services, including through the use of discounts or other benefits or imposing penalties; or (3) providing a different level of quality of goods or services to the individual.

Transparency Through Clear Privacy Policies

It is helpful for data privacy laws to require covered entities to be transparent about their data practices. All three bills discussed in this post require covered entities to make available a privacy policy to the data subject—a solid baseline. This ensures that people aren’t left in the dark about how their location data is being collected, used, or shared. Clear, accessible policies are a foundational element of informed consent and give individuals the information they need to protect themselves and assert their rights.

It is also helpful for privacy laws like these to require covered entities to prominently publish their privacy policies on their websites. This allows all members of the public – as well as privacy advocates and government enforcement agencies – to track whether data processors are living up to their promises.

Next Steps: More States Must Join

The bottom line is clear: location data is highly sensitive, and without proper protections, it can be used to harm those who are already vulnerable. The digital trail we leave behind can reveal far more than we think, and without laws in place to protect us, we are all at risk. 

While some states are making progress, much more needs to be done. More states need to follow suit by introducing and passing legislation that protects location data privacy. We cannot allow location tracking to be used as a tool for harassment, surveillance, or criminalization.

To help protect your digital privacy while we wait for stronger privacy protection laws, we’ve published a guide on how to minimize intrusion from Locate X, and we have additional tips on EFF’s Surveillance Self-Defense site. Many general privacy practices also offer strong protection against location tracking.

If you live in California, Illinois, Massachusetts – or any state that has yet to address location data privacy – now is the time to act. Contact your lawmakers and urge them to introduce or support bills that protect our sensitive data from exploitation. Demand stronger privacy protections for all, and call for more transparency and accountability from companies that collect and sell location data. Together, we can create a future where individuals are free to travel without the threat of surveillance and retaliation.

Rindala Alajaji

EFF Joins Amicus Briefs Supporting Two More Law Firms Against Unconstitutional Executive Orders

3 days 3 hours ago

Update 4/11/25: EFF joined the ACLU and other legal advocacy organizations today in filing two additional amicus briefs in support of the law firms Jenner & Block and WilmerHale, which have also been targeted by President Donald Trump.

Original post published 4/3/25: EFF has joined the American Civil Liberties Union and other legal advocacy organizations across the ideological spectrum in filing an amicus brief asking a federal judge to strike down President Donald Trump’s executive order targeting law firm Perkins Coie for its past work on voting rights lawsuits and its representation of the President’s prior political opponents. 

As a legal organization that has fought in court to defend the rights of technology users for almost 35 years, including through numerous legal challenges to federal government overreach, EFF unequivocally supports Perkins Coie’s challenge to this shocking, vindictive, and unconstitutional executive order. In punishing the law firm for its zealous advocacy on behalf of its clients, the March 6 order offends the First Amendment, the rule of law, and the legal profession in numerous ways. We commend Perkins Coie and the other targeted law firms that have chosen to fight back, along with their legal representatives.

“If allowed to stand, these pressure tactics will have broad and lasting impacts on Americans' ability to retain legal counsel in important matters, to arrange their business and personal affairs as they like, and to speak their minds,” our brief says. 

Lawsuits against the federal government are a vital component of the system of checks and balances that undergirds American democracy. They reflect a confidence in both the judiciary to decide such matters fairly and justly, and the executive to abide by the court’s determination. They are a backstop against autocracy, and they have been a sustaining feature of American jurisprudence since Marbury v. Madison, 5 U.S. 137 (1803).

The executive order, if enforced, would upend that system and set an appalling precedent: Law firms that represent clients adverse to a given administration can and will be punished for doing their jobs.   

This is a fundamental abuse of executive power.   

The constitutional problems are legion, but here are a few:   

  • The First Amendment bars the government from “distorting the legal system by altering the traditional role of attorneys” by controlling what legal arguments lawyers can make. See Legal Services Corp. v. Velasquez, 531 U.S. 533, 544 (2001). “An informed independent judiciary presumes an informed, independent bar.” Id. at 545.  
  • The executive order is also unconstitutional retaliation for Perkins Coie’s engaging in constitutionally protected speech during the course of representing its clients. See Lozman v. City of Riviera Beach, 585 U.S. 87, 90 (2018). 
  • The executive order violates fundamental precepts of separation of powers and the Fifth and Sixth Amendment rights of litigants to select the counsel of their choice. See United States v. Gonzalez-Lopez, 548 U.S. 140, 147–48 (2006).  

An independent legal profession is a fundamental component of democracy and the rule of law. As a nonprofit legal organization that frequently sues the federal government, we well understand the value of this bedrock principle and how it – and First Amendment rights more broadly – are threatened by President Trump’s executive orders targeting Perkins Coie and other law firms. It is especially important that the whole legal profession speak out against the executive orders in light of the capitulation by a few large law firms. 

The order must be swiftly nullified by the U.S. District Court for the District of Columbia, and it must be uniformly condemned by the entire legal profession.

The ACLU’s press releases with quotes from fellow amici can be found here and here.

David Greene

Florida’s New Social Media Bill Says the Quiet Part Out Loud and Demands an Encryption Backdoor

6 days ago

At least Florida’s SB 868/HB 743, the “Social Media Use By Minors” bill, isn’t beating around the bush when it states that it would require “social media platforms to provide a mechanism to decrypt end-to-end encryption when law enforcement obtains a subpoena.” Usually these sorts of sweeping mandates are hidden behind smoke and mirrors, but this time it’s out in the open: Florida wants a backdoor into any end-to-end encrypted social media platform that allows accounts for minors. This would likely lead to companies not offering end-to-end encryption to minors at all, making them less safe online.

Encryption is the best tool we have to protect our communication online. It’s just as important for young people as it is for everyone else, and the idea that Florida can “protect” minors by making them less safe is dangerous and dumb.

The bill is not only privacy-invasive, it’s also asking for the impossible. As breaches like Salt Typhoon demonstrate, you cannot provide a backdoor for just the “good guys,” and you certainly cannot do so for just a subset of users under a specific age. After all, minors are likely speaking to their parents and other family members and friends, and they deserve the same sorts of privacy for those conversations as anyone else. Whether social media companies provide “a mechanism to decrypt end-to-end encryption” or choose not to provide end-to-end encryption to minors at all, there’s no way that doesn’t harm the privacy of everyone.

If this all sounds familiar, that’s because we saw a similar attempt from an attorney general in Nevada last year. Then, as now, the reasoning was that law enforcement needs access to these messages during criminal investigations. But this doesn’t hold true in practice.

In our amicus brief in Nevada, we pointed out that there are solid arguments that “content oblivious” investigation methods—like user reporting—are “considered more useful than monitoring the contents of users’ communications when it comes to detecting nearly every kind of online abuse.” That remains just as true in Florida today.

Law enforcement can and does already conduct plenty of investigations involving encrypted messages, and even with end-to-end encryption, it can potentially access the contents of most messages on the sender’s or receiver’s device, particularly when it has access to the physical device. The bill also includes measures prohibiting minors from accessing any sort of ephemeral messaging features, like view-once options or disappearing messages. But even with those features, users can still report messages or save them. Targeting specific features does nothing to protect the security of minors, but it would potentially harm the privacy of everyone.

SB 868/HB 743 radically expands the scope of Florida’s social media law HB 3, which passed last year and itself has not yet been fully implemented as it currently faces lawsuits challenging its constitutionality. The state was immediately sued after the law’s passage, with challengers arguing the law is an unconstitutional restriction of protected free speech. That lawsuit is ongoing—and it should be a warning sign. Florida should stop coming up with bad ideas that can't be implemented.

Weakening encryption to the point of being useless is not an option. Minors, as well as those around them, deserve the right to speak privately without law enforcement listening in. Florida lawmakers must reject this bill. Instead of playing politics with kids' privacy, they should focus on real, workable protections—like improving consumer privacy laws to protect young people and adults alike, and improving digital literacy in schools.

Thorin Klosowski

Cybersecurity Community Must Not Remain Silent On Executive Order Attacking Former CISA Director

6 days 1 hour ago

Cybersecurity professionals and the infosec community have essential roles to play in protecting our democracy, securing our elections, and building, testing, and safeguarding government infrastructure. It is critically important for us to speak up to ensure that essential work continues and that those engaged in these good faith efforts are not maligned by an administration that has tried to make examples of its enemies in many other fields. 

President Trump has targeted the former Director of the government’s Cybersecurity and Infrastructure Security Agency (CISA), Chris Krebs, with an executive order cancelling the security clearances of employees at SentinelOne, where Krebs now works, and launching a probe of his government service. President Trump had previously fired Krebs in 2020 when, in his capacity as CISA Director, Krebs released a statement calling that year’s election, which Trump lost, “the most secure in American history.”

The executive order directed a review to “identify any instances where Krebs’ or CISA’s conduct appears to be contrary to the administration’s commitment to free speech and ending federal censorship, including whether Krebs’ conduct was contrary to suitability standards for federal employees or involved the unauthorized dissemination of classified information.” Krebs was, in fact, fired for his public stance. 

We’ve seen this playbook before: In March, Trump targeted law firm Perkins Coie for its past work on voting rights lawsuits and its representation of the President’s prior political opponents in a shocking, vindictive, and unconstitutional executive order. After that order, many in the legal profession, including EFF, pushed back, issuing public statements and filing friend-of-the-court briefs in support of Perkins Coie and other law firms challenging executive orders against them. This public support was especially important given that a few large firms capitulated to Trump rather than fight the orders against them.

It is critical that the cybersecurity community now join together to denounce this chilling attack on free speech and rally behind Krebs and SentinelOne, rather than cowering because they fear they will be next.

The White House must not be given free rein to turn cybersecurity professionals into political scapegoats. EFF regularly defends the infosec community, protecting researchers through education, legal defense, amicus briefs, and involvement in the community, with the goal of promoting innovation and safeguarding their rights, and we call on its ranks to join us in defending Chris Krebs and SentinelOne. An independent infosec community is fundamental to protecting our democracy, and to the profession itself.

Jason Kelley

Certbot 4.0: Long Live Short-Lived Certs!

6 days 20 hours ago

When Let’s Encrypt, a free certificate authority, started issuing 90-day TLS certificates for websites, it was considered a bold move that helped push the ecosystem toward shorter certificate lifetimes. Before that, certificate authorities normally issued certificates with lifetimes of a year or more. With version 4.0, Certbot now supports Let’s Encrypt’s new capability for six-day certificates through ACME profiles, with dynamic renewal at:

  • 1/3 of the lifetime left, or
  • 1/2 of the lifetime left, if the lifetime is shorter than 10 days

For example, a 90-day certificate is renewed with about 30 days remaining, while a six-day certificate is renewed with about three days remaining.

There are a few significant reasons why shorter lifetimes are better:

  • If a certificate's private key is compromised, that compromise can't last as long.
  • Shorter certificate lifespans encourage automation, which in turn facilitates more robust web server security.
  • Certificate revocation is historically flaky. Lifetimes of 10 days and under reduce the need to invoke the revocation process and limit how long a compromised key can remain in use.

There is debate over how short these lifetimes should be, but with ACME profiles you can keep the default or “classic” Let’s Encrypt experience (90 days) or start actively using other profile types through Certbot with the --preferred-profile and --required-profile flags. For six-day certificates, you can choose the “shortlived” profile.
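To make this concrete, here is a minimal sketch of what requesting a short-lived certificate might look like on the command line. The domain and webroot path are placeholders, the webroot authenticator is just one of several options, and the comments reflect our reading of the flags’ behavior:

    # Request a certificate using the six-day "shortlived" profile.
    # --preferred-profile asks for that profile but lets issuance proceed
    # if the CA doesn't offer it; --required-profile would fail instead.
    certbot certonly --webroot -w /var/www/html -d example.com \
      --preferred-profile shortlived

    # List managed certificates and their expiry dates to confirm
    # the shorter lifetime took effect.
    certbot certificates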

These new options are just the beginning of the modern features the ecosystem can support, and we are glad to have dynamic renewal times to start leveraging a more agile web that facilitates better security and flexible options for everyone. Thank you to the community and the Certbot team for making this happen!

Love ♥️ Certbot as much as we do? Donate today to support this work.

Alexis Hancock

Congress Takes Another Step Toward Enabling Broad Internet Censorship

1 week ago

The House Energy and Commerce Committee on Tuesday advanced the TAKE IT DOWN Act (S. 146), a bill that seeks to speed up the removal of certain kinds of troubling online content. While the bill is meant to address a serious problem—the distribution of non-consensual intimate imagery (NCII)—the notice-and-takedown system it creates is an open invitation for powerful people to pressure websites into removing content they dislike.

As we’ve written before, while protecting victims of these heinous privacy invasions is a legitimate goal, good intentions alone are not enough to make good policy. 

take action

TELL CONGRESS: "Take It Down" Has No real Safeguards  

This bill mandates a notice-and-takedown system that threatens free expression, user privacy, and due process, without meaningfully addressing the problem it claims to solve. The “takedown” provision applies to a much broader category of content—potentially any images involving intimate or sexual content at all—than the narrower NCII definitions found elsewhere in the bill. The bill contains no protections against frivolous or bad-faith takedown requests. Lawful content—including satire, journalism, and political speech—could be wrongly censored. 

The legislation’s 48-hour takedown deadline means that online service providers, particularly smaller ones, will have to comply quickly to avoid legal risks. That time crunch will make it impossible for services to verify the content is in fact NCII. Instead, services will rely on automated filters—infamously blunt tools that frequently flag legal content, from fair-use commentary to news reporting.

Communications providers that offer users end-to-end encrypted messaging, meanwhile, may be served with notices they simply cannot comply with, given the fact that these providers cannot view the contents of messages on their platforms. Platforms may respond by abandoning encryption entirely in order to be able to monitor content—turning private conversations into surveilled spaces. 

While several committee members offered amendments during committee consideration to address these problematic provisions, committee leadership rejected all attempts to amend the bill.

The TAKE IT DOWN Act is now expected to receive a floor vote in the coming weeks before heading to President Trump’s desk for his signature. Both the President himself and First Lady Melania Trump have been vocal supporters of this bill, and they have been urging Congress to quickly pass it. Trump has shown just how the bill can be abused, saying earlier this year that he would personally use the takedown provisions to censor speech critical of the president.

take action

TELL CONGRESS: "Take It Down" Has No real Safeguards  

Fast-tracking a censorship bill is always troubling. TAKE IT DOWN is the wrong approach to helping people whose intimate images are shared without their consent. We can help victims of online harassment without embracing a new regime of online censorship.

Congress should strengthen and enforce existing legal protections for victims, rather than opting for a broad takedown regime that is ripe for abuse. 

Tell your Member of Congress to oppose censorship and to oppose S. 146.

India McKinney

Our Privacy Act Lawsuit Against DOGE and OPM: Why a Judge Let It Move Forward

1 week ago

Last week, a federal judge rejected the government’s motion to dismiss our Privacy Act lawsuit against the U.S. Office of Personnel Management (OPM) and Elon Musk’s “Department of Government Efficiency” (DOGE). OPM is disclosing to DOGE agents the highly sensitive personal information of tens of millions of federal employees, retirees, and job applicants. This disclosure violates the federal Privacy Act, a watershed law that tightly limits how the federal government can use our personal information.

We represent two unions of federal employees: the American Federation of Government Employees (AFGE) and the Association of Administrative Law Judges (AALJ). Our co-counsel are Lex Lumina LLP, State Democracy Defenders Fund, and The Chandra Law Firm LLC.

We’ve already explained why the new ruling is a big deal, but let’s take a deeper dive into the Court’s reasoning.

Plaintiffs Have Standing

A plaintiff must show they have “standing” to bring their claim. Article III of the U.S. Constitution empowers courts to decide “cases” and “controversies.” Courts have long held this requires the plaintiff to show an “injury in fact” that is, among other things, “concrete.” In recent years, two Supreme Court decisions – Spokeo v. Robins (2016) and TransUnion v. Ramirez (2021) – addressed when an “intangible” injury, such as invasion of data privacy, is sufficiently concrete. They ruled that such injury must have “a close relationship to a harm traditionally recognized as providing a basis for a lawsuit in American courts.”

In our case, the Court held that our clients passed this test: “The complaint alleges concrete harms analogous to intrusion upon seclusion.” That is one of the common law privacy torts, long recognized in U.S. law. According to the Restatement of Torts, it occurs when a person “intrudes” on the “seclusion of another” in a manner “highly offensive to a reasonable person.”

The Court reasoned that the records at issue here “contain information about the deeply private affairs of the plaintiffs,” including “social security numbers, health history, financial disclosures, and information about family members.” The Court also emphasized plaintiffs’ allegation that these records were “disclosed to DOGE agents in a rushed and insecure manner,” including “administrative access, enabling them to alter OPM records and obscure their own access to those records.”

The Court rejected defendants’ argument that our clients had pled “only that DOGE agents were granted access to OPM’s data system,” and not also that “the DOGE agents in fact used that access to examine OPM records.” As a factual matter, plaintiffs did plead that “DOGE agents actually exploited their access to review, possess, and use OPM records.”

As a legal matter, such use is not required: “Exposure of the plaintiff’s personally identifiable information to unauthorized third parties, without further use or disclosure, is analogous to harm cognizable under the common law right to privacy.” So ruling, the Court observed: “at least four federal courts have found that the plaintiffs before them had made a sufficient showing of concrete injury, as analogous to common law privacy torts, when agencies granted DOGE agents access to repositories of plaintiffs’ personal information.”

To have standing, a plaintiff must also show that their “injury in fact” is “actual or imminent.” The Court held that our clients passed this test, too. It ruled that plaintiffs adequately alleged an actual injury: “ongoing unauthorized access by the DOGE agents to the plaintiffs’ data.” It also ruled that plaintiffs adequately alleged a separate, imminent injury: OPM’s disclosure to DOGE “has made the OPM data more vulnerable to hacking, identity theft, and other activities that are substantially harmful to the plaintiffs.” The Court emphasized the allegations of “sweeping and uncontrolled access to DOGE agents who were not properly vetted or trained,” as well as the notorious 2015 OPM data breach.

Finally, the Court held that our clients sufficiently alleged the remaining two elements of standing: that defendants caused plaintiffs’ injuries, and that an injunction would redress them.

Plaintiffs May Proceed on Their Privacy Act Claims

The Court held: “The plaintiffs have plausibly alleged violations of two provisions of the Privacy Act: 5 U.S.C. § 552a(b), which prohibits certain disclosures of records, and 5 U.S.C. § 552a(e)(10), which imposes a duty to establish appropriate safeguards and ensure security and confidentiality of records.” The Court cited two other judges who had recently “found a likelihood that plaintiffs will succeed” in their wrongful disclosure claims.

Reprising its failed standing arguments, the government argued that to plead a violation of the Privacy Act’s no-disclosure rule, our clients must allege “not just transmission to another person but also review of the records by that individual.” Again, the Court rejected this argument for two independent reasons. Factually, “the complaint amply pleads that DOGE agents viewed, possessed, and used the OPM records.” Legally, “the defendants misconstrue the term ‘disclose.’” The Court looked to OPM’s own regulations, which define the term to include “providing personal review of a record,” and to an earlier appellate court opinion interpreting the term to include “virtually all instances [of] an agency’s unauthorized transmission of a protected record.”

Next, the government asserted an exception from the Privacy Act’s no-disclosure rule, for disclosure “to those officers and employees of the agency which maintains the record who have a need for the record in the performance of their duties.” The Court observed that our clients disputed this exception on two independent grounds: “both because [the disclosures] were made to DOGE agents who were not officers or employees of OPM and because, even if the DOGE agents were employees of OPM, they did not have a need for those records in the performance of any lawful duty.” On both grounds, the plaintiffs’ allegations sufficed.

Plaintiffs May Seek to Enjoin Privacy Act Violations

The Court ruled that our clients may seek injunctive and declaratory relief against the alleged Privacy Act violations, by means of the Administrative Procedure Act (APA), though not the Privacy Act itself. This is a win: What ultimately matters is the availability of relief, not the particular path to that relief.

As discussed above, plaintiffs have two claims that the government violated the Privacy Act: unlawful disclosures and unlawful cybersecurity failures. Plaintiffs also have an APA claim of agency action “not in accordance with law,” which refers back to these two Privacy Act violations.

To be subject to APA judicial review, the challenged agency action must be “final.” The Court found finality: “The complaint plausibly alleges that actions by OPM were not representative of its ordinary day-to-day operations but were, in sharp contrast to its normal procedures, illegal, rushed, and dangerous.”

Another requirement for APA judicial review is the absence of an “other adequate remedy.” The Court interpreted the Privacy Act to not allow the injunction our clients seek, but then ruled: “As a result, the plaintiffs have no adequate recourse under the Privacy Act and may pursue their request for injunctive relief under the APA.” The Court further wrote:

The defendants’ Kafkaesque argument to the contrary would deprive the plaintiffs of any recourse under the law. They contend that the plaintiffs have no right to any injunctive relief – neither under the Privacy Act nor under the APA. … This argument promptly falls apart under examination.

Plaintiffs May Proceed on Two More Claims

The Court allowed our clients to move forward on their two other claims.

They may proceed on their claim that the government violated the APA by acting in an “arbitrary and capricious” manner. The Court reasoned: “The complaint alleges that OPM rushed the onboarding process, omitted crucial security practices, and thereby placed the security of OPM records at grave risk.”

Finally, our clients may proceed on their claim that DOGE acted “ultra vires,” meaning outside of its legal power, when it accessed OPM records. The Court reasoned: “The complaint adequately pleads that DOGE Defendants plainly and openly crossed a congressionally drawn line in the sand.”

Next Steps

Congress passed the Privacy Act following the Watergate and COINTELPRO scandals to restore trust in government and prevent a future President from creating another “enemies list.” Congress found that the federal government’s increasing use of databases full of personal records “greatly magnified the harm to individual privacy,” and so it tightly regulated how agencies may use these databases.

The ongoing DOGE data grab may be the worst violation of the Privacy Act since its enactment in 1974. So it is great news that a judge has denied the government’s motion to dismiss our lawsuit. Now we will move forward to prove our case.

Related Cases: American Federation of Government Employees v. U.S. Office of Personnel Management
Adam Schwartz

EFF, Civil Society Groups, Academics Call on UK Home Secretary to Address Flawed Data Bill

1 week 2 days ago

Last week, EFF joined 30 civil society groups and academics in warning UK Home Secretary Yvette Cooper and Department for Science, Innovation & Technology Secretary Peter Kyle about the law enforcement risks contained within the draft Data Use and Access Bill (DUA Bill).

Clause 80 of the DUA Bill weakens the safeguards for solely automated decisions in the law-enforcement context and dilutes crucial data protection safeguards. 

Under sections 49 and 50 of the Data Protection Act 2018, solely automated decisions are prohibited from being made in the law enforcement context unless the decision is required or authorised by law. Clause 80 reverses this in all scenarios unless the data processing involves special category data. 

In short, this would enable law enforcement to make automated decisions about people based on their socioeconomic status, regional or postcode data, inferred emotions, or even regional accents. This increases the already broad possibilities for bias, discrimination, and lack of transparency at the hands of law enforcement.

In its own Impact Assessment for the DUA Bill, the government acknowledged that “those with protected characteristics such as race, gender, and age are more likely to face discrimination from ADM due to historical biases in datasets.” Yet politicians in the UK have decided to push forward with this discriminatory and dangerous agenda regardless.

Further, given the already minimal transparency around automated decision-making, individuals affected in the law enforcement context would have few, if any, routes to redress.

The DUA Bill puts marginalised groups at risk of opaque, unfair, and harmful automated decisions. Yvette Cooper and Peter Kyle must address the lack of safeguards governing law enforcement use of automated decision-making tools before time runs out.

The full letter can be found here.

Paige Collings

Judge Rejects Government’s Attempt to Dismiss EFF Lawsuit Against OPM, DOGE, and Musk

2 weeks ago
Court Confirms That, If Proven, DOGE’s Ongoing Access to Personnel Records Is Illegal

NEW YORK—A lawsuit seeking to stop the U.S. Office of Personnel Management (OPM) from disclosing tens of millions of Americans’ private, sensitive information to Elon Musk’s “Department of Government Efficiency” (DOGE) can continue, a federal judge ruled Thursday.

Judge Denise L. Cote of the U.S. District Court for the Southern District of New York partially rejected the defendants’ motion to dismiss the lawsuit, which was filed Feb. 11 on behalf of two labor unions and individual current and former government workers across the country. This decision is a victory: The court agreed that the claims that OPM illegally disclosed highly personal records of millions of people to DOGE agents can move forward with the goal of stopping that ongoing disclosure and requiring that any shared information be returned. 

Cote ruled current and former federal employees "may pursue their request for injunctive relief under the APA [Administrative Procedure Act]. ...  The defendants’ Kafkaesque argument to the contrary would deprive the plaintiffs of any recourse under the law." 

"The complaint plausibly alleges that actions by OPM were not representative of its ordinary day-to-day operations but were, in sharp contrast to its normal procedures, illegal, rushed, and dangerous,” the judge wrote.  

The Court added: “The complaint adequately pleads that the DOGE Defendants 'plainly and openly crossed a congressionally drawn line in the sand.'" 

OPM maintains databases of highly sensitive personal information about tens of millions of federal employees, retirees, and job applicants. The lawsuit by EFF, Lex Lumina LLP, State Democracy Defenders Fund, and The Chandra Law Firm argues that OPM and OPM Acting Director Charles Ezell illegally disclosed personnel records to DOGE agents in violation of the federal Privacy Act of 1974, a watershed anti-surveillance statute that prevents the federal government from abusing our personal information. 

The lawsuit’s union plaintiffs are the American Federation of Government Employees AFL-CIO and the Association of Administrative Law Judges, International Federation of Professional and Technical Engineers Judicial Council 1 AFL-CIO.

“Today’s legal victory sends a crystal-clear message: Americans’ private data stored with the government isn't the personal playground of unelected billionaires,” said AFGE National President Everett Kelley. “Elon Musk and his DOGE cronies have no business rifling through sensitive data stored at OPM, period. AFGE and our allies fought back – and won – because we will not compromise when it comes to protecting the privacy and security of our members and the American people they proudly serve.” 

As the federal government is the nation’s largest employer, the records held by OPM represent one of the largest collections of sensitive personal data in the country. In addition to personally identifiable information such as names, social security numbers, and demographic data, these records include work information like salaries and union activities; personal health records and information regarding life insurance and health benefits; financial information like death benefit designations and savings programs; nondisclosure agreements; and information concerning family members and other third parties referenced in background checks and health records.

OPM holds these records for tens of millions of Americans, including current and former federal workers and those who have applied for federal jobs. OPM has a history of privacy violations—an OPM breach in 2015 exposed the personal information of 22.1 million people—and its recent actions make its systems less secure.  

With few exceptions, the Privacy Act limits the disclosure of federally maintained sensitive records on individuals without the consent of the individuals whose data is being shared. It protects all Americans from harms caused by government stockpiling of our personal data. This law was enacted in 1974, the last time Congress acted to limit the data collection and surveillance powers of an out-of-control President. The judge ruled that the request for an injunction based on the Privacy Act claims can go forward under the Administrative Procedure Act, but not directly under the Privacy Act.

For the order denying the motion to dismiss: https://www.eff.org/document/afge-v-opm-opinion-and-order-motion-dismiss 

For the complaint: https://www.eff.org/document/afge-v-opm-complaint 

For more about the case: https://www.eff.org/cases/american-federation-government-employees-v-us-office-personnel-management 

Contacts 

Electronic Frontier Foundation: press@eff.org 

Lex Lumina LLP: Managing Partner Rhett Millsaps, rhett@lex-lumina.com 

Josh Richman

Calyx Institute: A Case Study in Grassroots Innovation

2 weeks ago

Technologists play a huge role in building alternative tools and resources when our rights to privacy and security are undermined by governments and major corporations. This direct resistance ensures that even in the face of powerful adversaries, communities can find some safety and autonomy through community-built tools.

One of the most renowned names in this work is the Calyx Institute, a New York-based 501(c)(3) nonprofit founded by Nicholas Merrill after a successful and influential constitutional challenge to the National Security Letter (NSL) statute in the USA PATRIOT Act. Today, Calyx’s mission is to defend digital privacy, advance connectivity, and strive for a future where everyone has access to the resources and tools they need to remain securely connected. Their work is made possible thanks to the generous donations of more than 12,000 grassroots members.

More recently, Calyx joined EFF’s network of grassroots organizations across the US, the Electronic Frontier Alliance (EFA). Members of the alliance are not-for-profit local organizations dedicated to EFA’s five guiding principles: privacy, free expression, access to knowledge, creativity, and security. Calyx has since been an exceptional ally, lifting up and collaborating with fellow members.

If you’re inspired by Calyx to start making a difference in your community, you can get started with our organizer toolkits. Once you’re ready, we hope you consider applying to join the alliance.

JOIN EFA

Defend Digital Rights Locally

We corresponded with Calyx over email to discuss the group's ambitious work, and what the future holds for Calyx. Here are excerpts from our conversation:

Thanks for chatting with us, to get started could you tell us a bit about Calyx’s current work?

Calyx focuses on three areas: (1) developing a privacy-respecting software ecosystem, (2) bridging the digital divide with affordable internet access, and (3) sustaining our community through grants, research, and educational initiatives.

We build and maintain a digital ecosystem of free and open-source software (FOSS) centering on CalyxOS, an Android operating system that encrypts communications, combats invasive metadata collection, and protects users from geolocation tracking. The Calyx Internet Membership Program offers mobile hotspots so people have a way to stay connected despite limited resources or a lack of viable alternatives. Finally, Calyx actively engages with diverse stakeholder groups to build a shared understanding of privacy and expand digital-security literacy, and provides grants to directly support aligned organizations. By partnering with our peers, funders, and service providers, we hope to drive collective action toward a privacy-and-rights-respecting future of technology.

Calyx projects work with a wide range of technologies. What are some barriers Calyx runs into in this work?

Our biggest challenge is one shared by many tech communities, particularly FOSS advocates: it is difficult to balance privacy and security with usability in tool development. On the one hand, the current data-mining business model of the tech sector makes it extremely hard to provide FOSS solutions to proprietary tech while keeping the tool intuitive and easy to use. On the other, there is a general lack of momentum for funding and growing an alternative digital ecosystem.

As a result, many digital rights enthusiasts are left with scarce resources and a narrow space within which to work on technical solutions. We need more people to work together and collectively advocate for a privacy-respecting tech ecosystem that cares about all communities and does not marginalize anyone.

Take CalyxOS, for example. Before it became a tangible project, our founder Nick spent years thinking about an alternative mobile operating system that put privacy first. Back in 2012, Nick spoke to Moxie Marlinspike, the creator of the Signal messaging app, about his idea. Moxie shared several valid concerns that almost led Nick to stop working on it. Fortunately, these warnings, which came from Moxie’s experience and success with Signal, made Nick even more determined, and he recruited an expert global team to help realize his idea.

What do you see as the role of technologists in defending civil liberties with local communities?

Technologists are enablers—they build tools and technical infrastructures, fundamental parts of the digital ecosystem within which people exercise their rights and enjoy their lives. A healthy digital ecosystem consists of technologies that liberate people. It is an arena where people willingly and actively connect and share their expertise, confident in the shared protocols that protect everyone’s rights and dignity. That is why Calyx builds and advocates for people-centered, privacy-focused FOSS tools.

How has Calyx supported folks in NYC? What have you learned from it?

It’s a real privilege to be part of the NYC tech community, which has such a wealth of technologists, policy experts, human rights watchdogs, and grassroots activists. In recent years, we joined efforts led by multiple networks and organizations to mobilize against unjustifiable mass surveillance and other digital threats faced by millions of people of color, immigrants, and other underrepresented groups.

We’re particularly proud of the support we provided to another EFA member, Surveillance Technology Oversight Project, on the Ban the Scan campaign to ban facial recognition in NYC, and CryptoHarlem to sustain their work bringing digital privacy and cybersecurity education to communities in Harlem and beyond. Most recently, we funded Sunset Spark—a small nonprofit offering free education in science and technology in the heart of Brooklyn—to develop a multipurpose curriculum focused on privacy, internet infrastructure, and the roles of the public and private sectors in our digital world.

These experiences deeply inspired us to shape a funding philosophy that centers the needs of organizations and groups with limited resources, helps local communities break barriers and build capacity, and grows reciprocal relationships between each member of the community.

You mentioned a grantmaking program, which is a really unique project for an EFA member. Could you tell us a bit about your theory of change for the program?

Since 2020, the Calyx Institute has been funding the development of digital privacy and security tools, research on mass surveillance systems, and training efforts to equip people with the knowledge and tools they need to protect their right to privacy and connectivity. In 2022, Calyx launched the Fusion Center Research Fund to aid investigations into law enforcement harvesting of personal data through intelligence-sharing centers. This effort, with nearly $200,000 disbursed to grantees, helped reveal the deleterious impact of surveillance technology on privacy and freedom of expression.

These efforts have led to the Sepal Fund, Calyx’s pilot program to offer small groups unrestricted and holistic grants. This program will provide five organizations, collectives, or projects a yearly grant of up to $50,000 for a total of three years. In addition, we will provide our grantees opportunities for professional development, as well as other resources. Through this program, we hope to sustain and elevate research, tool development, and education that will support digital privacy and defend internet freedom.


Could you tell us a bit about how people can get involved?

All our projects are, at their core, community projects, and we welcome insights and involvement from anyone to whom our work is relevant. CalyxOS offers a variety of ways to connect, including a CalyxOS Matrix room and a GitLab repository where users and programmers interact in real time to troubleshoot and discuss improvements. Part of making CalyxOS accessible is ensuring that it’s available in as many languages as possible, so anyone who would like to join the translation and localization effort should visit our Weblate site.

What does the future look like for Calyx?

We are hoping that the future holds big things for us, like CalyxOS builds for more affordable and globally available mobile devices, so that people in different locations and with varied resources can equally enjoy the right to privacy. We are also looking forward to updating our visual communication—we have been “substance over style” for so long that it will be exciting to see how a refreshed look will help us reach new audiences.

Finally, what’s your “moonshot”? What’s the ideal future Calyx wants to build?

The Calyx dream is accessible digital privacy, security, and connectivity for all, regardless of budget or tech background, centering communities that are most in need.

We want a future where everyone has access to the resources and tools they need to remain securely connected. To get there, we’ll need to work on building a lot of capacity, both technological and informational. Great tools can only fulfill their purpose if people know why and how to use them. Creating those tools and spreading the word about them requires collaboration, and we are proud to be working toward that goal alongside all the organizations that make up the EFA.

Our thanks to the Calyx Institute for their continued efforts to build private and secure tools for targeted groups, in New York City and across the globe. You can find and support other Electronic Frontier Alliance affiliated groups near you by visiting eff.org/fight.

Rory Mir

Site-Blocking Legislation Is Back. It’s Still a Terrible Idea.

2 weeks 1 day ago

More than a decade ago, Congress tried to pass SOPA and PIPA—two sweeping bills that would have allowed the government and copyright holders to quickly shut down entire websites based on allegations of piracy. The backlash was immediate and massive. Internet users, free speech advocates, and tech companies flooded lawmakers with protests, culminating in an “Internet Blackout” on January 18, 2012. Turns out, Americans don’t like government-run internet blacklists. The bills were ultimately shelved. 

Thirteen years later, as institutional memory fades and appetite for opposition wanes, members of Congress in both parties are ready to try this again. 

take action

Act Now To Defend the Open Web  

The Foreign Anti-Digital Piracy Act (FADPA), along with at least one other bill still in draft form, would revive this reckless strategy. These new proposals would let rights holders get federal court orders forcing ISPs and DNS providers to block entire websites based on accusations of infringing copyright. Lawmakers claim they’re targeting “pirate” sites—but what they’re really doing is building an internet kill switch.

These bills are an unequivocal and serious threat to a free and open internet. EFF and our supporters are going to fight back against them. 

Site-Blocking Doesn’t Work—And Never Will 

Today, many websites are hosted on cloud infrastructure or use shared IP addresses. Blocking one target can mean blocking thousands of unrelated sites. That kind of digital collateral damage has already happened in Austria, Russia, and the US.

Site-blocking is both dangerously blunt and trivially easy to evade. Determined evaders can create the same content on a new domain within hours. Users who want to see blocked content can fire up a VPN or change a single DNS setting to get back online. 
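
To see how little a DNS-level block accomplishes, consider this minimal sketch in Python, using the third-party dnspython package. The domain and resolver address here are hypothetical stand-ins, not details from any actual order: if an ISP’s default resolver refuses to answer for a blocked site, pointing at any other public resolver returns the records anyway.

    # Sketch: bypassing an ISP resolver's block by asking a different resolver.
    # Requires the dnspython package (pip install dnspython).
    import dns.resolver

    DOMAIN = "blocked.example"  # hypothetical blocked site

    resolver = dns.resolver.Resolver(configure=False)  # ignore the system/ISP resolver
    resolver.nameservers = ["1.1.1.1"]  # any resolver outside the ISP's control

    try:
        for record in resolver.resolve(DOMAIN, "A"):
            print(record.address)  # the site's address, block notwithstanding
    except dns.resolver.NXDOMAIN:
        print(f"{DOMAIN} does not exist")

A change this small is what “change a single DNS setting” means in practice, and every major operating system exposes the same knob in its network settings.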

These workarounds aren’t just popular—they’re essential tools in countries that suppress dissent. It’s shocking that Congress is on the verge of forcing Americans to adopt the same workarounds that internet users in authoritarian regimes rely on just to reach mislabeled content. It will force Americans to rely on riskier, less trustworthy online services. 

Site-Blocking Silences Speech Without a Defense

The First Amendment should not take a back seat because giant media companies want the ability to shut down websites faster. But these bills wrongly treat broad takedowns as a routine legal process. Most cases would be decided in ex parte proceedings, with no one there to defend the site being blocked. This is more than a shortcut: it skips due process entirely. 

Users affected by a block often have no idea what happened. A blocked site may just look broken, like a glitch or an outage. Law-abiding publishers and users lose access, and diagnosing the problem is difficult. Site-blocking techniques are the bluntest of instruments, and they almost always punish innocent bystanders. 

The copyright industries pushing these bills know that site-blocking is not a narrowly tailored fix for a piracy epidemic. The entertainment industry is booming right now, blowing past its pre-COVID projections. Site-blocking legislation is an attempt to build a new American censorship system by letting private actors get dangerous infrastructure-level control over internet access. 

EFF and the Public Will Push Back

FADPA is already on the table. More bills are coming. The question is whether lawmakers remember what happened the last time they tried to mess with the foundations of the open web. 

If they don’t, they’re going to find out the hard way. Again. 

take action

Tell Congress: No To Internet Blacklists  

Site-blocking laws are dangerous, unnecessary, and ineffective. Lawmakers need to hear—loud and clear—that Americans don’t support government-mandated internet censorship. Not for copyright enforcement. Not for anything.

Joe Mullin

Vote for “How to Fix the Internet” in the Webby Awards People's Voice Competition!

2 weeks 2 days ago

EFF’s “How to Fix the Internet” podcast is a nominee in the Webby Awards 29th Annual People's Voice competition – and we need your support to bring the trophy home!

Vote now!

We keep hearing all these dystopian stories about technology’s impact on our lives and our futures — from tracking-based surveillance capitalism to the dominance of a few large platforms choking innovation to the growing pressure by authoritarian governments to control what we see and say. The landscape can feel bleak. Exposing and articulating these problems is important, but so is envisioning and then building a better future. 

That’s where our podcast comes in. Through curious conversations with some of the leading minds in law and technology, “How to Fix the Internet” explores creative solutions to some of today’s biggest tech challenges.    

Over our five seasons, we’ve had well-known, mainstream names like Marc Maron discussing patent trolls, Adam Savage discussing the right to tinker and repair, Dave Eggers discussing when to set technology aside, and U.S. Sen. Ron Wyden, D-OR, discussing how Congress can foster an internet that benefits everyone. But we’ve also had lesser-known names who do vital, thought-provoking work: Taiwan’s then-Minister of Digital Affairs Audrey Tang discussed seeing democracy as a kind of open-source social technology, Alice Marwick discussed the spread of conspiracy theories and disinformation, Catherine Bracy discussed getting tech companies to support (not exploit) the communities they call home, and Chancey Fleet discussed the need to include people with disabilities in every step of tech development and deployment.

We’ve just recorded our first interview for Season 6, and episodes should start dropping next month! Meanwhile, you can catch up on our past seasons to become deeply informed on vital technology issues and join the movement working to build a better technological future.  

 And if you’ve liked what you’ve heard, please throw us a vote in the Webbys competition!  

Vote now!

Our deepest thanks to all our brilliant guests, and to the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology, without whom this podcast would not be possible. 

Click below to listen to the show now, or choose your podcast player:

Privacy info. This embed will serve content from simplecast.com

Or get our YouTube playlist! Or, listen to the episodes on the Internet Archive!

Josh Richman

EFF Urges Third Circuit to Join the Legal Chorus: No One Owns the Law

2 weeks 2 days ago

Two appeals courts have recently rejected efforts by private parties to use copyright to restrict access to the laws that most directly affect ordinary citizens: regulations that ensure our homes, workplaces, devices, and many other products are safe and fit for purpose. Apparently hoping the third time will be the charm, a standards organization is asking the Third Circuit Court of Appeals to break ranks and hold that a private party that helps develop a law also gets to own that law. In an amicus brief filed with co-counsel Abigail Burton and Samuel Silver of Welsh & Recker, P.C., on behalf of multiple entities—including Watch Duty, iFixit, Public.Resource.Org, and multiple library associations—EFF urged the court to instead join the judicial consensus and recognize that no one owns the law.

This case concerns UpCodes, a company that has created a database of building codes—like the National Electrical Code—that includes codes incorporated by reference into law. ASTM, a private organization that coordinated the development of some of those codes, insists that it retains copyright in them even after they have been adopted into law, and therefore has the right to control how the public accesses and shares them. Fortunately, neither the Constitution nor the Copyright Act supports that theory. Faced with similar claims, some courts, including the Fifth Circuit Court of Appeals, have held that the codes lose copyright protection when they are incorporated into law. Others, like the D.C. Circuit Court of Appeals in a case EFF defended on behalf of Public.Resource.Org, have held that, whether or not the legal status of the standards changes once they are incorporated into law, making them fully accessible and usable online is a lawful fair use. A federal court in Pennsylvania followed the latter path in this case, finding that UpCodes’ database was a protected fair use.

The Third Circuit should affirm the ruling, preferably on the alternative ground that standards incorporated into law necessarily enter the public domain. The internet has democratized access to law, making it easier than ever for the public—from journalists to organizers to safety professionals to ordinary concerned citizens—to understand, comment on, and share the myriad regulations that bind us. That work is particularly essential where those regulations are crafted by private parties and made mandatory by regulators with limited public oversight and increasingly limited staffing. Copyright law should not be read to impede it.

The Supreme Court has explained that “every citizen is presumed to know the law, and it needs no argument to show that all should have free access” to it. Apparently, it needs some argument after all, but it is past time for the debate to end.

Related Cases: Freeing the Law with Public.Resource.Org
Corynne McSherry

Announcing EFF’s New Exhibit on Border Surveillance and Accompanying Events

2 weeks 3 days ago

EFF has created a traveling exhibit, “Border Surveillance: Places, People, and Technology,” which will make its debut at the Angel Island Immigration Station historical site this spring.

The exhibition on Angel Island in San Francisco Bay will run from April 2, 2025 through May 28, 2025. We would especially like to thank the Angel Island Immigration Station Foundation and Angel Island State Park for their collaboration. You can learn more about the exhibit’s hours of operation and how to visit it here.

For the last several years, EFF has been amassing data and images detailing the massive increase in surveillance technology infrastructure at the U.S.-Mexico border. EFF staff members have made a series of trips along the U.S.-Mexico border, from the California coast to the tip of Texas, to learn from communities on both sides of the border; interview journalists, aid workers, and activists; and map and document border surveillance technology. We created the most complete open-source and publicly-available map of border surveillance infrastructure. We tracked how the border has been used as a laboratory to test new surveillance technologies. We went to court to protect the privacy of digital information for people at the border. We even released a folder of more than 65 open-licensed images of border surveillance technology so that reporters, activists, and scholars can use alternative and open sources of visual information to inform discourse.

Now, we are hoping this traveling exhibit will be a way for us to share some of that information with the public. Think of it as Border Surveillance 101. 

We could not ask for a more poignant or significant place to launch this exhibit than the historic Angel Island Immigration Station. Between 1910 and 1940, hundreds of thousands of immigrants, primarily from Asia, hoping to enter the United States through the San Francisco Bay were detained at Angel Island. After the Chinese Exclusion Act of 1882 prevented Chinese laborers from moving to the United States, immigrants were held on Angel Island for days, months, or in some cases even years, while they awaited permission to enter the country. Unlike New York City’s Ellis Island, which became a monument to welcoming immigrants, Angel Island became a symbol of exclusion. To this day, the walls of the buildings where people awaited rulings on their immigration proceedings bear inscriptions and carved graffiti that show the depths of their uncertainty, alienation, fear—and hope. 

We hope that by juxtaposing the human consequences of historic exclusion with today’s high-tech, digital surveillance under which hopeful immigrants, asylum seekers, and borderlands residents live, we will invite viewers to think about what side of history they want to be on. 

If your institution—be it a museum, library, school, or community center—is interested in hosting the exhibit in the future, please reach out to Senior Policy Analyst Matthew Guariglia at matthew@eff.org.

Programming

In addition to the physical exhibit that you can visit on Angel Island, EFF will host two events to further explore surveillance at the U.S.-Mexico border. On April 3, 2025 from 1-2pm PDT, EFF will be joined by journalists, activists, and researchers who work on both sides of the border for a livestream event titled “Life and Migration Under Surveillance at the U.S.-Mexico Border.”

For people in the Bay Area, EFF will host an in-person event in San Francisco, “Tracking and Documenting Surveillance at the U.S.-Mexico Border,” on April 9 from 6-8pm, hosted by the Internet Archive. Please check our events page for more information and to RSVP.

Matthew Guariglia

Congress Must Reject Transparent Efforts to Undermine the Courts

2 weeks 5 days ago

Earlier this week, the House Judiciary Committee passed H.R. 1526, a bill by Rep. Darrell Issa to prevent courts from issuing nationwide injunctions. This bill could receive a vote on the House floor as early as next week. Senator Josh Hawley recently introduced a similar bill in the Senate. Both bills would prohibit district courts from handing down injunctive relief orders that apply to parties that are not involved in the case. 

EFF opposes both bills. We see this legislation for what it is: a transparent attempt to limit courts' ability to act as an effective check on the Trump administration’s recent flood of illegal orders and actions – some of which EFF itself is challenging. Congress should firmly oppose any effort to prevent the judicial branch from fulfilling its constitutional duty.

Indeed, this is a remedy in search of a problem. There are already well-established tests for injunctive relief: Courts must consider multiple factors, including the strength of the case against the defendant, the potential harms of granting the injunction, what other relief is available, and the public interest. As part of this analysis, courts can and do tailor the relief they grant to what they conclude is necessary to remedy the harm. Nationwide injunctions may be necessary to stop nationwide unlawful conduct. And if an injunction was improperly granted, its target can appeal to have it overturned. 

To be clear, EFF doesn’t agree with every grant of nationwide relief. Courts sometimes get it wrong, often because they misinterpret the law they are asked to apply. If Congress wants to fix that kind of problem, it should draft specific legislation to reform or clarify specific laws. It should not, and cannot, rewrite our constitutional system of checks and balances just because it doesn’t like some of the outcomes.

Corynne McSherry

Online Tracking is Out of Control—Privacy Badger Can Help You Fight Back

2 weeks 6 days ago

Every time you browse the web, you're being tracked. Most websites contain invisible tracking code that allows companies to collect and monetize data about your online activity. Many of those companies are data brokers, who sell your sensitive information to anyone willing to pay. That’s why EFF created Privacy Badger, a free, open-source browser extension used by millions to fight corporate surveillance and take back control of their data. 

Since we first released Privacy Badger in 2014, online tracking has only gotten more invasive and Privacy Badger has evolved to keep up. Whether this is your first time using it or you’ve had it installed since day one, here’s a primer on how Privacy Badger protects you.

Online Tracking Isn't Just Creepy—It’s Dangerous 

The rampant data collection, sharing, and selling fueled by online tracking has serious consequences. Fraudsters purchase data to identify elderly people susceptible to scams. Government agencies and law enforcement purchase people’s location data and web browsing records without a warrant. Data brokers help predatory companies target people in financial distress. And surveillance companies repackage data into government spy tools.

Once your data enters the data broker ecosystem, it’s nearly impossible to know who buys it and what they’re doing with it. Privacy Badger blocks online tracking to prevent your browsing data from being used against you. 

Privacy Badger Disrupts Surveillance Business Models

Online tracking is pervasive because it’s profitable. Tech companies earn enormous profits by targeting ads based on your online activity—a practice called “online behavioral advertising.” In fact, Big Tech giants like Google, Meta, and Amazon are among the top companies tracking you across the web. By automatically blocking their trackers, Privacy Badger makes it harder for Big Tech companies to profit from your personal information.

Online behavioral advertising has made surveillance the business model of the internet. Companies are incentivized to collect as much of our data as possible, then share it widely through ad networks with no oversight. This not only exposes our sensitive information to bad actors, but also fuels government surveillance. Ending surveillance-based advertising is essential for building a safer, more private web. 

While strong federal privacy legislation is the ideal solution—and one that we continue to advocate for—Privacy Badger gives you a way to take action today. 

Privacy Badger fights for a better web by incentivizing companies to respect your privacy. Privacy Badger sends the Global Privacy Control and Do Not Track signals to tell companies not to track you or share your data. If they ignore these signals, Privacy Badger will block them, whether they are advertisers or trackers of other kinds. By withholding your browsing data from advertisers, data brokers, and Big Tech companies, you can help make online surveillance less profitable. 
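
As an illustration of the receiving end, here is a minimal sketch of how a website built with Flask (Python) might honor that signal. The Sec-GPC request header is the one defined by the Global Privacy Control specification; the opt_out_of_sale helper is a hypothetical stand-in for a site’s real opt-out bookkeeping, not Privacy Badger’s own code.

    # Sketch: a Flask site honoring the Global Privacy Control signal.
    from flask import Flask, request

    app = Flask(__name__)

    def opt_out_of_sale(session_id: str) -> None:
        # Hypothetical placeholder for recording a do-not-sell preference.
        print(f"Recorded do-not-sell preference for {session_id}")

    @app.route("/")
    def index():
        if request.headers.get("Sec-GPC") == "1":
            # The visitor's browser is asserting an opt-out of sale/sharing.
            opt_out_of_sale(session_id="anonymous-session")
            return "GPC honored: your data will not be sold or shared."
        return "No GPC signal received."

Browsers and extensions that enable GPC also expose it to page scripts as navigator.globalPrivacyControl, so sites can honor the signal client-side as well.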

How Privacy Badger Protects You From Online Tracking

Whether you're looking to protect your sensitive information from data brokers or simply don’t want Big Tech monetizing your data, Privacy Badger is here to help.

Over the past decade, Privacy Badger has evolved to fight many different methods of online tracking. Here are some of the ways that Privacy Badger protects your data:

  • Blocks Third-Party Trackers and Cookies: Privacy Badger stops tracking code from loading on sites that you visit. That prevents companies from collecting data about your online activity on sites that they don’t own. 
  • Sends the GPC Signal to Opt Out of Data Sharing: Privacy Badger sends the Global Privacy Control (GPC) signal to opt out of websites selling or sharing your personal information. This signal is legally binding in some states, including California, Colorado, and Connecticut. 
  • Stops Social Media Companies From Tracking You Through Embedded Content: Privacy Badger replaces page elements that track you but are potentially useful (like embedded tweets) with click-to-activate placeholders. Social media buttons, comments sections, and video players can send your data to other companies, even if you don’t click on them.
  • Blocks Link Tracking on Google and Facebook: Privacy Badger blocks Google and Facebook’s attempts to follow you whenever you click a link on their websites. Google not only tracks the links you visit from Google Search, but also the links you click on platforms that feel more private, like Google Docs and Gmail.
  • Blocks Invasive “Fingerprinting” Trackers: Privacy Badger blocks trackers that try to identify you based on your browser's unique characteristics, a particularly problematic form of tracking called “fingerprinting.” 
  • Automatically Learns to Block New Trackers: Our Badger Swarm research project continuously discovers new trackers for Privacy Badger to block. Trackers are identified based on their behavior, not just human-curated blocklists (a simplified sketch of this idea follows the list).
  • Disables Harmful Chrome Settings: Privacy Badger automatically disables Google Chrome settings that are bad for your privacy.
  • Easy to Disable on Individual Sites While Maintaining Protections Everywhere Else: If blocking harmful trackers ends up breaking something on a website, you can disable Privacy Badger for that specific site while maintaining privacy protections everywhere else.
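
To make the learning rule concrete, here is a simplified, hypothetical sketch in Python of behavior-based tracker detection in the spirit of Privacy Badger: a third-party domain seen tracking visitors across enough distinct sites gets blocked. Privacy Badger’s real heuristics live in its source code; the threshold, domain names, and data model below are illustrative assumptions, not the extension’s actual implementation.

    # Simplified sketch of behavior-based tracker learning (illustrative only).
    from collections import defaultdict

    TRACKING_THRESHOLD = 3  # distinct first-party sites before blocking (assumed value)

    class TrackerLearner:
        def __init__(self):
            # Maps a third-party domain to the set of first-party sites
            # on which it was observed exhibiting tracking behavior.
            self.seen_on = defaultdict(set)
            self.blocked = set()

        def observe(self, third_party: str, first_party: str) -> None:
            """Record that third_party tracked a visitor on first_party."""
            self.seen_on[third_party].add(first_party)
            if len(self.seen_on[third_party]) >= TRACKING_THRESHOLD:
                self.blocked.add(third_party)

        def should_block(self, third_party: str) -> bool:
            return third_party in self.blocked

    # The same tracker observed on three unrelated sites ends up blocked.
    learner = TrackerLearner()
    for site in ["news.example", "shop.example", "blog.example"]:
        learner.observe("tracker.example", site)
    print(learner.should_block("tracker.example"))  # True

The advantage of learning from behavior rather than from a fixed blocklist is that brand-new trackers are caught as soon as they start acting like trackers, without waiting for a human to list them.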

All of these privacy protections work automatically when you install Privacy Badger—there’s no setup required! And it turns out that when Privacy Badger blocks tracking, you’ll also see fewer ads and your pages will load faster. 

You can always check to see what Privacy Badger has done on the site you’re visiting by clicking on Privacy Badger’s icon in your browser toolbar.

Fight Corporate Surveillance by Spreading the Word About Privacy Badger

Privacy is a team sport. The more people who withhold their data from data brokers and Big Tech companies, the less profitable online surveillance becomes. If you haven’t already, visit privacybadger.org to install Privacy Badger on your web browser. And if you like Privacy Badger, tell your friends about how they can join us in fighting for a better web!

Install Privacy Badger

Lena Cohen

A New Tool to Detect Cellular Spying | EFFector 37.3

3 weeks ago

Take some time during your Spring Break to catch up on the latest digital rights news by subscribing to EFF's EFFector newsletter!

This edition of the newsletter covers our new open source tool to detect cellular spying, Rayhunter; The Foilies 2025, our tongue-in-cheek awards for the worst responses to public records requests; and our recommendations to the NSF for the new AI Action Plan to put people first.

You can read the full newsletter here, and even get future editions directly to your inbox when you subscribe! Additionally, we've got an audio edition of EFFector on the Internet Archive, or you can view it by clicking the button below:

LISTEN ON YouTube

EFFECTOR 37.3 - A NEW TOOL TO DETECT CELLULAR SPYING

Since 1990 EFF has published EFFector to help keep readers on the bleeding edge of their digital rights. We know that the intersection of technology, civil liberties, human rights, and the law can be complicated, so EFFector is a great way to stay on top of things. The newsletter is chock full of links to updates, announcements, blog posts, and other stories to help keep readers—and listeners—up to date on the movement to protect online privacy and free expression. 

Thank you to the supporters around the world who make our work possible! If you're not a member yet, join EFF today to help us fight for a brighter digital future.

Christian Romero

How to Delete Your 23andMe Data

3 weeks 1 day ago

This week, the genetic testing company 23andMe filed for bankruptcy, which means the genetic data the company collected on millions of users is now up for sale. If you do not want your data included in any potential sale, it’s a good time to ask the company to delete it.

When the company first announced it was considering a sale, we highlighted many of the potential issues, including the risk of selling that data to companies with poor security practices or direct links to law enforcement. With this bankruptcy, the concerns we expressed last year remain the same. It is unclear what will happen to your genetic data if 23andMe finds a buyer, and that uncertainty is a clear indication that you should consider deleting your data. California Attorney General Rob Bonta agrees.

First: Download Your Data

Before you delete your account, you may want to download the data for your own uses. If you do so, be sure to store it securely. To download your data:

  1. Log into your 23andMe account and click your username, then click "Settings." 
  2. Scroll down to the bottom where it says "23andMe Data" and click "View."
  3. Here, you'll find the option to download various parts of your 23andMe data. The most important ones to consider are:
    1. The "Reports Summary" includes details like the "Wellness Reports," "Ancestry Reports," and "Traits Reports."
    2. The "Ancestry Composition Raw Data" is the company's interpretation of your raw genetic data.
    3. If you were using the DNA Relatives feature, the "Family Tree Data" includes all the information about your relatives. Based on the descriptions of the data we've seen, this sounds like the data bad actors scraped in the company's 2023 breach.
    4. You can also download the "Raw data," which is the uninterpreted version of your DNA. 

There are other types of data you can download on this page, though much of it will not be of use to you without special software. But there's no harm in downloading it all. 

How to Delete Your Data

Finally, you can delete your data and revoke consent for research. While the deletion page doesn’t make this clear, this request also authorizes the company to destroy your DNA sample, if you hadn’t already asked it to do so. If you want to make that request more explicit, you can also do so in the Account Preferences section.

If you're still on the page to download your data from the steps above, you can skip to step three. Otherwise:

  1. Click your username, then click "Settings." 
  2. Scroll down to the bottom where it says "23andMe Data" and click "View."
  3. Scroll down to the bottom of this page, and click "Permanently Delete Data."
  4. You should get a message stating that 23andMe received the request but you need to confirm by clicking a link sent to your email. 
  5. Head to the email account associated with your 23andMe account and find the email titled "23andMe Delete Account Request." Click the "Permanently Delete All Records" button at the bottom of the email, and you will be taken to a page that says "Your data is being deleted" (you may need to log in again if you were logged out).

23andMe should give every user a real choice to say “no” to a data transfer in this bankruptcy and ensure that any buyer makes real privacy commitments. Other consumer genetic genealogy companies should proactively take these steps as well. Our DNA contains our entire genetic makeup. It can reveal where our ancestors came from, who we are related to, our physical characteristics, and whether we are likely to get genetically determined diseases. Even if you don’t add your own DNA to a private database, a relative could make that choice for you by adding their own.

This incident is an example of why this matters, and how certain features that may seem useful in the moment can be weaponized in novel ways. A bankruptcy should not result in our data getting shuffled off to the highest bidder without our input or a guarantee of real protections.

Thorin Klosowski