When Platforms and the Government Unite, Remember What’s Private and What Isn’t

2 months ago

For years now, there has been concern about the coziness between technology companies and the government. Whether a company complies with casual government requests for data, requires a warrant, or even fights overly broad warrants has been a canary in the digital coal mine during an era when companies may know more about you than your best friends and family. For example, in 2022, law enforcement served a warrant to Facebook for the messages of a 17-year-old girl—messages that were later used as evidence in a criminal trial alleging that the teenager had received an abortion. In 2023, after a four-year wait since announcing its plans, Facebook encrypted its messaging system so that the company no longer had access to the content of those communications.

The privacy of messages and the relationship between companies and the government have real-world consequences. That is why a new era of symbiosis between big tech companies and the U.S. government bodes poorly for our hopes that companies will be critical of requests for data, and for any chance of meaningful tech regulation and consumer privacy legislation. But this chumminess should also come with heightened awareness for users: as companies and the government become more entwined through CEO friendships, bureaucratic entanglements, and ideological harmony, we should all be asking what online data is private and what is sitting on a company's servers, accessible to corporate leadership at the drop of a hat.

For many years, EFF has been urging users to switch to platforms that understand the value of encrypting data. We have also been pushing platforms to make end-to-end encryption the norm, both for online communications and for your stored sensitive data. This type of encryption helps ensure that a conversation stays private between you and the recipient, and is not accessible to the platform that runs it or to any other third parties. Thanks to the combined efforts of our organization and dozens of other concerned groups, tech users, and public officials, we now have many options for applications and platforms that take our privacy more seriously than previous generations did. But in light of recent political developments, it's time for a refresher course: which platforms and applications offer encrypted DMs, and which retain access to your sensitive personal communications?

A platform's claim of “end-to-end encryption” is not foolproof. The encryption may be poorly implemented; the app may lack the widespread adoption needed to attract the attention of security researchers, or the funding to pay for security audits; or it may use a less well-established encryption protocol that has received little public scrutiny. Encryption also can't protect against other sorts of threats, like someone gaining access to your device or screenshotting a conversation. In some cases, merely being caught using certain apps can itself be dangerous. And it takes more than a basic implementation to resist a targeted active attack, as opposed to the later collection of stored data. But end-to-end encryption is still the best way we currently have to keep our digital conversations as private as possible. And more than anything, it needs to be something you and the people you talk to will actually use, so features can be an important consideration.

No platform provides a perfect mix of security features for everyone, but understanding the options can help you start figuring out the right choices. Among popular social media platforms, Facebook Messenger uses end-to-end encryption on private chats by default (the feature is optional in group chats on Messenger, and on some of the company's other offerings, like Instagram). Other companies, like X, offer optional end-to-end encryption with caveats, such as it only being available to users who pay for verification. Then there are platforms like Snapchat, which has given talks about its end-to-end encryption in the past but doesn't provide further details about its current implementation. Still others, like Bluesky, Mastodon, and TikTok, do not offer end-to-end encryption in direct messages, which means those conversations could be accessible to the companies that run the platforms or handed over to law enforcement upon request.

As for apps designed more specifically around chat, there are more examples. Signal offers end-to-end encryption for text messages and voice calls by default, with no extra setup on your part, and collects less metadata than other options. Metadata can reveal information such as who you are talking with and when, or your location, which in some cases may be all law enforcement needs. WhatsApp is also end-to-end encrypted. Apple's Messages app is end-to-end encrypted, but only if everyone in the chat has an iPhone (blue bubbles). The same goes for Google Messages, which is end-to-end encrypted as long as everyone in the conversation has set it up properly (which sometimes happens automatically).

Of course, we have a number of other communication tools at our disposal, like Zoom, Slack, Discord, Telegram, and more. Here, things get even more complicated: end-to-end encryption is sometimes an optional feature, as on Zoom or Telegram; sometimes available only for specific types of communication, like video and voice calls (but not text conversations) on Discord; and sometimes not available at all, as with Slack. Many other options exist with varying feature sets, so it's always worth doing some research if you find something new. This does not mean you need to avoid these tools entirely, but knowing that your chats may be available to the platform, law enforcement, or an administrator is an important thing to consider when choosing what to say and when to say it.

And for high-risk users, the story becomes even more complicated. Even on an encrypted platform, users can be subject to targeted machine-in-the-middle attacks (also known as man-in-the-middle attacks) unless everyone verifies each other's keys. Most encrypted apps will let you do this manually, but some have started to implement automatic key verification, which is a security win. And encryption doesn't matter if message backups are uploaded to the company's servers unencrypted, so it's important either to not back up messages at all, or to carefully set up encrypted backups on platforms that allow it. This is all before getting into the intricacies of how apps handle deleted and disappearing messages, or whether there's a risk of being found with an encrypted app in the first place.
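
To make key verification concrete, here is a minimal Python sketch of the idea behind comparing key fingerprints out of band (roughly what Signal surfaces as “safety numbers”). The key values and helper names below are illustrative assumptions, not any real app's implementation:

```python
# A toy sketch of out-of-band key verification. The keys here are
# placeholder byte strings, not real cryptographic keys.
import hashlib

def fingerprint(public_key: bytes) -> str:
    """Render a key as a short, human-comparable digest."""
    digest = hashlib.sha256(public_key).hexdigest()
    # Group the first 128 bits into 4-character blocks that two
    # people can read aloud to each other.
    return " ".join(digest[i:i + 4] for i in range(0, 32, 4))

# The key Bob's app received for "Alice" from the server.
key_from_server = b"attacker-substituted-public-key"
# The key Alice reads off her own screen in person or on a call.
key_on_alices_device = b"alices-real-public-key"

print("Bob sees:  ", fingerprint(key_from_server))
print("Alice sees:", fingerprint(key_on_alices_device))

if fingerprint(key_from_server) != fingerprint(key_on_alices_device):
    print("Mismatch: a machine-in-the-middle may have swapped keys.")
```

If the two fingerprints match when compared over a channel the attacker doesn't control (in person, or on a call where you recognize the other person's voice), a silent key substitution by the server or a network attacker becomes detectable; automatic key verification builds this comparison into the protocol itself.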

CEOs are not the beginning and end of a company's culture and concerns—but we should take their commitments and signaled priorities seriously. At a time when some companies may be cozying up to the parts of government with the power to surveil and marginalize, moving our data and sensitive communications to different platforms can be an important choice. After all, even if you are not at specific risk of being targeted by the government, withdrawing your participation from a platform sends a clear political message about what you value in a company.

Thorin Klosowski

[Book Guide] Introducing February's Recommended Books, by Taku Hagiyama (writer)

2 months ago
An introduction to noteworthy books selected from the nonfiction genre (listed in order of publication; prices exclude tax). ◆ Isoko Mochizuki, "Gunkaku Kokka" (The Military-Expansion State), Kadokawa Shinsho, published Feb. 10, 900 yen. In a country steering toward military buildup, how will our lives change? With a 43-trillion-yen increase in defense spending over five years and the acquisition of the capability to strike enemy bases, all while the threat posed by neighboring countries is loudly proclaimed, Japan's posture of exclusively defensive defense has undergone a major transformation. The latest report from an author who has long covered defense issues. The author is a reporter in the Tokyo Shimbun's society section. After joining the paper, she covered the Tokyo District Public Prosecutors Office's Special Investigation Department, among other beats. At the Chief Cabinet Secretary's press conferences at the Prime Minister's Office, seeking to bring the truth to light..
JCJ

EFF Sues OPM, DOGE and Musk for Endangering the Privacy of Millions

2 months 1 week ago
Lawsuit Argues Defendants Violated the Privacy Act by Disclosing Sensitive Data

NEW YORK—EFF and a coalition of privacy defenders led by Lex Lumina filed a lawsuit today asking a federal court to stop the U.S. Office of Personnel Management (OPM) from disclosing millions of Americans’ private, sensitive information to Elon Musk and his “Department of Government Efficiency” (DOGE).

The complaint, filed in the U.S. District Court for the Southern District of New York on behalf of two labor unions and individual current and former government workers across the country, also asks that any data disclosed by OPM to DOGE so far be deleted.

The complaint by EFF, Lex Lumina LLP, State Democracy Defenders Fund, and The Chandra Law Firm argues that OPM and OPM Acting Director Charles Ezell illegally disclosed personnel records to Musk’s DOGE in violation of the federal Privacy Act of 1974. Last week, a federal judge temporarily blocked DOGE from accessing a critical Treasury payment system under a similar lawsuit.

This lawsuit’s plaintiffs are the American Federation of Government Employees AFL-CIO; the Association of Administrative Law Judges, International Federation of Professional and Technical Engineers Judicial Council 1 AFL-CIO; Vanessa Barrow, an employee of the Brooklyn Veterans Affairs Medical Center; George Jones, President of AFGE Local 2094 and a former employee of VA New York Harbor Healthcare; Deborah Toussant, a former federal employee; and Does 1-100, representing additional current or former federal workers or contractors.

As the federal government is the nation's largest employer, the records held by OPM represent one of the largest collections of sensitive personal data in the country. In addition to personally identifiable information such as names, Social Security numbers, and demographic data, these records include work information like salaries and union activities; personal health records and information regarding life insurance and health benefits; financial information like death benefit designations and savings programs; nondisclosure agreements; and information concerning family members and other third parties referenced in background checks and health records. OPM holds these records for tens of millions of Americans, including current and former federal workers and those who have applied for federal jobs. OPM has a history of privacy violations—an OPM breach in 2015 exposed the personal information of 22.1 million people—and its recent actions make its systems less secure.

With few exceptions, the Privacy Act limits the disclosure of federally maintained sensitive records on individuals without the consent of the individuals whose data is being shared. It protects all Americans from harms caused by government stockpiling of our personal data. This law was enacted in 1974, the last time Congress acted to limit the data collection and surveillance powers of an out-of-control President.

“The Privacy Act makes it unlawful for OPM Defendants to hand over access to OPM’s millions of personnel records to DOGE Defendants, who lack a lawful and legitimate need for such access,” the complaint says. “No exception to the Privacy Act covers DOGE Defendants’ access to records held by OPM. OPM Defendants’ action granting DOGE Defendants full, continuing, and ongoing access to OPM’s systems and files for an unspecified period means that tens of millions of federal-government employees, retirees, contractors, job applicants, and impacted family members and other third parties have no assurance that their information will receive the protection that federal law affords.” 

For more than 30 years, EFF has been a fierce advocate for digital privacy rights. In that time, EFF has been at the forefront of exposing government surveillance and invasions of privacy—such as forcing the release of hundreds of pages of documents about domestic surveillance under the Patriot Act—and enforcing existing privacy laws to protect ordinary Americans—such as in its ongoing lawsuit against Sacramento's public utility company for sharing customer data with police. 

For the complaint: https://www.eff.org/document/afge-v-opm-complaint

For more about the litigation: https://www.eff.org/deeplinks/2025/02/eff-sues-doge-and-office-personnel-management-halt-ransacking-federal-data

Contacts:
Electronic Frontier Foundation: press@eff.org
Lex Lumina LLP: Managing Partner Rhett Millsaps, rhett@lex-lumina.com

Josh Richman

The TAKE IT DOWN Act: A Flawed Attempt to Protect Victims That Will Lead to Censorship

2 months 1 week ago

Congress has begun debating the TAKE IT DOWN Act (S. 146), a bill that seeks to speed up the removal of a troubling type of online content: non-consensual intimate imagery, or NCII. In recent years, concerns have also grown about the use of digital tools to alter or create such images, sometimes called deepfakes.

While protecting victims of these heinous privacy invasions is a legitimate goal, good intentions alone are not enough to make good policy. As currently drafted, the Act mandates a notice-and-takedown system that threatens free expression, user privacy, and due process, without addressing the problem it claims to solve.

The Bill Will Lead To Overreach and Censorship

S. 146 mandates that websites and other online services remove flagged content within 48 hours and requires “reasonable efforts” to identify and remove known copies. Although this provision is designed to allow NCII victims to remove this harmful content, its broad definitions and lack of safeguards will likely lead people to misuse the notice-and-takedown system to remove lawful speech.

"Take It Down" Has No real Safeguards  

The takedown provision applies to a much broader category of content—potentially any images involving intimate or sexual content—than the narrower NCII definitions found elsewhere in the bill. It also lacks critical safeguards against frivolous or bad-faith takedown requests. Lawful content—including satire, journalism, and political speech—could be wrongly censored. Because the legislation requires apps and websites to remove content within a tight 48-hour window, online service providers, particularly smaller ones, will have to act so quickly to avoid legal risk that they won't be able to verify claims. Instead, automated filters will be used to catch duplicates, but these systems are infamous for flagging legal content, from fair-use commentary to news reporting.
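
To see why duplicate filters over-flag, consider this simplified Python sketch. A deployed system would run a perceptual hash such as PDQ or PhotoDNA over real images; the tiny difference-hash and made-up pixel grids below are stand-ins for illustration only:

```python
# Toy duplicate filter: a difference hash over small grayscale grids.
# Real systems hash full images, but the failure mode is the same.

def dhash(grid):
    """One bit per horizontal neighbor comparison (a toy perceptual hash)."""
    return [1 if left > right else 0
            for row in grid
            for left, right in zip(row, row[1:])]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

flagged  = [[10, 20, 30, 40], [40, 30, 20, 10]]  # reported image
reupload = [[12, 22, 28, 41], [41, 29, 22, 12]]  # brightness-tweaked copy
news_use = [[11, 21, 31, 42], [42, 28, 21, 11]]  # same image inside lawful
                                                 # news coverage

THRESHOLD = 2  # bits of difference tolerated, to catch altered copies

for name, img in [("re-upload", reupload), ("news report", news_use)]:
    distance = hamming(dhash(flagged), dhash(img))
    verdict = "removed" if distance <= THRESHOLD else "kept"
    print(f"{name}: distance={distance} -> {verdict}")
```

Both uploads get removed: the filter sees only pixel similarity, and nothing about consent, newsworthiness, or fair use—exactly the context that a 48-hour deadline leaves no time for a human to check.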

TAKE IT DOWN creates a far broader internet censorship regime than the Digital Millennium Copyright Act (DMCA), which has been widely abused to censor legitimate speech. But at least the DMCA has an anti-abuse provision and protects services from copyright claims should they comply. This bill contains none of those minimal speech protections and essentially greenlights misuse of its takedown regime.

Threats To Encrypted Services

The online services that do the best job of protecting user privacy could also be under threat from Take It Down. While the bill exempts email services, it does not provide clear exemptions for private messaging apps, cloud storage, and other end-to-end encrypted (E2EE) services. Services that use end-to-end encryption, by design, are not able to access or view unencrypted user content.

How could such services comply with the takedown requests mandated in this bill? Platforms may respond by abandoning encryption entirely in order to be able to monitor content—turning private conversations into surveilled spaces.
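
As a minimal sketch of why: an E2EE provider stores only ciphertext, so there is nothing meaningful on its servers to hash or scan against a takedown request. The example below uses Python's cryptography package and a symmetric Fernet key purely for illustration; the message and key are made up:

```python
# Why an E2EE service can't scan for flagged content: it never holds
# the plaintext. Requires the `cryptography` package
# (pip install cryptography).
import hashlib
from cryptography.fernet import Fernet

# In a real E2EE app, this key lives only on the users' devices.
device_key = Fernet.generate_key()
message = b"private photo bytes"  # placeholder for user content

ciphertext = Fernet(device_key).encrypt(message)  # all the server stores

# A server-side takedown filter could only hash random-looking bytes:
print("server sees: ", hashlib.sha256(ciphertext).hexdigest())
print("flagged hash:", hashlib.sha256(message).hexdigest())  # never matches

# To match content, the provider would need the plaintext --
# that is, it would have to stop being end-to-end encrypted.
```

Real messaging protocols use per-conversation asymmetric keys rather than a single symmetric key, but the compliance problem is the same: matching content requires seeing content.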

In fact, victims of NCII often rely on encryption for safety—to communicate with advocates they trust, store evidence, or escape abusive situations. The bill’s failure to protect encrypted communications could harm the very people it claims to help.

Victims Of NCII Have Legal Options Under Existing Law

An array of criminal and civil laws already exists to address NCII. In addition to the specific laws criminalizing the distribution of non-consensual pornography in 48 states, defamation, harassment, and extortion statutes can all be wielded against people abusing NCII. Since 2022, NCII victims have also been able to bring federal civil lawsuits against those who spread this harmful content.

As we explained in 2018:

If a deepfake is used for criminal purposes, then criminal laws will apply. If a deepfake is used to pressure someone to pay money to have it suppressed or destroyed, extortion laws would apply. For any situations in which deepfakes were used to harass, harassment laws apply. There is no need to make new, specific laws about deepfakes in either of these situations.


In many cases, civil claims could also be brought against those distributing the images under causes of action like False Light invasion of privacy. False light claims commonly address photo manipulation, embellishment, and distortion, as well as deceptive uses of non-manipulated photos for illustrative purposes.

A false light plaintiff (such as a person harmed by NCII) must prove that a defendant (such as a person who uploaded NCII) published something that gives a false or misleading impression of the plaintiff in such a way to damage the plaintiff’s reputation or cause them great offense. 

Congress should focus on enforcing and improving these existing protections, rather than opting for a broad takedown regime that is bound to be abused. Private platforms can play a part as well, improving reporting and evidence collection systems. 

Joe Mullin