Beyond Pride Month: Protections for LGBTQ+ People All Year Round

2 weeks 3 days ago

June marked the end of LGBTQ+ Pride Month, yet the risks LGBTQ+ people face persist every month of the year. This year, Pride took place amid anti-LGBTQ+ violence, harassment, and vandalism, and back in May, U.S. officials warned that LGBTQ+ events around the world might be targeted during Pride Month. Unfortunately, that risk is likely to continue for some time. So too will activist actions, community organizing events, and other happenings related to LGBTQ+ liberation.

We know it feels overwhelming to think about how to keep yourself safe, so here are some quick and easy steps you can take to protect yourself at in-person events, as well as to protect your data—everything from your private messages with friends to your pictures and browsing history.

There is no one-size-fits-all security solution, and it’s important to ask yourself questions about the specific risks you face, balancing how likely each one is against its impact if it does occur. In some cases, the privacy risks a technology brings may be worth accepting for the convenience it offers. For example, which matters more to you: that phone towers can identify your cell phone’s device ID, or that your phone is turned on and handy so you can contact others in the event of danger? Carefully thinking through these types of questions is the first step in keeping yourself safe. Here’s an easy guide on how to do just that.

Tips For In-Person Events And Protests


For your devices:

  • Enable full-disk encryption for your device to ensure that the files on it cannot be accessed if it is taken by law enforcement or others (one way to verify this on Android is sketched after this list).
  • Install an encrypted messenger app such as Signal (for iOS or Android) so that only you and your intended recipients can read your communications. Turn on disappearing messages, and consider shortening the amount of time messages are kept in the app while you are actually attending an event. If you carry a burner device instead, be sure to save the numbers for emergency contacts.
  • Remove biometric device unlock options like fingerprint or Face ID to prevent police officers from physically forcing you to unlock your device with your fingerprint or face. You can password-protect your phone instead.
  • Log out of accounts and uninstall apps or disable app notifications to prevent app activity from being used against you in precarious legal contexts, such as the use of gay dating apps in places where homosexuality is illegal.
  • Turn off location services on your devices to keep your location history from being used to identify your device’s comings and goings. For further protection, you can disable GPS, Bluetooth, Wi-Fi, and phone signals when planning to attend a protest.
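
These settings live in your phone’s own menus, but if you want to double-check that full-disk encryption is actually reported as enabled on an Android device, one rough way is to read the system’s encryption properties over adb. This is a minimal sketch, assuming adb is installed, USB debugging is enabled, and the phone is connected; iPhones encrypt their data by default once a passcode is set, so no equivalent check is needed there.

    import subprocess

    def adb_getprop(prop: str) -> str:
        """Read a single Android system property over adb."""
        result = subprocess.run(
            ["adb", "shell", "getprop", prop],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()

    # "encrypted" means storage encryption is on; "file" vs. "block"
    # distinguishes modern file-based encryption from older full-disk schemes.
    print("state:", adb_getprop("ro.crypto.state") or "unknown")
    print("type: ", adb_getprop("ro.crypto.type") or "n/a")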

For you:

  • Wear a mask during a protest; gathering in large crowds increases the risk of law enforcement deploying violent tactics like tear gas, as well as the possibility of being targeted through face recognition technology.
  • Tell friends or family when you plan to attend and leave an event so that they can follow up to make sure you are safe if there are arrests, harassment, or violence. 
  • Cover your tattoos to reduce the possibility of image recognition technologies like facial recognition, iris recognition, and tattoo recognition identifying you.
  • Wear the same clothing as everyone in your group to help hide your identity during the protest and keep from being identified and tracked afterwards. Dressing in dark, monochrome colors will help you blend into a crowd.
  • Say nothing except to assert your rights if you are arrested. Without a warrant, law enforcement cannot compel you to unlock your devices or answer questions, beyond basic identification in some jurisdictions. Refuse consent to a search of your devices, bags, vehicles, or home, and wait until you have a lawyer before speaking.

Given the increase in targeted harassment and vandalism towards LGBTQ+ people, it’s especially important to consider counterprotesters showing up at various events. Since the boundaries between parade and protest might be blurred, you must take precautions. Our general guide for attending a protest covers the basics for protecting your smartphone and laptop, as well as providing guidance on how to communicate and share information responsibly. We also have a handy printable version available here.

LGBTQ+ Pride is about recognition of our differences and claiming honor in our presence in public spaces. Because of this, it’s an odd thing to have to take careful privacy precautions to keep yourself safe during Pride events. Consider it like you would any aspect of bodily autonomy and self-determination—only you get to decide what aspects of yourself you share with others. You get to decide how you present to the world and what things you keep private. With a bit of care, you can maintain privacy, safety, and pride in doing so.

Paige Collings

UN Draft Cybercrime Treaty Dangerously Expands State Surveillance Powers Without Robust Privacy, Data Protection Safeguards

2 weeks 3 days ago

This is the third post in a series highlighting flaws in the proposed UN Cybercrime Convention. Check out Part I, our detailed analysis on the criminalization of security research activities, and Part II, an analysis of the human rights safeguards. 

As we near the final negotiating session for the proposed UN Cybercrime Treaty, countries are running out of time to make critical improvements to the draft text. Delegates meeting in New York from July 29 to August 9 must finalize the convention’s text that, if adopted, will expand surveillance laws dramatically and weaken human rights safeguards significantly. This proposed UN treaty is not a cybercrime treaty; it is an expansive global surveillance pact.

Countries that believe in the rule of law must stand up and either defeat the convention or dramatically limit its scope, adhering to the non-negotiable red lines outlined by over 100 NGOs. In an uncommon alliance, civil society and industry agreed earlier this year in a joint letter that the treaty as then drafted must be rejected unless amended to protect privacy and data protection rights—amendments that have not been made in the latest version of the proposed Convention.

The UN Ad Hoc Committee overseeing the talks and preparation of a final text is expected to consider a revised but still-flawed text in its entirety, along with the interpretative notes, during the first week of the session, with a focus on all provisions not yet agreed ad referendum. However, in keeping with the principle in multilateral negotiations that nothing is agreed until everything is agreed, any provisions of the draft that have already been agreed could potentially be reopened.

The updated draft, dated May 23, 2024, but released on June 14, is far from settled. Tremendous disagreements remain among countries on crucial issues, including the scope of cross-border surveillance powers and the protection of human rights. Nevertheless, some countries expect the latest draft to be adopted.

Earlier drafts included criminalization of a wide range of speech and a number of non-cyber crimes. Just when we thought Member States had succeeded in removing many of the most concerning crimes from the convention’s text, they could be making a reappearance. The Ad Hoc Committee Chair’s proposed General Assembly resolution includes a promise of two additional sessions to negotiate an amendment with more crimes: “a draft protocol supplementary to the Convention, addressing, inter alia, additional criminal offenses.”

Let us be clear: Without robust mandatory data protection and privacy safeguards, the updated draft is bad news for people around the world. It will exacerbate existing disparities in human rights protections, potentially allowing increased government overreach, unchecked surveillance, and access to sensitive data that will leave individuals vulnerable to privacy and data protection violations, human rights abuses, or transnational repression. Critical privacy safeguards continue to be woefully inadequate, and there are no explicit data protection principles in the text itself.

In this third post, we explore problems caused by the expansive definition of “electronic data,” combined with the lack of mandatory privacy and data protection safeguards in the proposed convention. This term has a very broad and vague reach. It appears to include sensitive personal data, like biometric identifiers, which could be accessed by police without adequate protections and under weak privacy safeguards. Worse, it could then be shared with other governments. This poses significant risks for refugees, human rights defenders, and anyone who travels across borders. Instead of this race to the bottom, we call for ironclad privacy and data protection principles in the text to thwart abuses.

Key Surveillance Powers Involving Electronic Data

Chapter IV of the draft, which deals with criminal procedural measures, creates a wide range of government powers to monitor and access people’s digital systems and data, focusing mainly on “subscriber data,” “traffic data,” and “content data.” These powers can be broadly described as forms of communications surveillance or surveillance of communications data. Traditionally, the invasiveness of communications surveillance has been evaluated on the basis of such artificial and formalistic categories.

The revised draft introduces a catch-all category called "Electronic Data" in Article 2(b), defined as "any representation of facts, information, or concepts in a form suitable for processing in an information and communications technology system, including a program suitable to cause an information and communications technology system to perform a function." 

  • Electronic data includes non-communication data, information that hasn’t been communicated to someone else or stored with a service provider. This extremely broad definition includes all forms of digital data, which in other contexts would enjoy specific protections based on the nature or origin of that data.
  • For example, sensitive data such as biometric identifiers should require more stringent processes before being accessed due to the significant risks if collected without proper protections.
  • Additionally, data related to interactions with one’s attorney or doctor is subject to legal privileges. These privileges are designed to ensure that individuals can communicate openly and honestly with their legal and medical professionals without fear that their private information will be exposed or used against them in legal proceedings. However, the current draft does not distinguish between the sensitivity of different types of 'electronic data' or mandate robust privacy or data protection principles accordingly.
  • This definition includes an array of electronic data types that can be processed, stored, or transmitted by ICT systems—ranging from text, images, documents, and biometric identifiers, to software programs and databases. Examples of electronic data in emerging technologies could include training data sets used for machine learning models, including images, text, and structured data; transaction records and smart contracts stored in blockchain networks; sensor data collected from smart devices such as temperature readings, motion detection, and environmental monitoring data; and 3D models, spatial data, and user interaction logs used to create immersive experiences. It also includes sensitive information about people that might not always be interpreted as communication data, such as biometric identifiers and neural data.

    Three investigative powers—preservation orders (Article 25), production orders (Article 27), and search and seizure (Article 28)—relate to this broader category of “electronic data.” When data is stored, regardless of how it would have been classified for communications surveillance purposes, the Articles 25, 27, and 28 powers can be used to target it. That includes stored information that would have been regarded as subscriber, traffic, or content data in a communications surveillance context, as well as information (like on-device metadata or recordings, or a diary created locally but never shared) that could not be a target of communications surveillance at all. In other words, the categories “traffic data,” “subscriber information,” and “content data” apply to communications surveillance, but these categories are not used—and no such distinctions are drawn—in the context of the preservation, production, and search and seizure powers of stored electronic data.

    While there’s consensus that communications content deserves significant protection in law because of its capability to reveal sensitive information, it is now clear that other non-communication data, including much of what falls under “electronic data,” may reveal even more sensitive information about an individual than the content itself, and thus deserves at least an equivalent level of protection. The processing of this very sensitive electronic data, coupled with the absence of mandatory robust data protection principles and robust human rights safeguards in the convention itself, raises significant concerns about overreach, privacy invasion, and the unchecked power it grants to police.

    Today, these types of information might, taken alone or analyzed collectively, reveal a person’s identity, behavior, associations, physical or medical conditions, race, color, sexual orientation, national origins, or viewpoints. Emerging technologies illustrate these risks clearly. For instance, data from wearable health devices can disclose detailed medical conditions and physical activity patterns; smart home devices can track daily routines and behaviors; and social media analytics can infer political views, social connections, and personal preferences based on patterns of interactions, posts, and likes. Other body-worn sensors like those in augmented reality devices may reveal physiological information related to conscious and unconscious emotional reactions to things we see, hear, or do.

    Additionally, geolocation data from smartphones and Internet of Things (IoT) devices can map an individual’s movements over time, potentially identifying their location history, frequented places, and daily commutes, as well as patterns of whom they spent time with. Photo, video surveillance and face recognition data used in public and private spaces can identify individuals and track their interactions, while biometric data from various other sources can confirm identities and provide access to sensitive personal information.

    As a result, all data, including electronic data, should be given the highest protection in the proposed convention to safeguard individual privacy and prevent misuse amid the rise of emerging technologies. But the existing convention text gives individual countries huge discretion in what kind of protection to afford to people’s data when implementing these powers. As elsewhere, we should have mandatory privacy safeguards (not just what domestic law might conclude is “appropriate” under Article 24) providing strong limits and oversight for access to all sorts of sensitive data.

    Finally, the proposed convention’s vaunted “technological neutrality” also means that there is no built-in mechanism for imposing any new safeguards or restrictions on government access to new kinds of sensitive data in the future. If new technologies are more intimately connected with our bodies, brains, and activities than old technologies, or if they mediate more and more of our social or political lives, the proposed convention does not provide any road map to making the data they produce any harder for police to access.
Like Communication Surveillance Powers, Powers Related to "Electronic Data" All Lack Clear and Robust Privacy and Data Protection Safeguards


All three powers referring to “electronic data” share a problem which we’ve previously seen in other powers related to communications surveillance: none of them include clear mandatory privacy and data protection safeguards to limit how the powers are used. All of the investigative powers in Chapter IV of the draft convention rely on national laws to determine whether or not restrictions that govern them are “appropriate,” leaving out numerous international law standards that ought to be made explicit.

For the “electronic data” powers discussed below, this is equally alarming because these powers can potentially authorize law enforcement to obtain literally anything stored in any computer or digital storage medium. There are no kinds of data that are inherently off-limits in the text of the convention itself (such as a rule that requests may not compel self-incrimination, or that they must respect privileges such as attorney-client privilege or doctor-patient privilege), nor even any that necessarily require prior judicial authorization to obtain, leaving such decisions to the discretion of national law.

Domestic Expedited Preservation Orders of Electronic Data
  • Article 25 on preservation orders, already agreed ad referendum, is especially problematic. It’s very broad, will result in individuals’ data being preserved and available for use in prosecutions far more than needed, and fails to include necessary safeguards to avoid abuse of power. By allowing law enforcement to demand preservation with no factual justification, it also risks spreading familiar deficiencies in U.S. law worldwide. Article 25 requires each country to create laws or other measures that let authorities quickly preserve specific electronic data, particularly when there are grounds to believe that such data is at risk of being lost or altered. 
  • Article 25(2) ensures that when preservation orders are issued, the person or entity in possession of the data must keep it for up to 90 days, giving authorities enough time to obtain the data through legal channels, while allowing this period to be renewed. There is no specified limit on the number of times the order can be renewed, so it can potentially be reimposed indefinitely. Preservation orders should be issued only when they’re absolutely necessary, but Article 24 does not mention the principle of necessity, and the draft lacks individual notice requirements, explicit grounds requirements, and statistical transparency obligations. Article 25 also fails to limit the number of times preservation orders may be renewed, permitting indefinite data preservation requirements. Each preservation order renewal must require a demonstration of continued necessity and factual grounds justifying continued preservation.

  • Article 25(3) also compels states to adopt laws that enable gag orders to accompany preservation orders, prohibiting service providers or individuals from informing users that their data was subject to such an order. The duration of such a gag order is left up to domestic legislation. As with all other gag orders, the confidentiality obligation should be subject to time limits and only be available to the extent that disclosure would demonstrably threaten an investigation or other vital interest. Further, individuals whose data was preserved should be notified when it is safe to do so without jeopardizing an investigation. Independent oversight bodies must oversee the application of preservation orders.

Indeed, academics such as prominent law professor and former U.S. Department of Justice lawyer Orin S. Kerr have criticized similar U.S. data preservation practices under 18 U.S.C. § 2703(f) for allowing law enforcement agencies to compel internet service providers to retain all contents of an individual's online account without their knowledge, any preliminary suspicion, or judicial oversight. This approach, intended as a temporary measure to secure data until further legal authorization is obtained, lacks the foundational legal scrutiny typically required for searches and seizures under the Fourth Amendment, such as probable cause or reasonable suspicion.

The lack of explicit mandatory safeguards raises similar concerns about Article 25 of the proposed UN convention. Kerr argues that these U.S. practices constitute a "seizure" under the Fourth Amendment, indicating that such actions should be justified by probable cause or, at the very least, reasonable suspicion—criteria conspicuously absent in the current draft of the UN convention.

By drawing on Kerr's analysis, we see a clear warning: without robust safeguards, including an explicit grounds requirement, prior judicial authorization, explicit notification to users, and transparency, preservation orders of electronic data proposed under the draft UN Cybercrime Convention risk replicating the problematic practices of the U.S. on a global scale.

Production Orders of Electronic Data

Article 27(a)’s treatment of “electronic data” in production orders, in light of the draft convention’s broad definition of the term, is especially problematic. This article, which has already been agreed ad referendum, allows production orders to be issued to custodians of electronic data, requiring them to turn over copies of that data. While demanding customer records from a company is a traditional governmental power, this power is dramatically expanded in the updated draft.

As we explain above, the extremely broad definition of electronic data, which is often sensitive in nature, raises new and significant privacy and data protection concerns, as it permits authorities to access potentially sensitive information without immediate oversight and prior judicial authorization. The convention instead needs to require prior judicial authorization before such information can be demanded from the companies that hold it. This ensures that an impartial authority assesses the necessity and proportionality of the data request before it is executed. Without mandatory data protection safeguards for the processing of personal data, law enforcement agencies might collect and use personal data without adequate restrictions, thereby risking the exposure and misuse of personal information.

The draft convention fails to include these essential data protection safeguards. To protect human rights, data should be processed lawfully, fairly, and in a transparent manner in relation to the data subject. Data should be collected for specified, explicit, and legitimate purposes and not further processed in a manner that is incompatible with those purposes. 

Data collected should be adequate, relevant, and limited to what is necessary to the purposes for which they are processed. Authorities should request only the data that is essential for the investigation. Production orders should clearly state the purpose for which the data is being requested. Data should be kept in a format that permits identification of data subjects for no longer than is necessary for the purposes for which the data is processed. None of these principles are present in Article 27(a) and they must be. 

Search and Seizure of Stored Electronic Data 

The draft’s Article 28, also agreed ad referendum, gives governments sweeping powers to search and seize electronic data; without clear, mandatory privacy and data protection safeguards, it poses a serious threat to privacy and data protection. Article 24 provides some limitations, but they are vague and insufficient, leaving much to the discretion of national laws and subject to what each country deems “appropriate.” This could lead to significant privacy violations and misuse of sensitive personal information.

  • Search or Access: Article 28(1) is a search-and-seizure power that applies to any “electronic data” in an information and communications technology (ICT) system (28(1)(a)) or data storage medium (28(1)(b)). Just as with the prior articles, it doesn’t include specific restrictions on these searches and doesn’t limit what may be targeted, for what purposes, or under what conditions. For example, this could allow authorities to access all files and data on a suspect’s personal computer, mobile device, or cloud storage account.

  • Expanding the Search: Article 28(2) allows authorities to search additional systems if they have grounds to believe the data sought is accessible from the initially searched system. Prior judicial authorization should be a requirement here, so a judge can assess the necessity and proportionality of the search; Article 24, however, only mandates "appropriate" conditions and safeguards, without explicit judicial authorization. In the U.S., for example, this power triggers Fourth Amendment protections, which require particularity—specifying the place to be searched and the items to be seized—in search warrants to prevent unreasonable searches and seizures. Article 28(3) empowers authorities to seize or secure electronic data accessed under the previous provisions, including making and retaining copies of electronic data, maintaining its integrity, and rendering it inaccessible or removing it from the system.

  • Seizure or Securing Data: Article 28(3)(d) specifically allows authorities to “[r]ender inaccessible or remove those electronic data in the accessed information and communications technology system.” For instance, authorities could copy and store all emails and documents from a suspect’s cloud storage service and then delete them from the original source.

Article 28(3)(d) also raises significant free expression and security concerns.

  • First, it seems to allow a court order to permanently destroy the only copy of some data, as there is no requirement to make a backup or to be prepared to restore the data later if there is no court process or the person is not convicted of a crime.
  • Second, with regard to publicly accessible data, this is a form of takedown process that can implicate free expression concerns. Articles 5 and 24 help mitigate these concerns; by applying their safeguards, they aim to ensure that the implementation of Article 28(3)(d) does not infringe on free expression or result in disproportionate actions. However, given the deficiencies in these articles, it remains to be seen how they will apply in practice.

As we have written before, Article 24, on conditions and safeguards, fails to protect human rights by deferring safeguards to national law rather than laying out strong protections to match the increased powers that the proposed convention provides. It fails to explicitly include crucial principles like legality, necessity, and non-discrimination. Effective human rights protections require prior judicial approval before surveillance is conducted, transparency about actions taken, notifying users when their data is accessed if doing so does not jeopardize the investigation, and providing ways for individuals to challenge abuses. By deferring those safeguards to national law, Article 24 weakens these protections, as national laws vary greatly and may not always provide the necessary safeguards.

A safeguard in a treaty that defers to national laws risks inconsistency and abuse. Strong protections in some nations may be undermined by weaker laws in others, ultimately failing to provide the promised protection.

This creates a race to the bottom in human rights standards, where the weakest domestic laws set the global norm, jeopardizing privacy, data protection, and fundamental freedoms that the United Nations treaty aims to uphold. 

International Cooperation and Electronic Data

The draft UN Cybercrime Convention includes significant provisions for international cooperation, extending the reach of domestic surveillance powers across borders, by one state on behalf of another state. Such powers, if not properly safeguarded, pose substantial risks to privacy and data protection. (While this post focuses on the safeguards for electronic data, equally concerning is the treatment of communication data, particularly subscriber data and traffic data, which also lacks robust protections and brings up concerning risks.)

  • Article 42 (1) (“International cooperation for the purpose of expedited preservation of stored electronic data”) allows one state to ask another to obtain preservation of “electronic data” under the domestic power outlined in Article 25. For example, if Country A is investigating a crime and suspects that relevant data is stored on servers in Country B, Country A can request Country B to preserve this data to prevent it from being deleted or altered before Country A can formally request access to it. Country A may use the 24/7 network as outlined in Article 41(3)(c) to seek information about the data’s location and the service provider. 

    The 24/7 network’s role, set out in Articles 41(3)(c) and (d), extends significantly beyond merely preserving stored electronic data. The network is also empowered to collect evidence when provided with legal information and to locate suspects, as well as to provide electronic data to avert emergencies if "permitted by the domestic law and practice of the requested Country." Alarmingly, Article 24, which sets out conditions and safeguards, does not apply to the powers exercised by the 24/7 network. This absence of oversight means that the network can operate without the necessary checks and balances, potentially leading to abuses of power.

    It is important to note that Article 23(4) regarding the scope of application of the domestic criminal procedural measures (Chapter IV) only authorizes the application of Article 24 safeguards to specific powers within the international cooperation chapter (Chapter V). While one could argue that powers in Chapter V closely matching those in Chapter IV should be subject to the same safeguards, significant powers in Chapter V, such as those related to law enforcement cooperation (Article 47) and the 24/7 network (Article 41), do not specifically reference the corresponding Chapter IV powers. Consequently, they may not be covered by Article 24 safeguards. This leaves critical aspects, such as handling electronic data in an emergency or turning over subscriber information and location, without adequate human rights protections. Furthermore, Article 47 on law enforcement cooperation highlights the extensive sharing and exchange of sensitive data, emphasizing the risks of misuse.

  • Article 44 (1) (“Mutual legal assistance in accessing stored electronic data”) allows one state to ask another “to search or similarly access, seize or similarly secure, and disclose electronic data,” presumably using powers similar to those under Article 28, although that article is not referenced in Article 44. This specific provision, which has not yet been agreed ad referendum, enables comprehensive international cooperation in accessing stored electronic data. For instance, if Country A needs to access emails stored in Country B for an ongoing investigation, it can request Country B to search and provide the necessary data.
Ironclad Data Protection Principles Are Essential for the Proposed Convention

The basic powers for domestic surveillance are not new and are relatively straightforward, but the introduction of an international convention granting authorities new access to sensitive data—especially across borders—demands stringent data protection measures. 

  • Data processing must be lawful and fair.
  • Data should be collected only for specified, explicit, and legitimate purposes and not processed further in a way incompatible with those purposes.
  • Data collection must be minimized so that it’s adequate, relevant, and not excessive in relation to the government’s specific stated purposes.
  • Data should be accurate and kept up to date.
  • Data must not be kept longer than absolutely necessary.
  • Data must be protected against unauthorized access and breaches.
  • Individuals should be able to access information about the processing of their own personal data.
  • Individuals should be informed about how their data is being used, the purpose of processing, and their rights.
  • Data controllers must demonstrate compliance with data protection principles, with accountability mechanisms in place to hold them responsible for violations.

Respecting human rights is not only a legal obligation but also a practical necessity for law enforcement. As the Office of the High Commissioner for Human Rights (OHCHR) said in “Human Rights and Law Enforcement: A Trainer’s Guide on Human Rights for the Police,” law enforcement agencies’ effectiveness is improved when they respect human rights. Moreover, as the Vienna Declaration and Programme of Action note, “The administration of justice, including law enforcement (...) agencies, (...) in full conformity with applicable standards contained in international human rights instruments, [is] essential to the full and non-discriminatory realization of human rights and indispensable to the process of democracy and sustainable development.”

Conclusion

The current draft of the UN Cybercrime Convention is fundamentally flawed. It dangerously expands surveillance powers without robust checks and balances, undermines human rights, and poses significant risks to marginalized communities. The broad and vague definitions of "electronic data," coupled with weak privacy and data protection safeguards, exacerbate these concerns.

Traditional domestic surveillance powers are particularly concerning because they underpin international surveillance cooperation. This means that one country can easily comply with the requests of another, which, if not adequately safeguarded, can lead to widespread government overreach and human rights abuses.

Without stringent data protection principles and robust privacy safeguards, these powers can be misused, threatening human rights defenders, immigrants, refugees, and journalists. We urgently call on all countries committed to the rule of law, social justice, and human rights to unite against this dangerous draft. Whether large or small, developed or developing, every nation has a stake in ensuring that privacy and data protection are not sacrificed. 

Significant amendments must be made to ensure these surveillance powers are exercised responsibly and protect privacy and data protection rights. If these essential changes are not made, countries must reject the proposed convention to prevent it from becoming a tool for human rights violations or transnational repression.

Katitza Rodriguez

Hundreds of Tech Companies Want to Cash In on Homeland Security Funding. Here's Who They Are and What They're Selling.

2 weeks 4 days ago

This post was co-written by EFF research intern Andrew Zuker.

Whenever government officials generate fear about the U.S.-Mexico border and immigration, they also generate dollars—hundreds of millions of dollars—for tech conglomerates and start-ups.

The Electronic Frontier Foundation (EFF) today has released the U.S. Border-Homeland Security Technology Dataset, a multilayered dataset of the vendors who supply or market the technology for the U.S. government’s increasingly AI-powered homeland security efforts, including the so-called “virtual wall” of surveillance along the southern border with Mexico.

The four-part dataset includes a hand-curated directory that profiles more than 230 companies that manufacture, market or sell technology products and services, including DNA-testing, ground sensors, and counter-drone systems, to U.S. Department of Homeland Security (DHS) components engaged in border security and immigration enforcement. Vendors on this list are either verified federal contract holders, or have sought to do business with immigration/border authorities or local law enforcement along the border, through activities such as advertising homeland security products on their websites and exhibiting at border security conferences.

It features companies often in the spotlight, including Elbit Systems and Anduril Industries, but also lesser-known contractors, such as surveillance vendors Will-Burt Company and Benchmark. Many companies also supply the U.S. Department of Defense as part of the pipeline from battlefields to the borderlands.

The spreadsheet includes a separate list of 463 companies that have registered for Customs and Border Protection (CBP) and Immigration and Customs Enforcement "Industry Day" events and a roster of 134 members of the DHS-founded Homeland Security Technology Consortium. Researchers will also find a compilation of the annual Top 100 contractors to DHS and its components dating back to 2006.

Download the dataset as an XLSX file through this link or access it as a Google Sheet (Google's Privacy Policy applies).

Border security and surveillance is a rapidly growing industry, fueled by the potential of massive congressional appropriations and accelerated by the promise of artificial intelligence. Of the 233 companies included in our initial survey, two-thirds promoted artificial intelligence, machine learning, or autonomous technology in their public-facing materials.

An HDT Global vehicle at the 2024 Border Security Expo. Source: Dugan Meyer (CC0 1.0 Universal)

Federal spending on homeland security has increased year over year, creating a lucrative market which has attracted investment from big tech and venture capital. Just last month, U.S. Rep. Mark Amodei, Chair of the House Appropriations Homeland Security Subcommittee, defended a funding package that included a "record-level" $300 million in funding for border security technology, including "autonomous surveillance towers; mobile surveillance platforms; counter-tunnel equipment, and a significant investment in counter-drone capability." 

This research project was made possible with internship support from the Heinrich Böll Foundation, in collaboration with EFF and the Reynolds School of Journalism at the University of Nevada, Reno.

Drew Mitnick of the Böll Foundation, who was also involved in building a similar data set of European vendors, says mapping the homeland security technology industry is essential to public debate. "We see the value of the project will be to better inform policymakers about the types of technology deployed, the privacy impact, the companies operating the technology, and the nature of their relationships with the agencies that operate the technology," he said.

Information for this project was aggregated from a number of sources including press releases, business profile databases, vendor websites, social media, flyers and marketing materials, agency websites, defense industry publications, and the work of journalists, advocates, and watchdogs, including the Electronic Frontier Foundation and the student researchers who contribute to EFF’s Atlas of Surveillance. For our vendor profiles, we verified agency spending with each vendor using financial records available online through both the Federal Procurement Data System (FPDS.gov) and USAspending.gov.
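
For readers who want to replicate that verification step, USAspending.gov offers a public API. The sketch below shows one way such a lookup might work in Python; the endpoint exists, but the filter and field names shown are assumptions drawn from memory of the API documentation and should be confirmed at api.usaspending.gov before relying on them.

    import requests

    API = "https://api.usaspending.gov/api/v2/search/spending_by_award/"

    def contract_awards(vendor_keyword, limit=10):
        """Search federal contract awards matching a vendor keyword."""
        payload = {
            "filters": {
                "keywords": [vendor_keyword],
                # Award type codes A-D cover federal contract awards
                "award_type_codes": ["A", "B", "C", "D"],
            },
            "fields": ["Award ID", "Recipient Name",
                       "Awarding Agency", "Award Amount"],
            "limit": limit,
        }
        resp = requests.post(API, json=payload, timeout=30)
        resp.raise_for_status()
        return resp.json().get("results", [])

    for award in contract_awards("Elbit Systems"):
        print(award.get("Recipient Name"), "-", award.get("Award Amount"))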

While many of the companies included have multiple divisions and offer a range of goods and services, this project is focused specifically on vendors who provide and market technology, communications, and IT capabilities for DHS sub-agencies, including CBP, ICE, and Citizenship and Immigration Services (CIS). We have also included companies that sell to other agencies operating at the border, such as the Drug Enforcement Administration and state and local law enforcement agencies engaged in border enforcement.

The data is organized by vendor and includes information on the type of technology or services they offer, the vendor’s participation in specific federal border security initiatives, procurement records, the company's website, parent companies and related subsidiaries, specific surveillance products offered, and which federal agencies they serve. Additional links and supporting documents have been included throughout. We have also provided links to scans of promotional materials distributed at border security conferences.
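
As a concrete illustration, here is a minimal Python sketch for exploring the vendor spreadsheet with pandas once it has been downloaded. The file, sheet, and column names used ("Vendors", "Vendor", "Technology") are hypothetical placeholders, not the dataset's actual schema; inspect the spreadsheet's own headers before filtering.

    # Requires: pip install pandas openpyxl
    import pandas as pd

    # Hypothetical file and sheet names -- check the real workbook first.
    df = pd.read_excel("border-homeland-security-tech.xlsx", sheet_name="Vendors")
    print(df.columns.tolist())  # inspect the actual column headers

    # Example: vendors whose technology description mentions drones
    drones = df[df["Technology"].str.contains("drone", case=False, na=False)]
    print(drones["Vendor"].tolist())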

This dataset serves as a snapshot of the homeland security industry. While we set out to be exhaustive, we discovered the corporate landscape is murky with acquisitions, mergers, holding companies, and sub-sub-contractors that often intentionally obscure the connections between the various enterprises attempting to rake in lucrative government contracts. We hope that by providing a multilayered view, this data will serve as a definitive resource for journalists, academics, advocates of privacy and human rights, and policymakers. 

This work should be the starting point for further investigation—such as Freedom of Information Act requests and political influence analysis—into the companies and agencies rapidly expanding and automating surveillance and immigration enforcement, whether the aim is to challenge a political narrative or to hold authorities and the industry accountable.

If you use this data in your own research or have information that would further enrich the dataset, we'd love to hear from you at aos@eff.org.

Dave Maass

Craig Newmark Philanthropies Matches EFF's Monthly Donors

3 weeks 1 day ago

Craig Newmark Philanthropies will match up to $30,000 for your entire first year as a new monthly or annual EFF Sustaining Donor! Many thanks to Craig Newmark—founder of craigslist and a persistent supporter of digital freedom—for making this possible. This generous matching challenge bolsters celebrations for EFF's 34th anniversary on July 10 as well as EFF's ongoing summer membership drive: be a member for as little as $20 and get rare gifts featuring The Encryptids (including a Bigfoot enamel pin!).

Since its founding in 1990, the Electronic Frontier Foundation has relied on member support to power its public interest legal work, advocacy, and technology development. To wit, more than half of EFF's funding comes from small dollar donors around the world, and EFF's community of monthly and annual Sustaining Donors play a crucial role in keeping the organization running strong. Sustaining Donors giving $10 or less each month raised over $400,000 for EFF last year. Every member and every cent counts. This free donation matching offer from Craig Newmark Philanthropies takes EFF supporters' donations even further at a time when many households are especially conscious of their finances.

Over the past several years, grants from Craig Newmark Philanthropies have focused on supporting trustworthy journalism to defend our democracy and hold the powerful accountable, as well as cybersecurity to protect consumers and journalists alike from malware and other dangers online. With over 30 years of donor support from Newmark, EFF built networks to help defend against disinformation warfare, fought online harassment, strengthened ethical journalism, and researched state-sponsored malware, cyber-mercenaries, and consumer spyware. EFF’s Threat Lab conducts research on surveillance technologies used to target journalists, communities, activists, and individuals. For example, EFF helped co-found, and continues to provide leadership to, the Coalition Against Stalkerware. EFF also created and updated tools to educate and train working and student journalists alike to keep themselves safe from adversarial attacks. In addition to maintaining our popular Surveillance Self Defense guide, EFF scaled up the Report Back tool for student journalists, cybersecurity students, and grassroots volunteers to collaboratively study technology in society.

'Fix Copyright' member t-shirts. Creativity is fun for the whole family.

With this generous matching challenge from Craig Newmark Philanthropies, we are pleased to double the impact of our recurring Sustaining Donors and let all digital rights supporters know that we're in this together. EFF is deeply grateful to its passionate members and everyone who values a brighter future for privacy, security, and free expression.

Become a Sustaining Donor

Double Your Impact on Digital Rights today

Aaron Jue

It’s Time For Lawmakers to Listen to Courts: Your Law Regulating Online Speech Will Harm Internet Users’ Free Speech Rights

3 weeks 2 days ago

Despite a long history of courts ruling that government efforts to regulate speech online harm all internet users and interfere with their First Amendment rights, state and federal lawmakers continue to pass laws that do just that. Three separate rulings issued in the past week show that the results of these latest efforts are as predictable as they are avoidable: courts will strike down these laws.

The question is, why aren’t lawmakers listening? Instead of focusing on passing consumer privacy legislation that attacks the harmful business practices of the most dominant online services, lawmakers are seeking to censor the internet or block young people from it. Instead of passing laws that increase competition and help usher in a new era of online services and interoperability, lawmakers are trying to force social media platforms to host specific viewpoints. 

Recent decisions by the Supreme Court and two federal district courts underscore how these laws, in addition to being unconstitutional, are also bad policy. Whatever the good intentions of lawmakers, laws that censor the internet directly harm people’s ability to speak online, access others’ speech, remain anonymous, and preserve their privacy.

The consistent rulings by these courts should send a clear message to lawmakers considering internet legislation: it’s time to focus on advancing legislation that solves some of the most pressing privacy and competition problems online without censoring users’ speech. Those proposals have the benefit of both being constitutional and addressing many of the harms people—both adults and young people—experience online. Let’s take a look at each of these cases.

Court Puts Mississippi Law on Hold, Highlights KOSA’s Unconstitutionality

A federal court on Monday blocked Mississippi from enforcing its children’s online safety law (House Bill 1126), ruling that it violates the First Amendment rights of adults and young people. The law requires social media services to verify the ages of all users, to obtain parental consent for any minor users, and to block minor users from being exposed to “harmful” material.

EFF filed a friend-of-the-court brief in the case that highlighted the many ways in which the law burdened adults’ ability to access lawful speech online, chilled anonymity online, and threatened their data privacy.

The district court agreed with EFF, ruling that “the Act requires all users (both adults and minors) to verify their ages before creating an account to access a broad range of protected speech on a broad range of covered websites. This burdens adults’ First Amendment rights.”

The court’s ruling also demonstrates the peril Congress faces should it advance the Kids Online Safety Act. Both chambers have drafted slightly different versions of KOSA, but the core provision of both bills would harm internet users—especially young people—by censoring a large swath of protected speech online.

EFF has previously explained in detail why KOSA will block everyone’s ability to access information online in ways that violate the First Amendment. The Mississippi court’s ruling earlier this week confirms that KOSA is unconstitutional, as the law contains similar problematic provisions.

Both Mississippi HB 1126 and KOSA include a provision that imposes liability on social media services that fail to “prevent and mitigate” exposing young people to several categories of content that the measures deem to be harmful. And both bills include a carveout that says a service will not face liability if a young person independently finds the material or searches for it.

The district court ruled that these “monitoring-and-censorship requirements” violated the First Amendment. First, the court found that the provision created restrictions on what content could be accessed online and thus triggered strict review under the First Amendment. Next, the court found that the provision fell far short of complying with the First Amendment because it doesn’t effectively prevent the harms to minors that Mississippi claims justify the law.

In short, if lawmakers believe they have a compelling interest in blocking certain content from minors online, the carveout provisions of KOSA and HB 1126 undercut their claims that such information is inherently harmful to minors. The First Amendment prevents lawmakers from engaging in such half-measures precisely because those proposals chill vast amounts of lawful speech while being inherently ineffective at addressing the harms that animated enacting them in the first place.

Another aspect of the court’s decision putting HB 1126 on hold should also serve as a warning to KOSA’s proponents. The Mississippi court ruled that the state law also ran afoul of the First Amendment because it treated online services differently based on the type of content they host.

HB 1126 broadly regulates social media services that allow users to create and post content and interact with others. But the law exempts other online services that “provide a user with access to news, sports, commerce, online video games or content primarily generated or selected by the digital service provider.”

The district court ruled that HB 1126’s exemption of certain online services based on the content subjected the law to the First Amendment’s strict requirements when lawmakers seek to regulate the content of lawful speech.

“The facial distinction in H.B. 1126 based on the message the digital service provider conveys, or the more subtle content-based restriction based upon the speech’s function or purpose, makes the Act content-based, and therefore subject to strict scrutiny,” the court wrote.

KOSA contains a similar set of carveouts in its definitions. The bill would apply to online services that are likely to be used by minors but exempts news and sports websites and related services. KOSA will thus similarly be subjected to strict review under the First Amendment and, as EFF has said repeatedly, will likely fall far short of meeting the Constitution’s requirements.

Indiana Court Reaffirms That Age-Verification Schemes Violate the First Amendment

An Indiana federal court’s decision to block the state’s age-verification law highlights the consensus that such measures violate the First Amendment because they harm adults’ ability to access protected speech and burden their rights to anonymity and privacy. The decision casts significant doubt on similar bills being contemplated across the country, including in California.

The Indiana law requires an online service in which more than one-third of the content hosted includes adult sexual materials to verify the ages of its users and block young people from that material. The age-verification mandate requires services to obtain government-issued identification from users or to have users submit to invasive methods to verify their age, such as providing personal information or facial recognition.

The court ruled that Indiana’s law was unconstitutional because it placed immense burdens on adults’ rights to access “a significant amount of speech protected by the First Amendment.” In particular, the law would require general-purpose websites that serve a variety of users and host a variety of content to implement age verification for all users if a third of the content featured sexual material.

As a result, users who visited that site but never accessed the sexual content would still have to verify their age. “Indeed, the Act imposes burdens on adults accessing constitutionally protected speech even when the majority of a website contains entirely acceptable, and constitutionally protected, material,” the court wrote.

Conversely, young people who have a First Amendment right to access the majority of non-sexual content on that site would not be able to.

The Indiana court’s decision is in keeping with more than two decades’ worth of rulings by the Supreme Court and lower courts that have found age-verification laws to be unconstitutional. What’s remarkable is that, despite this clearly developed law, states across the country continue to try to pass these laws.

Lawmakers should heed these courts’ consistent message and work on finding other ways to address harms to children online, such as by passing comprehensive data privacy laws, rather than continuing to pass laws that courts will strike down.

Supreme Court Confirms that Laws Targeting Content Moderation Will Face First Amendment Challenges, But Data Privacy and Competition Laws Are Fair Game

The Supreme Court’s ruling this week in a pair of cases challenging states’ online content moderation laws should also serve as a wakeup call to lawmakers. If a state or Congress wants to pass a law that requires or coerces an online service to modify how it treats users’ speech, it will face an uphill battle to survive constitutional scrutiny.

Although EFF plans to publish an in-depth analysis of the decision soon, the court’s decision confirms what EFF has been saying for years: the First Amendment limits lawmakers’ ability to dictate what type of content online services host. And although platforms often make bad or inconsistent content moderation decisions, users are best served when private services—not the government—make those choices.

Importantly, the Supreme Court also confirmed something else EFF has long said: the First Amendment is not a barrier to lawmakers enacting measures that target dominant social media companies’ invasive privacy practices or their anti-competitive behavior.

Comprehensive consumer data privacy laws that protect all internet users are both much needed and can be passed consistent with the First Amendment.

The same is true for competition laws. Lawmakers can pass measures that instill greater competition for users and end the current companies’ dominance. Also, laws could allow for the development and growth of a variety of third-party services that can interoperate with major social media companies and provide options for users that the major companies do not.

The Supreme Court’s decision thus reinforces that lawmakers have many paths to addressing many of the harms occurring online, and that they can do so without violating the First Amendment. EFF hopes that lawmakers will take up this opportunity, and we continue to be ready to help lawmakers pass pro-competition and consumer data privacy laws.

Related Cases: NetChoice Must-Carry Litigation
Aaron Mackey

Careful with your marshmallows 🔥

3 weeks 3 days ago

EFF’s summer membership drive ends next week, running through EFF's 34th anniversary:

The digital future relies on your support—donate today!

We’re back with the final story from our friends, The Encryptids, who have come out of the woodwork to celebrate EFF’s summer membership drive! These creatures may be mysterious, but your digital rights shouldn’t be.

We all know technology can be transformative. In today’s post, the multifaceted and multitalented Wolfgang Von Lycanz illustrates how the internet can amplify the best parts of us...

-Aaron Jue
EFF Membership Team

_______________________________________

You contain multitudes. My boss knows me as Wolfgang, the mild-mannered insurance actuary. This is true. But on the internet, I am a radical environmentalist. I am a passionate moombahton DJ. I am a scholar of medieval Slavic history. I am a celebrated author of Doctor Who fanfiction. I am a friend to people I have never met in person. Online, each geeky part of me blossoms in its own meadow, and I find community with people who understand. This is the internet at its best: a place to be creative, learn without fear, and explore the different aspects of our identities freely.

I recall when Facebook and Google decided that using your government name was the key to civility online. Of course, they ignored what would happen to the dissidents who fight dictators, people evading their abusers, queer teens, and even performers. In fact, it took a gaggle of famous drag queens to show Facebook that the name on your driver’s license isn’t the right one all the time. Sparkle party, indeed! Yet lawmakers did not learn that lesson; to this day, EFF is in Congress and in court to stop laws that would require personal identification to use the web.

Your privacy, your creativity, and the kaleidoscope of curiosity in your heart are the reasons why EFF defends your freedoms. For all the problems with technology, EFF understands that magical transformations still happen when you prioritize human rights and respect for the users.

Join EFF

Digital Rights For All. Human Rights For All.

I believe in EFF's mission to ensure that technology supports freedom, justice, and innovation for all people of the world; its members are the key to getting there. I support EFF with every part of me, and I hope you will donate to support internet freedom for all of us.

For the wild and woolly web,

Wolfgang Von Lycanz
a.k.a. DJ Moombeamz
a.k.a. dalekcious42
a.k.a. dnԀ-oʌosoʞ

_______________________________________

EFF is a member-supported U.S. 501(c)(3) organization celebrating TEN YEARS of top ratings from the nonprofit watchdog Charity Navigator! Your donation is tax-deductible as allowed by law.

 

Wolfgang Von Lycanz

Podcast Episode: Fighting Enshittification

3 weeks 4 days ago

The early internet had a lot of “technological self-determination” — you could opt out of things, protect your privacy, control your experience. The problem was that it took a fair amount of technical skill to exercise that self-determination. But what if it didn’t? What if the benefits of online privacy, security, interoperability, and free speech were more evenly distributed among all internet users?

(Embedded audio player: https://player.simplecast.com/e6c14fe2-a0d1-41e7-bafa-f6cd4202625e. Privacy info: this embed will serve content from simplecast.com.)

   

(You can also find this episode on the Internet Archive and on YouTube.)

This is the future that award-winning author and EFF Special Advisor Cory Doctorow wants us to fight for. His term “enshittification” — a downward spiral in which online platforms trap users and business customers alike, treating them more and more like commodities while providing less and less value — was selected by the American Dialect Society as its 2023 Word of the Year. But, he tells EFF’s Cindy Cohn and Jason Kelley, enshittification analysis also identifies the forces that used to make companies treat us better, helping us find ways to break the cycle and climb toward a better future. 

In this episode you’ll learn about: 

  • Why “intellectual property” is a misnomer, and how the law has been abused to eliminate protections for society 
  • How the tech sector’s consolidation into a single lobbying voice helped bulldoze the measures that used to check companies’ worst impulses 
  • Why recent antitrust actions provide a glimmer of hope that megacompanies can still be forced to do better for users 
  • Why tech workers’ labor rights are important to the fight for a better internet 
  • How legislative and legal losses can still be opportunities for future change 

Cory Doctorow is an award-winning science fiction author, activist, journalist and blogger, and a Special Advisor to EFF. He is the editor of Pluralistic and the author of novels including “The Bezzle” (2024), “The Lost Cause” (2023), “Attack Surface” (2020), and “Walkaway” (2017); young adult novels including “Homeland” (2013) and “Little Brother” (2008); and nonfiction books including “The Internet Con: How to Seize the Means of Computation” (2023) and “How to Destroy Surveillance Capitalism” (2021). He is EFF's former European director and co-founded the UK Open Rights Group. Born in Toronto, Canada, he now lives in Los Angeles. 

What do you think of “How to Fix the Internet?” Share your feedback here.

Transcript

CORY DOCTOROW
So interop, you know, it's the idea that you don't need to buy your washing machine from the same people who sold you your clothes. You can use anyone's washing soap in that washing machine. Your dishes go in, in any dishwasher. Anyone's gas or electricity go into your car, you can bring your luggage onto any airline.
You know, there's just this kind of general presumption that things work together and sometimes that's just a kind of happy accident or a convergence where, you know, the airlines basically all said, okay, if it's bigger than seventy-two centimeters, we're probably gonna charge you an extra fee. And the luggage makers all made their luggage smaller than seventy-two centimeters, or you know, what a carry-on constitutes or whatever. Sometimes it's very formal, right? Sometimes like you go to a standards body and you're like, this is the threading gauge and size of a standard light bulb. And that means that every light bulb that you buy is gonna fit into every light bulb socket.
And you don't have to like read the fine print on the light bulb to find out if you've bought a compatible light bulb. And, sometimes it's adversarial. Sometimes the manufacturer doesn't want you to do it, right? Like, so HP wants you to spend something like $10,000 a gallon on printer ink, and most of us don't want to spend $10,000 a gallon on printer ink, and so out there are some people who figured out how HP printers ask a cartridge, ‘Hey, are you a cartridge that came from HP?’
And they figured out how to get cartridges that aren't made by HP to say, ‘Why yes, I am.’ And you know, it's not like the person buying the cartridge is confused about this. They are specifically like typing into a search engine, ‘How do I avoid paying HP $10,000 a gallon?’
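
[A toy illustration of the handshake Cory describes, in hypothetical Python. This is not HP's actual protocol, which relies on cryptographic authentication chips; it just shows why a compatible cartridge only has to give the answer the printer expects.]

class Printer:
    def accepts(self, cartridge) -> bool:
        # The printer's gatekeeping check: ask the cartridge who made it.
        return cartridge.are_you_genuine()

class GenuineCartridge:
    def are_you_genuine(self) -> bool:
        return True

class CompatibleCartridge:
    # A third-party cartridge that simply gives the answer the printer wants.
    def are_you_genuine(self) -> bool:
        return True  # "Why yes, I am."

printer = Printer()
print(printer.accepts(GenuineCartridge()))     # True
print(printer.accepts(CompatibleCartridge()))  # True: interoperability achieved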

CINDY COHN
That's Cory Doctorow. He's talking about all the places in our lives where, whether we call it that or not, we get to enjoy the power of interoperability.
I'm Cindy Cohn, the executive director of the Electronic Frontier Foundation.

JASON KELLEY
And I'm Jason Kelley, EFF's Activism Director. This is our podcast series How to Fix the Internet.

CINDY COHN
We spend a lot of time here at EFF warning about the things that could go wrong online -- and then of course jumping into the fray when they do go wrong. But on this show we're trying to envision what the world looks like if we start to get things right.

JASON KELLEY
Our guest today is Cory Doctorow. He is one of the world’s leading public thinkers about the digital world, as well as an author and activist. He writes both fiction and nonfiction that has more ideas per page than anyone else we know.

CINDY COHN
We’re lucky enough that he’s been one of our colleagues at EFF for over 20 years and he’s one of my dearest friends. We had Cory on the podcast during our first season. I think he was our very first guest - but we thought it was time to check in again. And that’s not only because he’s so much fun to talk to, but also because the central idea he has championed for addressing the problems of platform monopolies – an idea called interoperability which we also call competitive compatibility – it’s started to get real traction in policy spaces both in the US and in Europe.
I quote Cory a lot on this show, like the idea that we don't want to go back to the good old days. We're trying to create the good new days. So I thought that it was a good place to start. What do the good new days look like in the Coryverse?

CORY DOCTOROW
So the old good internet was characterized by a very high degree of what I call like technological self-determination. Just the right to just decide how the digital tools you use work.
The problem was that it also required a high degree of technical skill. There are exceptions right. I think ad blockers are kind of our canonical exception for, you know, describing what a low-skill, high-impact element of technological self-determination is. Like more than half of all web users now run ad blockers. Doc Searls calls it the largest consumer boycott in human history.
And you don't have to be a brain surgeon or a hacker to install an ad blocker. It's just like a couple of clicks and away you go. And I think that a new good internet is one in which the benefits of technological self-determination, all the things you get beyond an ad blocker, like, you know, I'm speaking to you from a household that's running a Pi-hole, which is like a specialized data appliance that actually blocks ads in other things like smart TVs and apps and whatever.
I have a personal VPN that I run off my home network so that when I'm roaming - I just got back from Germany and they were blocking the port that I used for my mail server, and I could VPN into my house and get my email as though I were connected via my home network - all of those things should just accrue to you with the ease that you get from an ad blocker, because we can harness markets and tinkerers and cooperatives and people who aren't just making a thing to scratch their own itch, but are actually really invested in other people who aren't technically sophisticated being able to avail themselves of these tools too. That's the new good internet.
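
[For readers curious how a Pi-hole-style blocker works: it answers your network's DNS queries and "sinkholes" known ad and tracker domains so they never resolve. A minimal Python sketch of the idea, with a hypothetical blocklist; a real appliance speaks the DNS wire protocol for every device on the network and loads curated lists of millions of domains.]

import socket

# Hypothetical blocklist; real deployments load curated ad/tracker domain lists.
BLOCKLIST = {"ads.example.com", "tracker.example.net"}

def resolve(hostname: str) -> str:
    """Return a sinkhole address for blocked names; resolve everything else normally."""
    if hostname in BLOCKLIST:
        return "0.0.0.0"  # the ad never loads, because the name resolves to nowhere
    return socket.gethostbyname(hostname)

print(resolve("ads.example.com"))  # -> 0.0.0.0
print(resolve("www.eff.org"))      # -> a real IP address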

CINDY COHN
I love that. I mean, you know, what is it? The future is here. It's just not evenly distributed. You just want to evenly distribute the future, and also make it simpler for folks to use.

CORY DOCTOROW
Yeah. You know, the problem of the old good internet was not the part where skilled technical practitioners didn't have to put up with nonsense from companies that didn't have their best interests at heart. Right?
The problem was that not everybody got that. Well, the good future of the internet is one in which we more evenly distribute those benefits. The bad future of the internet we're living in now is the one in which it's harder and harder, even for skilled practitioners, to enjoy those benefits.

CINDY COHN
And harder for the rest of us to get them, right? I hear two things, both as an end user, my world's gonna have a lot more choices, but good choices about things I can do to protect myself and places I can look for that help. And then as somebody who's a hacker or an innovator, you're gonna have a lot easier way to take your good idea, turn it into something and make it actually work, and then let people find it.

CORY DOCTOROW
And I think it's even more than that, right? Because I think that there's also the kind of incentives effect. You know, I'm not the world's biggest fan of markets as the best way to allocate all of our resources and solve all of our problems. But one thing that people who really believe in markets like to remind us of is that incentives matter.
And there is a kind of equilibrium in the product planning meeting where someone is always saying, ‘If we make it this bad, will someone type into a search engine, ‘How do I unrig this game?’ Because once they do that, then all bets are off, right? Think about again, back to ad blockers, right? If, if someone in the boardroom says, Hey, I've calculated that if we make these ads 20% more invasive we’ll increase our revenue per user by 2%.
Someone else who doesn't care about users necessarily, might say, yeah, but we think 20% of users will type ‘How do I block ads’ into a search engine as a result of this. And the expected revenue from that user doesn't just stay static at what we've got now instead of rising by 2%. The expected revenue from that user falls to zero forever.
We'll never make an advertising dime off of that user once they type ‘How do I block ads’ into a search engine. And so it isn't necessary even that the tools defend you. The fact that the tools might defend you changes the equilibrium, changes the incentives, changes the conduct of firms. And where it fails to do that, it then affords you a remedy.
So it's both belt and suspenders. Plan A and plan B.
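
[The expected-value arithmetic behind that boardroom hypothetical, worked through in Python with illustrative numbers:]

# Illustrative numbers only: R is current annual ad revenue per user.
R = 100.0

# Making ads 20% more invasive lifts per-user ad revenue by 2%...
revenue_if_tolerated = 1.02 * R

# ...but if it drives 20% of users to install an ad blocker, their future
# ad revenue falls to zero forever.
expected_revenue = 0.80 * revenue_if_tolerated + 0.20 * 0.0

print(round(expected_revenue, 2))  # 81.6, worse than the 100.0 the firm started with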

JASON KELLEY
It sounds like we're veering happily towards some of the things that you've talked about lately with the term that you coined last year about the current moment in our digital world: Enshittification. I listened to your McLuhan lecture and it brought up a lot of similar points to what you're talking about now. Can you talk about this term? In brief, what does it mean, and, you know, why did the American Dialect Society call it the word of the year?

CORY DOCTOROW
Right. So I mean, the top level version of this is just that tech has these unique, distinctive technical characteristics that allow businesses to harm their stakeholders in ways that are distinct from the ways that other companies can just because like digital has got this flexibility and this fluidity.
And so it sets up this pattern that as the regulation of tech, and as the competition for tech, and as the force that workers provided as a check on tech's worst impulses have all failed, we've got this dynamic where everything we use is a platform, and every platform is decaying in the same way, where they're shifting value first to users, to trap users inside a walled garden, and then bringing in business customers with the promise of funneling value from those users to those business customers, trapping those business customers, and then once everybody is held hostage, using that flexibility of digital tools to take that value away without releasing the users.
So even though the service is getting worse and worse for you, and it's less and less valuable to you, you still find yourself unable to leave. And you are even being actively harmed by it as the company makes it worse and worse.
And eventually it reaches a breaking point. Eventually things are so bad that we leave. But the problem is that that's like a catastrophic ending. That's the ending that, you know, everybody who loved LiveJournal had. Where they loved LiveJournal and the community really mattered to them.
And eventually they all left, but they didn't all end up in the same place. The community was shattered.
They just ended up fragmented and you can still hear people for whom LiveJournal was really important, saying like, I never got that back. I lost something that mattered to me. And so for me, the Enshittification analysis isn't just about like how do we stop companies from being bad, but it's about how we allow people who are trapped by bad companies to escape without having to give up as much as they have to give up now.

CINDY COHN
Right, and that leads right into adversarial interoperability, which is a term that I think was coined by Seth Schoen, EFF’s original staff technologist. It's an idea that you have really thought about a lot, Cory, and developed out. We heard you talk at the beginning of the episode, with that example about HP printers.

CORY DOCTOROW
That adversarial interoperability, it's been in our technology story for as long as we've had digital tools, because digital tools have this flexibility we've alluded to already. You know, the only kind of digital computer we can make is the Turing complete von Neumann machine.
It runs every program that's valid and that means that, you know, whenever a manufacturer has added an anti-feature or done something else abusive to their customers, someone else has been able to unlock it.
You know, when IBM was selling mainframes on the cheap and then charging a lot of money for printers and you know, keyboards and whatever, there were these things called plug compatible peripherals.
So, you know, these companies they call the Seven Dwarfs, Fujitsu and all these other tech companies that we now think of as giants, they were just cloning IBM peripherals. When Apple wanted to find a way for its users to have a really good experience using Microsoft Office, which Microsoft had very steadfastly refused them and had, uh, made just this unbelievably terrible piece of software called, uh, Office for the Mac that just didn't work and had all these compatibility problems, Steve Jobs just had his technologists reverse engineer Office, and they made iWork: Pages, Numbers, and Keynote.
And it can read and write all the files from Excel, PowerPoint and Word. So this has always been in our story and it has always acted as a hedge on the worst impulses of tech companies.
And where it failed to act as a hedge, it created an escape valve for people who are trapped in those bad impulses. And as tech has become more concentrated, which itself is the result of a policy choice not to enforce antitrust law, which allowed companies to gobble each other up, become very, very concentrated.
It became easier for them to speak with one voice in legislative outlets. You know, when Seth coined the term adversarial interoperability, it was about this conspiracy among the giant entertainment companies to make it illegal to build a computer that they hadn't approved of called the Broadcast Flag.
And the reason the entertainment companies were able to foist this conspiracy on the tech industry, which was even then, between one and two orders of magnitude larger than the entertainment companies, is that the entertainment companies were like seven firms and they spoke with one voice and tech was a rabble.
It was hundreds of companies. We were in those meetings for the Broadcast Protection Discussion Group where you saw hundreds of companies at each other's throats, not able to speak with one voice. Today, tech speaks with one voice, and they have taken those self-help measures, that adversarial interoperability, that once checked their worst impulses, and they have removed them from us.
And so we get what Jay Freeman calls felony contempt of business model where, you know, the act of reverse engineering a printer cartridge or an office suite or mobile operating system gives rise to both civil and criminal penalties and that means no one invests in it. People who do it take enormous personal risks. There isn't the kind of support chain.
You definitely don't get that kind of thing where it's like, ‘just click this button to install this thing that makes your experience better.’ To the extent that it even exists, it's like, download this mysterious software from the internet. Maybe compile it yourself, then figure out how to get it onto your device.
No one's selling you a dongle in the checkout line at Walmart for 99 cents that just jailbreaks your phone. Instead, it's like becoming initiated into the Masons or something to figure out how to jailbreak your phone.

CINDY COHN
Yes, we managed to free jailbreaking directly through the exceptions process in the DMCA but it hasn’t ended up really helping many people. We got an exception to one part of the law but the very next section prevents most people from getting any real benefit.

CORY DOCTOROW
At the risk of like teaching granny to suck eggs, we know what the deficiency in the, in the exceptions process is, right? I literally just explained this to a fact checker at the Financial Times who's running my Enshittification speech, who's like you said that it's illegal to jailbreak phones, and yet I've just found this process where they made it legal to jailbreak phones and it's like, yeah, the process makes it legal for you to jailbreak your phone. It doesn't make it legal for anyone to give you a tool to jailbreak your phone or for you to ask anyone how that tool should work or compare notes with someone about how that, so you can like, gnaw your own jailbreaking tool out of a whole log in secret, right? Discover the, discover the defect in iOS yourself.
Figure out how to exploit it yourself. Write an alternative version of iOS yourself. And install it on your phone in the privacy of your own home. And provided you never tell anyone what you've done or how you did it, the law will permit you to do this and not send you to prison.
But give anyone any idea how you're doing it, especially in a commercial context where it's, you know, in the checkout aisle at the Walmart for 99 cents, off to prison with you. Five-hundred-thousand-dollar fine and a five-year prison sentence for a first offense for violating Section 1201 of the DMCA in a commercial way. Right? So, yeah, we have these exceptions, but they're mostly ornamental.

CINDY COHN
Well, I mean, I think that that's the, you know, it's profoundly weird, right? This idea that you can do something yourself, but if you help somebody else do it, that's illegal. It's a very strange thing. Of course, EFF has not liked the Digital Millennium Copyright Act since 1998 when it was passed, or probably 1995 when they started talking about it. But it is a situation in which, you know, we've chipped away at the law, and this is a thing that you've written a lot about. These fights are long fights and we have to figure out how to be in them for the long run and how to claim victory when we get even a small victory. So, you know, maybe this is a situation in which us crowing about some small victories has led people to be misled about the overarching story, which is still one where we've got a lot of work to do.

CORY DOCTOROW
Yeah, and I think that, you know, the way to understand this is as not just the DMCA, but also all the other things that we just colloquially call IP Law that constitute this thing that Jay calls felony contempt of business model. You know, there's this old debate among our tribe that, you know, IP is the wrong term to use. It's not really property. It doesn't crisply articulate a set of policies. Are we talking about trademark and patent and copyright, or do we wanna throw in broadcast rights and database rights and you know, whatever, but I actually think that in a business context, IP means something very, very specific.
When an investor asks a founder, ‘What IP do you have?’ what they mean is: what laws can you invoke that will allow you to exert control over the conduct of your competitors, your critics, and your customers? That's what they mean. And oftentimes, each IP law will have an escape valve, like the DMCA's triennial exemptions. But you can layer one in front of the other, in front of the other, in order to create something where all of the exemptions are plugged.

So, you know, copyright has these exceptions, but then you add trademark, where like Apple is doing things like engraving nearly invisible Apple logos on the components inside of its phones, so that when they're refurbished in the Far East and shipped back as parts for independent repair, they ask the customs agency in the US to seize the parts for tarnishment of their trademark, because the parts are now of an unknown quality and they bear their logo, which means that it will degrade the public's opinion of the reliability of an Apple product.

So, you know, copyright and patent don't stop them from doing this, but we still have this other layer of IP. And if you line the layers up in the right way, and this is what smart corporate lawyers do - they know the right pattern to line these different protections up - such that all of the exceptions that were supposed to provide a public interest, that were supposed to protect us as the users or protect society, each one of those is choked off by another layer.

CINDY COHN
I think that’s one of my biggest frustrations in fixing the internet. We get stuck fighting one fight at a time and just when we pass one roadblock we have to navigate another. In fact, one that we haven’t mentioned yet is contract law, with terms of service and clickwrap license agreements that block innovation and interoperability. It starts to feel more like a game, you know, can our intrepid coder navigate around all the legal hurdles and finally get to the win where they can give us back control over our devices and tools?

CORY DOCTOROW
I mean, this is one of the things that's exciting about the antitrust action that we're getting now, is that I think we're gonna see a lot of companies being bound by obligations whose legitimacy they don't acknowledge and which they are going to flout. And when they do, presuming that the enforcers remain committed to enforcing the law, we are going to have opportunities to say to them, ‘Hey, you're gonna need to enter into a settlement that is gonna restrict your future conduct. You're gonna have to spin off certain things. You're gonna have to allow certain kinds of interop or whatever’.
That we got these spaces opening up. And this is how I think about all of this and it is very game-like, right? We have these turns. We're taking turns, our adversaries are taking turns. And what we want is not just to win ground, but we want to win high ground. We want to win ground from which we have multiple courses of action that are harder to head off. And one of the useful things about the Enshittification analysis is it tries to identify the forces that made companies treat us good. I think sometimes the companies treated us well because the people who ran them were honorable. But also you have to ask how those honorable people resisted their shareholders’ demands to shift value from the firm to their users or the other direction. What forces let them win, you know, in that fight. And if we can identify what forces made companies treat technology users better on the old good internet, then we can try and build up those forces for a new good internet. So, you know, one of the things that I think really helped the old good internet was the paradox of the worker power of the tech worker because tech workers have always been well compensated. They've always had a lot of power to walk out of the job and go across the street and get another job with someone better. Tech Workers had all of this power, which meant that they didn't ever really like form unions. Like tech union density historically has been really low. They haven't had formal power, they've had individual power, and that meant that they typically enjoyed high wages and quite cushy working conditions a lot of the time, right? Like the tech campus with the gourmet chef and the playground and the gym and the sports thing and the bicycles and whatever. But at the same time, this allowed the people they worked for to appeal to a sense of mission among these people. And it was, these were these like non-commercial ethical normative demands on the workforce. And the appeals to those let bosses convince workers to work crazy hours. Right? You know, the extremely hardcore Elon Musk demand that you sleep under your desk, right? This is where it comes from, this sense of mission which meant, for the bosses, that there was this other paradox, which was that if you motivate your workers with a sense of mission, they will feel a sense of mission. And when you say, ‘Hey, this product that you fought really hard for, you have to make worse, right? You've, you know, missed your gallbladder surgery and your mom's funeral and your kid's little league championship to make this product. We want you to stick a bunch of gross ads in it,’ the people who did that job were like, no, I feel a sense of mission. I will quit and walk across the street and get another job somewhere better if this is what you demand of me. One of the constraints that's fallen away is this labor constraint. You know, when Google does a stock buyback and then lays off 12,000 workers within a few months, and the stock buyback would pay their wages for 27 years, like the workers who remain behind get the message that the answer to, no, I refuse to make this product worse is fine, turn in your badge and don't let the door hit you in the ass on the way out. And one of the things we've always had a trade in at EFF is tech workers who really cared about their users. Right? That's been the core of our membership. Those have been the whistleblowers we sometimes hear from. Those have been our clients sometimes. 
And we often say when companies have their users’ backs, then we have the company's back. If we were to decompose that more fully, I think we would often find that the company that has its users' back really has a critical mass of indispensable employees who have their users’ back, that within the balance of power in the company, it's tipping towards users. And so, you know, in this moment of unprecedented union formation, if not union density, this is an area where, you know, you and I, Cindy have written about this, where, where tech rights can be workers' rights, where bossware can cut against labor rights and interoperable tools that defeat bossware can improve workers’ agency within their workplace, which is good for them, but it's good for the people that they feel responsibility for, the users of the internet.

CINDY COHN
Yeah. I remember in the early days when I first joined EFF and Adobe had had the FBI arrest Dmitry Sklyarov at DEF CON because he developed a piece of software that allowed people to copy and move their Adobe eBooks into other formats and platforms. Some of EFF’s leadership went to Adobe’s offices to talk to their leadership and see if we could get them to back off.
I remember being told about the scene because there were a bunch of hackers protesting outside the Adobe building, and they could see Adobe workers watching them from the windows of that building. We knew in that moment that we were winning, that Adobe was gonna back down because their internal conversations were, how come we're the guys who are sending the FBI after a hacker?
We had something similar happen with Apple more recently when Apple announced that it was going to do client-side scanning. We knew from the tech workers that we were in contact with inside the company that breaking end-to-end encryption was something that most of the workers didn't approve of. We actually flew a plane over Apple’s headquarters at One Infinite Loop to draw attention to the issue. Now whether it was the plane or not, it wasn't long before Apple backed down because they felt the pressure from inside, as well as outside.

I think the tech workers are feeling disempowered right now, and it's important to keep telling these stories and reminding them that they do have power, because the first thing that a boss who wants to control you does is make you think you're all alone and you don't have any power.

I appreciate that in the world we’re envisioning, where we start to get tech right, we're not just talking about users and what users get. We're talking about what workers and creators and hackers and innovators get, which is much more control and the ability to say no, or to say yes to something better than the thing that the company has chosen. I'm interested in continuing to try to tell these stories and have these conversations.

JASON KELLEY
Let’s pause for just a moment to say thank you to our sponsor. “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.
And now back to our conversation with Cory Doctorow. Cory is well known for his writing and speaking but what some people may not realize is that he is a capital A Activist. I work with him closely on the activism team here at EFF, and I have seen firsthand how sometimes his eyes go red and he will throw everything he has into a fight. So I wanted to get him to talk a bit more about the activism side of his work, and what fuels that.

CORY DOCTOROW
I tried to escape EFF at one point. I actually was like, God, you know, the writing and the activism, I can't do both. I'm just gonna do one. And so I went off to do the writing for a few years, and I got so pissed off with things going wrong in the world that I wasn't actively engaged in trying to fix that I just lost it. And I was like, I, whatever negative effects accrue due to overwork are far less significant to me, both like intellectually and kind of emotionally, than the negative effects I get from feeling hopeless, right, and helpless and sitting on the sidelines while things that are just untenable, go on. And, you know, Cindy said it before, it's a long game, right? The activism game. We are sowing the seeds of a crop that we may never reap. And I am willing to understand and believe and make my peace with the idea that some of the stuff that I'm doing will be victories after I've left the field, right, it'll be for people who haven't even graduated high school yet, let alone going to work for EFF or one of our allies.
And so when I see red, when I get really angry - when, I don't know, the DRM in browsers at the W3C, or the European Union trying for mandatory copyright filters for online services - I think like this is a fight we may not win, but it's a fight that we must fight, right? The stakes are too high not to win it, and if we lose it this time around, we will lay the groundwork for a future victory. We will create the people who are angry that the policy came out this way, who, when some opportunity opens up in the future, because you know these fights that we fight, the side that we're on is the side of producing something good and stable and beneficial. And the thing that we're fighting against has massive downstream harms, whether that's mandatory copyright filters or client-side scanning or breaking end-to-end encryption, right?

Like if we lose a breaking end-to-end encryption fight, what we have lost is the safety of millions of people in whatever country that rule has been enacted, and that means that in a way that is absolutely deplorable, and that the architects of these policies should be ashamed of, some of those people are gonna come to the most terrible harms in the future. And the thing that we should be doing, because we have lost the fight to stop those harms from occurring, is be ready, when those harms occur, to step in and say, not just we told you so, but here's how we fix it. Here's the thing that we're going to do to turn this crisis into the opportunity to precipitate a change.

JASON KELLEY
Yeah, that's right. Something that has always pleased me is when we have a guest here on the podcast - and we’ve had many - who has talked about the Blue Ribbon Campaign. And it’s clear that, you know, we won that fight, but years and years ago, we put together this coalition of people, maybe unintentionally, that is still with us today. And it is nice to imagine that, with the wins and the losses, we gain bigger numbers as we lay that groundwork.

CINDY COHN
And I think there is something also fun about trying to build the better world, being the good guys. I think there is something powerful about that. The fights are long, they're hard. I always say that, you know, the good guys throw better parties. And so on the one hand it's, yes, it's the anger; your eyes see red, we have to stop this bad thing from happening. But the other thing is that the other people who are joining with you in the fight are really good people to hang out with. And so I guess I, I wanna make sure that we're talking about both sides of a kind of activist life because they're both important. And if it wasn't for the fun part - fun when you win - sometimes a little gallows humor when you don't, that's as important as the anger side because if you're gonna be in it for the long run you can't just run on, you know, red-eyed anger alone.

CORY DOCTOROW
You know, I have this great laptop from this company Framework. I promise you this goes somewhere. It, uh, is a user-serviceable laptop. So it comes with a screwdriver. Even someone who's really klutzy like me can fix their laptop. And, uh, I drop my laptops all the time - and the screws had started coming loose on the bottom, and they were like, ‘Hey, this sounds like a thing that we didn't anticipate when we designed it. Why don't we ship you a free one and you ship us back the broken one, we can analyze it for future ones’. So, I just did this, I swapped out the bottom cover of my laptop at the weekend, which meant that I had a new sticker surface for my laptop. And I found a save .org ‘some things are not for sale’ sticker, which was, you know, this incredible campaign that we ran with our lost and beloved colleague Elliot, and putting that sticker on felt so good. You know, it was just like, yeah, this is, this is like a, this is like a head on a pike for me. This is great.

CINDY COHN
And for those who may not have followed that, just at the beginning of Covid actually, there was an effort by private equity to buy control of the .org domain, which of course means EFF.org, but it means every other nonprofit. And we marshaled a tremendous coalition of nonprofits and others to essentially, you know, make the deal not happen, and save .org for, you know, the .orgs. And as Cory mentioned, our dear friend Elliot, who was our activism director at the time - that was his last campaign before he got sick. And we did, we won. We saved .org. Now that fight continues. Uh, things are not all perfect in .org land, but we did head that one off, and that included a very funky, nerdy protest in front of an ICANN meeting that, uh, that a bunch of people came to.

CORY DOCTOROW
Top-level domains are still a dumpster fire. In other news, water's still wet. You know, the thing about that campaign that was so great is it was one where we didn't have a path to victory. We didn't have a legal leg to stand on. The organization was just like operating in its own kind of bubble where it was fully insulated from, you know, formally, at least on paper, insulated from public opinion, from stakeholder opinions. It just got to do whatever it wanted. And we just like kind of threw everything at it. We tried all kinds of different tactics and cumulatively they worked. And there were weird things that came in at the end, like Xavier Becerra, who was then the Attorney General of California, going like, well, you're kind of, you're a California nonprofit. Like, I think maybe we're gonna wanna look at this.
And then all of a sudden everyone was just like, no, no, no, no, no. But you know, it wasn't like Becerra saved it, right? It was like we built up the political pressure that caused the Attorney General of California who's got a thing or two on his plate, to kind of get up on his hind legs and go, ‘Hey, wait a second. What's going on here?’
And there've been so many fights like that over the years. You know, this is, this is the broadcast treaty at the UN. I remember when we went, our then colleague, Fred von Lohmann, was like, ‘I know how to litigate in the United States 'cause we have like constitutional rights in the United States. The UN is not going to let NGOs set the agenda or sue. You can't force them to give you time.’ You know, it's like you have all the cards stacked against you there, but we killed the broadcast treaty, and we did it by being digitally connected with activists all over the world, which allowed us to exploit the flexibility of digital tools to have a fluid, improvisational style that allowed us at each turn to improvise, in the moment, new tactics that went around the roadblocks that were put in our way. And some of them were surreal, like our handouts were being stolen and hidden in the toilets. Uh, but you know, it was a very weird fight.
And we trounced the most powerful corporations in the world in a forum that was completely stacked against us. And you know, that's the activist joy here too, right? It's like you go into these fights with the odds stacked against you. You never know whether or not there is a lurking potential for a novel tactic that your adversary is completely undefended on, where you can score really big, hard-to-anticipate wins. And I think of this as being related to a theory of change that I often discuss when people ask me about optimism and pessimism.
Because I don't like optimism and pessimism. I think they're both a form of fatalism. That optimism and pessimism are both the idea that the outcome of events are unrelated to human conduct, right? Things will get worse or things will get better. You just sit on the sidelines. It's coming either way. The future is a streetcar on tracks and it's going where it's going.
But I think that hope is this idea that if you're like, trying to get somewhere and you don't know how to get there, you're trying to ascend a gradient towards a better future - if you ascend that gradient to the extent that you can see the path from where you are now, that you can attain a vantage point from which you can see new possible ways of going that were obscured from where you were before, that doing something changes the landscape, changes where you're situated and may reveal something else you can do.

CINDY COHN
Oh, that's such a lovely place to end. Thank you so much, Cory, for taking time to talk with us. We're gonna keep walking that path, and we're gonna keep looking for the little edges and corners and ways, you know, that we can continue to push forward the better internet because we all deserve it.

JASON KELLEY
Thanks, Cory. It's really nice to talk to you.

CORY DOCTOROW
Oh, it was my pleasure.

JASON KELLEY
You know, I get a chance to talk to Cory more often than most people, and I'm still just overjoyed when it gets to happen. What did you think of that conversation, Cindy?

CINDY COHN
What I really liked about it is that he really grounds, you know, what could be otherwise, a kind of wonky thing - adversarial interoperability or competitive compatibility - in a list of very concrete things that have happened in the past and not the far past, the fairly recent past. And so, you know, building a better future really means just bringing some of the tools to bear that we've already brought to bear in other situations, just to our new kind of platform Enshittification world. Um, and I think it makes it feel much more doable than something that might be, you know, a pie in the sky. And then we all go to Mars and everything gets better.

JASON KELLEY
Yeah. You know, he's really good at saying, here's how we can learn from what we actually got right in the past. And that's something people don't often do in this field. It's often trying to learn from what we got wrong. And the part of the conversation that I loved was just hearing him talk about how he got back into doing the work. You know, he said he wanted to do writing or activism, because he was just doing too much, but in reality, he couldn't do just one of the two because he cares so much about what's going on. It reminded me of when we were speaking with Gay Gordon-Byrne about right to repair, and how she had been retired and, after getting pulled back in again and again, just decided to go wholly committed to fighting for the right to repair - you know, that quote from The Godfather about being continually pulled back in. This is Cory and people like him, I think, to a tee.

CINDY COHN
Yeah, I think so too. That reminded me of what, what she said. And of course I was on the other side of it. I was one of the people that Cory was pinging over and over again.

JASON KELLEY
So you pulled him back in.

CINDY COHN
Well, I think he pulled himself back in. I was just standing there. Um, but it is, it is fun to watch somebody feel their passion grow so much that they just have to get back into the fight. And I think Gay really traced that same trajectory of how, you know, sometimes something just bugs you enough that you decide, look, I gotta figure out how to get into this fight and make things better.

JASON KELLEY
And hopefully people listening will have that same feeling. And I know that, you know, many of our supporters do already.
Thanks for joining us for this episode of How to Fix the Internet. If you have any feedback or suggestions, we would be happy to hear from you. Visit eff.org/podcast and click on listener feedback. And while you're there, maybe you could become an EFF member and maybe you could pick up some merch. We've got very good t-shirts. Or you can just peruse to see what's happening in digital rights this week and every week. This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators. In this episode, you heard Xena's Kiss slash Madea's Kiss by M. Wick, Probably Shouldn't by J. Lang featuring Mr. Yesterday, Come Inside by Zepp Herm featuring Snowflake, and Drops of H2O the Filtered Water Treatment by J. Lang featuring Airtone. Our theme music is by Nat Keefe of Beatmower with Reed Mathis. How to Fix the Internet is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. I hope you'll join us again. I'm Jason Kelley.

CINDY COHN
And I’m Cindy Cohn.

Josh Richman

Keep The Momentum Going for The Right to Repair

3 weeks 4 days ago

Thanks to support from local advocates across the country, we’ve been able to have a few strong years for the right to repair. Both California’s and Minnesota’s right to repair laws go into effect today, and we've even made some headway convincing large companies, like Apple, to come out in support of the right to repair.

That’s why EFF is celebrating the right to repair movement on “Repair Independence Day.” And we’re keeping up the momentum with our newest member t-shirt, “Fix Copyright”:

Celebrate creativity as a whole family!

With this shirt design, we have our eyes set on pushing back against companies who threaten individuals' rights to repair the devices and equipment they own—specifically referencing the ongoing battle farmers are facing against John Deere to repair their tractors.

If you can’t fix it, you don’t own it. That’s why we’re working with local and national advocacy groups to get strong right to repair legislation passed across the country, and challenging copyright law to make ownership better for everyone.

Support our work and even grab a Fix Copyright t-shirt for yourself! This time, we even have a variety of kids sizes, so the whole family can celebrate creativity together.

Christian Romero

Victory! Supreme Court Rules Platforms Have First Amendment Right to Decide What Speech to Carry, Free of State Mandates

3 weeks 4 days ago

The Supreme Court today correctly found that social media platforms, like newspapers, bookstores, and art galleries before them, have First Amendment rights to curate and edit the speech of others they deliver to their users, and the government has a very limited role in dictating what social media platforms must and must not publish. Although users remain understandably frustrated with how the large platforms moderate user speech, the best deal for users is when platforms make these decisions instead of the government.  

As we explained in our amicus brief, users are far better off when publishers make editorial decisions free from government mandates. Although the court did not reach a final determination about the Texas and Florida laws, it confirmed that their core provisions are inconsistent with the First Amendment when they force social media sites to publish user posts that are, at best, irrelevant, and, at worst, false, abusive, or harassing. The government’s favored speakers would be granted special access to the platforms, and the government’s disfavored speakers silenced. 

We filed our first brief advocating this position in 2018 and are pleased to see that the Supreme Court has finally agreed. 

Notably, the court emphasizes another point EFF has consistently made: that the First Amendment right to edit and curate user content does not immunize social media platforms and tech companies more broadly from other forms of regulation not related to editorial policy. As the court wrote: “Many possible interests relating to social media can meet that test; nothing said here puts regulation of NetChoice’s members off-limits as to a whole array of subjects.” The court specifically calls out competition law as one avenue to address problems related to market dominance and lack of user choice. Although not mentioned in the court’s opinion, consumer privacy laws are another available regulatory tool.  

We will continue to urge platforms large and small to adopt the Santa Clara Principles as a human rights framework for content moderation. Further, we will continue to advocate for strong consumer data privacy laws to regulate social media companies’ invasive practices, as well as more robust competition laws that could end the major platforms’ dominance.   

EFF has been urging courts to adopt this position for almost six years. We filed our first amicus brief in November 2018: https://www.eff.org/document/prager-university-v-google-eff-amicus-brief  

EFF’s must-carry laws issue page: https://www.eff.org/cases/netchoice-must-carry-litigation 

Press release for our SCOTUS amicus brief: https://www.eff.org/press/releases/landmark-battle-over-free-speech-eff-urges-supreme-court-strike-down-texas-and 

Direct link to our brief: https://www.eff.org/document/eff-brief-moodyvnetchoice

Related Cases: NetChoice Must-Carry Litigation
David Greene

Celebrate Repair Independence Day!

3 weeks 4 days ago

Right-to-repair advocates have spent more than a decade working for a simple goal: to make sure you can fix and tinker with your own stuff. That should be true whether we’re talking about a car, a tractor, a smartphone, a computer, or really anything you buy. Yet product manufacturers have used the growing presence of software on devices to make nonsense arguments about why tinkering with your stuff violates their copyright.

Our years of hard work pushing for consumer rights to repair are paying off in a big way. Case in point: Today—July 1, 2024—two strong repair bills are now law in California and Minnesota. As Repair Association Executive Director Gay Gordon-Byrne said on EFF's podcast about right to repair, after doggedly chasing this goal for years, we caught the car!

Sometimes it's hard to know what to do after a long fight. But it's clear for the repair movement. Now is the time to celebrate! That's why EFF is joining our friends in the right to repair world by celebrating Repair Independence Day.


There are a few ways to do this. You could grab your tools and fix that wonky key on your keyboard. You could take a cracked device to a local repair shop. Or you can read up on what your rights are. If you live in California or Minnesota—or in Colorado or New York, where right to repair laws are already in effect—and want to know what the repair laws in your state mean for you, check out this tip sheet from Repair.org.

And what if you're not in one of those states? We still have good news for you. We're all seeing the fruits of this labor of love, even in states where there aren't specific laws. Companies have heard, time and again, that people want to be able to fix their own stuff. As the movement has gained momentum, device manufacturers have started to offer more repair-friendly programs: Kobo offering parts and guides, Microsoft selling parts for controllers, Google committing to offering spare parts for Pixels for seven years, and Apple offering some self-service repairs.

It's encouraging to see companies respond to our demands for the right to repair, though laws such as those going into effect today make sure they can't roll back their promises. And, of course, the work is not done. Repair advocates have won incredible victories in California and Minnesota (with another good law in Oregon coming online next July). But there are still lots of things you should be able to fix without interference that are not covered by these bills, such as tractors.

We can't let up, especially now that we're winning. But today, it's time to enjoy our hard-won victories. Happy Repair Independence Day!

Hayley Tsukayama

The SFPD’s Intended Purchase of a Robot Dog Triggers Board of Supervisors’ Oversight Obligations

3 weeks 4 days ago

The San Francisco Police Department (SFPD) wants to get a robot quadruped, popularly known as a robot dog. The city’s Board of Supervisors has a regulatory duty to probe into this intended purchase, including potentially blocking it altogether.

The SFPD recently proposed the acquisition of a new robot dog in a report about the department’s existing military arsenal and its proposed future expansion. The particular model that SFPD claims they are exploring, Boston Dynamics’s Spot, is capable of intrusion and surveillance in a manner similar to drones and other unmanned vehicles and is able to hold “payloads” like cameras.

The SFPD’s disclosure came about as a result of a California law, A.B. 481, which requires police departments to make publicly available information about “military equipment,” including weapons and surveillance tools such as drones, firearms, tanks, and robots. Some of this equipment may come through the federal government’s military surplus program.

A.B. 481 also requires a law enforcement agency to seek approval from its local governing body when acquiring, using, or seeking funds for military equipment and submit a military equipment policy. That policy must be made publicly available and must be approved by the governing body of the jurisdiction on a yearly basis. As part of that approval process, the governing body must determine that the policy meets the following criteria:

  • The military equipment is necessary because there is no reasonable alternative that can achieve the same objective of officer and civilian safety
  • The proposed military equipment use policy will safeguard the public’s welfare, safety, civil rights, and civil liberties
  • If purchasing the equipment, the equipment is reasonably cost effective compared to available alternatives that can achieve the same objective of officer and civilian safety
  • Prior military equipment use complied with the military equipment use policy that was in effect at the time, or if prior uses did not comply with the accompanying military equipment use policy, corrective action has been taken to remedy nonconforming uses and ensure future compliance

Based on the oversight requirements imposed by A.B. 481, the San Francisco Board of Supervisors must ask the SFPD some important questions before deciding if the police department actually needs a robot dog: How will the SFPD use this surveillance equipment? Given that the robot dog does not have the utility of one of the department’s bomb disposal robots, why would this robot be useful? What can this robot do that other devices it already has at its disposal cannot do? Does the potential limited use of this device justify its expenditure? How does the SFPD intend to safeguard civil rights and civil liberties in deploying this robot into communities that may already be overpoliced?

If the SFPD cannot make a compelling case for the purchase of a robot quadruped, the Board of Supervisors has a responsibility to block the sale.

A.B. 481 serves as an important tool for democratic control of police’s acquisition of surveillance technology despite recent local efforts to undermine such oversight. In 2019, San Francisco passed a Community Control of Police Surveillance (CCOPS) ordinance, which required city departments like the SFPD to seek Board approval before acquiring or using new surveillance technologies, in a transparent process that offered the opportunity for public comment. This past March, voters scaled back this law by enacting Proposition E, which allows the SFPD a one-year “experimentation” period to test out new surveillance technologies without a use policy or Board approval. However, the state statute still governs military equipment, such as the proposed robot dog, which continues to need Board approval before purchasing and still requires a publicly available policy that takes into consideration the uses of the equipment and the civil liberties impacts on the public.

In 2022, the San Francisco Board of Supervisors banned police deployment of deadly force via remote control robot, so at least we know this robot dog will not be used in that way. It should also be noted that Boston Dynamics has vowed not to arm their robots. But just because this robot dog doesn’t have a bomb strapped to it, doesn’t mean it will prove innocuous to the public, useful to police, or at all helpful to the city. The Board of Supervisors has an opportunity and a responsibility to ensure that any procurement of robots comes with a strong justification from the SFPD, clear policy around how it can be used, and consideration of the impacts on civil rights and civil liberties. Just because narratives about rising crime have gained a foothold does not mean that elected officials get to abdicate any sense of reason or practicality in what technology they allow police departments to buy and use. When it comes to military equipment, the state of California has given cities an oversight tool—and San Francisco should use it. 

Matthew Guariglia

Now The EU Council Should Finally Understand: No One Wants “Chat Control”

3 weeks 4 days ago

The EU Council has now gone through a fourth presidency term without adopting its controversial message-scanning proposal. The just-concluded Belgian Presidency failed to broker a deal that would push forward this regulation, which has been debated in the EU for more than two years. 

For all those who have reached out to sign the “Don’t Scan Me” petition, thank you—your voice is being heard. News reports indicate the sponsors of this flawed proposal withdrew it because they couldn’t get a majority of member states to support it. 

Now, it’s time to stop attempting to compromise encryption in the name of public safety. EFF has opposed this legislation from the start. Today, we’ve published a statement, along with EU civil society groups, explaining why this flawed proposal should be withdrawn.  

The scanning proposal would create “detection orders” that allow for messages, files, and photos from hundreds of millions of users around the world to be compared to government databases of child abuse images. At some points during the debate, EU officials even suggested using AI to scan text conversations and predict who would engage in child abuse. That’s one of the reasons why some opponents have labeled the proposal “chat control.” 
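To make the technical stakes concrete, here is a minimal sketch of what a detection order effectively mandates: code on the user’s own device that fingerprints every outgoing photo and checks it against a government-supplied database before the message is encrypted. This is only an illustration of the general hash-matching approach opponents describe, not the EU’s actual specification; the function and database names are invented, and a real system would use a perceptual hash rather than SHA-256 so that resized or re-encoded copies also match.

    import hashlib

    # Hypothetical government-supplied database of flagged image hashes.
    # Real proposals contemplate perceptual hashes, which also catch
    # near-duplicates; SHA-256 is used here only for brevity.
    FLAGGED_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def may_send(image_bytes: bytes) -> bool:
        """Return True if an attachment may be sent, False if it must be reported.

        Under a detection order, this check runs on the user's device,
        before end-to-end encryption is applied -- which is why opponents
        argue the scheme breaks the guarantees of encrypted messengers.
        """
        return hashlib.sha256(image_bytes).hexdigest() not in FLAGGED_HASHES

    attachment = b"...photo bytes..."
    if may_send(attachment):
        print("encrypt and send")
    else:
        print("block and report")

The architectural point is that the comparison has to happen wherever the plaintext exists, on the user’s device, so the encrypted channel no longer shields content from the scanning authority; and because users cannot inspect the database, any expansion of it silently expands what is scanned.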

There’s scant public support for government file-scanning systems that break encryption. Nor is there support in EU law. People who need secure communications the most—lawyers, journalists, human rights workers, political dissidents, and oppressed minorities—will be the most affected by such invasive systems. Another group harmed would be those whom the EU’s proposal claims to be helping—abused and at-risk children, who need to securely communicate with trusted adults in order to seek help. 

The right to have a private conversation, online or offline, is a bedrock human rights principle. When surveillance is used as an investigation technique, it must be targeted and coupled with strong judicial oversight. In the coming EU council presidency, which will be led by Hungary, leaders should drop this flawed message-scanning proposal and focus on law enforcement strategies that respect peoples’ privacy and security. 


Joe Mullin

Betting on Your Digital Rights: EFF Benefit Poker Tournament at DEF CON 32

4 weeks ago

Hacker Summer Camp is almost here... and with it comes the Third Annual EFF Benefit Poker Tournament at DEF CON 32 hosted by security expert Tarah Wheeler.

Please join us at the same place and time as last year: Friday, August 9th, at high noon at the Horseshoe Poker Room. The fees haven’t changed; it’s still $250 to register plus $100 the day of the tournament with unlimited rebuys.

Tarah Wheeler—EFF board member and resident poker expert—has been working hard on the tournament since last year! Not only has she created a custom EFF playing card deck as a gift for each player, but she also recruited Cory Doctorow to emcee this year. Be sure to register today and see Cory in action!

Did we mention there will be Celebrity Bounties? Knock out Jake “MalwareJake” Williams, Deviant Ollam, or Runa Sandvik and get neat EFF swag plus the respect of your peers! As always, knock out Tarah’s dad, Mike, and she will donate $250 to the EFF in your name!


Find Full Event Details and Registration


[Video: https://www.youtube-nocookie.com/embed/iVvTNC4BUqM] Privacy info: this embed serves content from youtube-nocookie.com.

Anyone who pre-registers and plays will receive a custom EFF playing card deck (if you don’t show up to the tournament by 30 minutes after the start time your deck may be given away).

The winner will receive a treasure chest curated from Tarah’s own collection. The chest is filled with real gems, including emeralds, black pearls, amethysts, diamonds, and more! The winner will also receive our now traditional Jellybean Trophy! 

Have you played some poker before but could use a refresher on rules, strategy, table behavior, and general Vegas slang at the poker table? Tarah will run a poker clinic from 11 am-11:45 am just before the tournament. Even if you know poker pretty well, come a bit early and help out. Just show up and donate anything to EFF. Make it over $50 and Tarah will teach you chip riffling, the three biggest tells, and how to stare blankly and intimidatingly through someone’s soul while they’re trying to decide if you’re bluffing.

Register today and reserve your deck. Be sure to invite your friends to join you!


Melissa Srago

How the FTC Can Make the Internet Safe for Chatbots

4 weeks ago

No points for guessing the subject of the first question the Wall Street Journal asked FTC Chair Lina Khan: of course it was about AI.

Between the hype, the lawmaking, the saber-rattling, the trillion-dollar market caps, and the predictions of impending civilizational collapse, the AI discussion has become as inevitable, as pro forma, and as content-free as asking how someone is or wishing them a nice day.

But Chair Khan didn’t treat the question as an excuse to launch into the policymaker’s verbal equivalent of a compulsory gymnastics exhibition.

Instead, she injected something genuinely new and exciting into the discussion, by proposing that the labor and privacy controversies in AI could be tackled using her existing regulatory authority under Section 5 of the Federal Trade Commission Act (FTCA5).

Section 5 gives the FTC a broad mandate to prevent “unfair methods of competition” and “unfair or deceptive acts or practices.” Chair Khan has made extensive use of these powers during her first term as chair, for example, by banning noncompetes and taking action on online privacy.

At EFF, we share many of the widespread concerns over privacy, fairness, and labor rights raised by AI. We think that copyright law is the wrong tool to address those concerns, both because of what copyright law does and doesn’t permit, and because establishing copyright as the framework for AI model-training will not address the real privacy and labor issues posed by generative AI. We think that privacy problems should be addressed with privacy policy and that labor issues should be addressed with labor policy.

That’s what made Chair Khan’s remarks so exciting to us: in proposing that Section 5 could be used to regulate AI training, Chair Khan is opening the door to addressing these issues head on. The FTC Act gives the FTC the power to craft specific, fit-for-purpose rules and guidance that can protect Americans’ consumer, privacy, labor and other rights.

Take the problem of AI “hallucinations,” which is the industry’s term for the seemingly irrepressible propensity of chatbots to answer questions with incorrect answers, delivered with the blithe confidence of a “bullshitter.”

The question of whether chatbots can be taught not to “hallucinate” is far from settled. Some industry leaders think the problem can never be solved, even as startups publish (technically impressive-sounding, but non-peer reviewed) papers claiming to have solved the problem.

Whether or not the problem can be solved, it’s clear that for the commercial chatbot offerings on the market today, “hallucinations” come with the package. Or, put more simply: today’s chatbots lie, and no one can stop them.

That’s a problem, because companies are already replacing human customer service workers with chatbots that lie to their customers, causing those customers real harm. It’s hard enough to attend your grandmother’s funeral without the added pain of your airline’s chatbot lying to you about the bereavement fare.

Here’s where the FTC’s powers can help the American public:

The FTC should issue guidance declaring that any company that deploys a chatbot that lies to a customer has engaged in an “unfair and deceptive practice” that violates Section 5 of the Federal Trade Commission Act, with all the fines and other penalties that entails.

After all, if a company doesn’t get in trouble when its chatbot lies to a customer, why would they pay extra for a chatbot that has been designed not to lie? And if there’s no reason to pay extra for a chatbot that doesn’t lie, why would anyone invest in solving the “hallucination” problem?

Guidance that promises to punish companies that replace their human workers with lying chatbots will give new companies that invent truthful chatbots an advantage in the marketplace. If you can prove that your chatbot won’t lie to your customers’ users, you can also get an insurance company to write you a policy that will allow you to indemnify your customers against claims arising from your chatbot’s output.

But until someone does figure out how to make a “hallucination”-free chatbot, guidance promising serious consequences for chatbots that deceive users with “hallucinated” lies will push companies to limit the use of chatbots to low-stakes environments, leaving human workers to do their jobs.

The FTC has already started down this path. Earlier this month, FTC Senior Staff Attorney Michael Atleson published an excellent backgrounder laying out some of the agency’s thinking on how companies should present their chatbots to users.

We think that more formal guidance about the consequences for companies that save a buck by putting untrustworthy chatbots on the front line will do a lot to protect the public from irresponsible business decisions – especially if that guidance is backed up with muscular enforcement.

Cory Doctorow

Mississippi Can’t Wall Off Everyone’s Social Media Access to Protect Children

4 weeks ago

In what is becoming a recurring theme, Mississippi became the latest state to pass a law requiring social media services to verify users’ ages and block lawful speech to young people. Once again, EFF explained to the court why the law is unconstitutional.

Mississippi’s law (House Bill 1126) requires social media services to verify the ages of all users, to obtain parental consent for any minor users, and to block minor users from being exposed to “harmful” material. NetChoice, the trade association that represents some of the largest social media services, filed suit and sought to block the law from going into effect in July.

EFF submitted a friend-of-the-court brief in support of NetChoice’s First Amendment challenge to the statute to explain how invasive and chilling online age verification mandates can be. “Such restrictions frustrate everyone’s ability to use one of the most expressive mediums of our time—the vast democratic forums of the internet that we all use to create art, share photos with loved ones, organize for political change, and speak,” the brief argues.

Online age verification laws are fundamentally different and more burdensome than laws requiring adults to show their identification in physical spaces, EFF’s brief argues:

Unlike in-person age-gates, online age restrictions like Mississippi’s require all users to submit, not just momentarily display, data-rich government-issued identification or other proof-of-age, and in some commercially available methods, a photo.

The differences in online age verification create significant burdens on adults’ ability to access lawful speech online. Most troublingly, age verification requirements can completely block millions of U.S. adults who don’t have government-issued identification or lack IDs that would satisfy Mississippi’s verification requirements, such as by not having an up-to-date address or current legal name.

“Certain demographics are also disproportionately burdened when government-issued ID is used in age verification,” EFF’s brief argues. “Black Americans and Hispanic Americans are disproportionately less likely to have current and up-to-date driver’s licenses. And 30% of Black Americans do not have a driver’s license at all.”

Moreover, relying on financial and credit records to verify adults’ identities can also exclude large numbers of adults. As EFF’s brief recounts, some 20 percent of U.S. households do not have a credit card and 35 percent do not own a home.

The data collection required by age-verification systems can also deter people from using social media entirely, either because they want to remain anonymous online or are concerned about the privacy and security of any data they must turn over. HB 1126 thus burdens people’s First Amendment rights to anonymity and their right to privacy.

Regarding HB 1126’s threat to anonymity, EFF’s brief argued:

The threats to anonymity are real and multilayered. All online data is transmitted through a host of intermediaries. This means that when a website shares identifying information with its third-party age-verification vendor, that data is not only transmitted between the website and the vendor, but also between a series of third parties. Under the plain language of HB 1126, those intermediaries are not required to delete users’ identifying data and, unlike the digital service providers themselves, they are also not restricted from sharing, disclosing, or selling that sensitive data.

Regarding data privacy and security, EFF’s brief argued:

The personal data that HB 1126 requires platforms to collect or purchase is extremely sensitive and often immutable. By exposing this information to a vast web of websites and intermediaries, third-party trackers, and data brokers, HB 1126 poses the same concerns to privacy-concerned internet users as it does to the anonymity-minded users.
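To see why this chain of intermediaries matters, consider a schematic of the verification flow the law would set in motion. This is a hypothetical sketch: the vendor URL, field names, and response format below are invented for illustration, but the shape of the data flow follows the brief’s description.

    import requests

    # Hypothetical third-party age-verification endpoint; the URL and
    # field names are invented for illustration.
    VENDOR_URL = "https://verify.vendor.example/api/v1/check-age"

    def verify_user_age(id_image: bytes, selfie: bytes) -> bool:
        """Schematic of a website handing a user's ID to an outside vendor.

        In practice this request typically traverses the website's own
        servers, a CDN or reverse proxy, the vendor's cloud host, and the
        vendor's logging and analytics stack. As the brief notes, HB 1126
        does not require those intermediaries to delete the data or
        refrain from sharing or selling it.
        """
        response = requests.post(
            VENDOR_URL,
            files={"document": id_image, "selfie": selfie},
            timeout=30,
        )
        return bool(response.json().get("age_over_18"))

The single yes-or-no answer that comes back is all the website needed; everything else that crossed the wire is sensitive, often immutable personal data now held, at least transiently, by parties the user never chose.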

Finally, EFF’s brief argues that although HB 1126 contains laudable data privacy protections for children, they cannot be implemented without the state first demanding that every user verify their age so that services can apply those protections to children. As a result, the state cannot enforce those provisions.

EFF’s brief notes, however, that should Mississippi pass “comprehensive data privacy protections, not attached to content-based, speech-infringing, or privacy-undermining schemes,” that law would likely be constitutional.

EFF remains ready to support Mississippi’s effort to protect all its residents’ privacy. HB 1126, however, unfortunately seeks to provide only children with the privacy protections we all desperately need, while at the same time restricting adults’ and children’s access to lawful speech on social media.

Aaron Mackey

Victory! Grand Jury Finds Sacramento Cops Illegally Shared Driver Data

4 weeks 1 day ago

For the past year, EFF has been sounding the alarm about police in California illegally sharing drivers' location data with anti-abortion states, putting abortion seekers and providers at risk of prosecution. We thus applaud the Sacramento County Grand Jury for hearing this call and investigating two police agencies that had been unlawfully sharing this data out-of-state.

The grand jury, a body of 19 residents charged with overseeing local government including law enforcement, released their investigative report on Wednesday. In it, they affirmed that the Sacramento County Sheriff's Office and Sacramento Police Department violated state law and "unreasonably risked" aiding the potential prosecution of "women who traveled to California to seek or receive healthcare services."

In May 2023, EFF, along with the American Civil Liberties Union of Northern California and the American Civil Liberties Union of Southern California, sent letters to 71 California police agencies demanding that they stop sharing automated license plate reader (ALPR) data with law enforcement agencies in other states. This sensitive location information can reveal where individuals work, live, worship, and seek medical care—including reproductive health services. Since the Supreme Court overturned Roe v. Wade with its decision in Dobbs v. Jackson Women’s Health Organization, ALPR data has posed particular risks to those who seek or assist abortions that have been criminalized in their home states.

Since 2016, California law has prohibited sharing ALPR data with out-of-state or federal law enforcement agencies. Despite this, dozens of rogue California police agencies continued sharing this information with other states, even after the state's attorney general issued legal guidance in October "reminding" them to stop.

In Sacramento County, both the Sacramento County Sheriff's Office and the Sacramento Police Department have dismissed calls for them to start obeying the law. Last year, the Sheriff's Office even claimed on Twitter that EFF's concerns were part of "a broader agenda to promote lawlessness and prevent criminals from being held accountable." That agency, at least, seems to have had a change of heart: The Sacramento County Grand Jury reports that, after they began investigating police practices, the Sacramento County Sheriff's Office agreed to stop sharing ALPR data with out-of-state law enforcement agencies.

The Sacramento Police Department, however, has continued to share ALPR data with out-of-state agencies. In their report, the grand jury calls for the department to comply with the California Attorney General's legal guidance. The grand jury also recommends that all Sacramento law enforcement agencies make their ALPR policies available to the public in compliance with the law.

As the grand jury's report notes, EFF and California's ACLU affiliates "were among the first" organizations to call attention to police in the state illegally sharing ALPR data. While we are glad that many police departments have since complied with our demands that they stop this practice, we remain committed to bringing attention and pressure to agencies, like the Sacramento Police Department, that have not. In January, for instance, EFF and the ACLU sent a letter urging the California Attorney General to enforce the state's ALPR laws.

For nearly a decade, EFF has been investigating and raising the alarm about the illegal mass-sharing of ALPR data by California law enforcement agencies. The grand jury's report details just the latest in a series of episodes in which Sacramento agencies violated the law with ALPR. In December 2018, the Sacramento County Department of Human Assistance terminated its program after public pressure resulting from EFF's revelation that the agency was accessing ALPR data in violation of the law. The next year, EFF successfully lobbied the state legislature to order an audit of how four agencies, including the Sacramento County Sheriff's Office, use ALPR. The result was a damning report finding that the sheriff had fallen short of many of the law's basic requirements.

Hudson Hongo

Drone As First Responder Programs Are Swarming Across the United States

4 weeks 1 day ago

Law enforcement wants more drones, and we’ll probably see many more of them overhead as police departments seek to implement a popular project justifying the deployment of unmanned aerial vehicles (UAVs): the “drone as first responder” (DFR).

Police DFR programs involve a fleet of drones, which can range in number from four or five to hundreds. In response to 911 calls and other law enforcement calls for service, a camera-equipped drone is launched from a regular base (like the police station roof) to get to the incident first, giving responding officers a view of the scene before they arrive. In theory and in marketing materials, the advance view from the drone will help officers understand the situation more thoroughly before they get there, better preparing them for the scene and assisting them in things such as locating wanted or missing individuals more quickly. Police call this “situational awareness.”

In practice, law enforcement's desire to get “a view of the scene” becomes a justification for over-surveilling neighborhoods that produce more 911 calls and for collecting information on anyone who happens to be in the drone’s path. For example, a drone responding to a vandalism case may capture video footage of everyone it passes along the way. Also, drones are subject to the same mission-creep issues that already plague other police tools designed to record the public; what is pitched as a solution to violent crime can quickly become a tool for policing homelessness or low-level infractions that otherwise wouldn't merit police resources.

With their bird’s-eye view, drones can observe individuals in previously private and constitutionally protected spaces, like their backyards, roofs, and even through home windows. And they can capture crowds of people, like protestors and other peaceful gatherers exercising their First Amendment rights. Drones can be equipped with cameras, thermal imaging, microphones, license plate readers, face recognition, mapping technology, cell-site simulators, weapons, and other payloads. Proliferation of these devices enables state surveillance even for routine operations and in response to innocuous calls—situations far removed from the terrorism and violent crime concerns originally used to justify their adoption.

Drones are also increasingly tied into other forms of surveillance. More departments—including those in Las Vegas, Louisville, and New York City—are toying with the idea of dispatching drones in response to ShotSpotter gunshot detection alerts, which are known to generate many false positives. This could lead to drone surveillance of communities that happen to have a higher concentration of ShotSpotter microphones or other acoustic gunshot detection technology. Recently revealed data shows that a disproportionate number of these gunshot detection sensors are located in Black communities in the United States. Artificial intelligence is also being added to drone data collection; connecting what's gathered from the sky to what has been gathered on the street and through other methods is a trending part of the police panopticon plan.


A CVPD official explains the DFR program to EFF staff in 2022. Credit: Jason Kelley (EFF)

DFR programs have been growing in popularity since first launched by the Chula Vista Police Department in 2018. Now there are a few dozen departments with known DFR programs among the approximately 1,500 police departments known to have any drone program at all, according to EFF’s Atlas of Surveillance, the most comprehensive dataset of this kind of information. The Federal Aviation Administration (FAA) regulates use of drones and is currently mandated to prepare new regulations for how they can be operated beyond the operator’s line of sight (BVLOS), the kind of long-distance flight that currently requires a special waiver. All the while, police departments and the companies that sell drones are eager to move forward with more DFR initiatives.

Known DFR programs include:

  • Arapahoe County Sheriff's Office (CO)
  • Beverly Hills Police Department (CA)
  • Brookhaven Police Department (GA)
  • Burbank Police Department (CA)
  • Chula Vista Police Department (CA)
  • Clovis Police Department (CA)
  • Commerce City Police Department (CO)
  • Daytona Beach Police Department (FL)
  • Denver Police Department (CO)
  • Elk Grove Police Department (CA)
  • Flagler County Sheriff's Office (FL)
  • Fort Wayne Police Department (IN)
  • Fremont Police Department (CA)
  • Gresham Police Department (OR)
  • Hawthorne Police Department (CA)
  • Hemet Police Department (CA)
  • Irvine Police Department (CA)
  • Montgomery County Police Department (MD)
  • New York City Police Department (NY)
  • Oklahoma City Police Department (OK)
  • Oswego Police Department (NY)
  • Redondo Beach (CA)
  • Santa Monica Police Department (CA)
  • West Palm Beach Police Department (FL)
  • Yonkers Police Department (NY)
  • Schenectady Police Department (NY)
  • Queen Creek Police Department (AZ)
  • Greenwood Village Police Department (CO)

Transparency around the acquisition and use of drones will be important to the effort to protect civilians from government and police overreach and abuse as agencies commission more of these flying machines. A recent Wired investigation raised concerns about Chula Vista’s program, finding that roughly one in 10 drone flights lacked a stated purpose, and for nearly 500 of its recent flights, the reason for deployment was an “unknown problem.” That same investigation also found that each average drone flight exposes nearly 5,000 city residents to enhanced surveillance, primarily in predominantly Black and brown neighborhoods.

“For residents we spoke to,” Wired wrote, “the discrepancy raises serious concerns about the accuracy and reliability of the department's transparency efforts—and experts say the use of the drones is a classic case of self-perpetuating mission creep, with their existence both justifying and necessitating their use.”

Chula Vista's "Drone-Related Activity Dashboard" indicates that more than 20 percent of drone flights are welfare checks or mental health crises, while only roughly 6 percent respond to assault calls. Chula Vista Police claim that the DFR program lets them avoid potentially dangerous or deadly interactions with members of the public, with drone responses allowing the department to avoid sending a patrol unit in response to 4,303 calls. However, this theory and the supporting data need to be meaningfully evaluated by independent researchers.

This type of analysis is not possible without transparency around the program in Chula Vista, which, to its credit, publishes regular details like the location and reason for each of its deployments. Still, that department has also tried to prevent the public from learning about its program, rejecting California Public Records Act (CPRA) requests for drone footage. This led to a lawsuit in which EFF submitted an amicus brief, and ultimately the California Court of Appeal correctly found that drone footage is not exempt from CPRA requests.

While some might take for granted that the government is not allowed to conduct surveillance — intentional, incidental, or otherwise — on you in spaces like your fenced-in backyard, this is not always the case. It took a lawsuit and a recent Alaska Supreme Court decision to ensure that police in that state must obtain a warrant for drone surveillance in otherwise private areas. While some states do require a warrant to use a drone to violate the privacy of a person’s airspace, Alaska, California, Hawaii, and Vermont are currently the only states where courts have held that warrantless aerial surveillance violates residents’ constitutional protections against unreasonable search and seizure absent specific exceptions.

Clear policies around the use of drones are a valuable part of holding police departments accountable for their drone use. These policies must include rules around why a drone is deployed and guardrails on the kind of footage that is collected, the length of time it is retained, and with whom it can be shared.

A few state legislatures have taken some steps toward providing some public accountability over growing drone use.

  • In Minnesota, law enforcement agencies are required to annually report their drone programs' costs and the number of times they deployed drones, including how many deployments occurred without a warrant.
  • In Illinois, the Drones as First Responders Act went into effect in June 2023, requiring agencies to report whether they own drones; how many they own; the number of times the drones were deployed, along with the date, location, and reason for each deployment; and whether video was captured and retained from each deployment. Illinois agencies also must share a copy of their latest use policies, drone footage is generally supposed to be deleted after 24 hours, and the use of face recognition technology is prohibited except in certain circumstances.
  • In California, AB 481 — which took effect in May 2022 with the aim of providing public oversight over military-grade police equipment — requires police departments to publicly share a regular inventory of the drones that they use. Under this law, police acquisition of drones and the policies governing their use require approval from local elected officials following an opportunity for public comment, giving communities an important chance to provide feedback.

DFR programs are just one way police are acquiring drones, but law enforcement and UAV manufacturers are interested in adding drones in other ways, including as part of regular patrols and in response to high-speed vehicle pursuits. These uses also create the risk of law enforcement bypassing important safeguards.  Reasonable protections for public privacy, like robust use policies, are not a barrier to public safety but a crucial part of ensuring just and constitutional policing.

Companies are eager to tap this growing market. Police technology company Axon—known for its Tasers and body-worn cameras—recently acquired drone company Dedrone, specifically citing that company’s efforts to push DFR programs as one reason for the acquisition. Axon has since established a partnership with Skydio in order to expand its DFR sales.

It’s clear that as the skies open up for more drone usage, law enforcement will push to procure more of these flying surveillance tools. But police and lawmakers must exercise far more skepticism over what may ultimately prove to be a flashy trend that wastes resources, infringes on people's rights, and results in unforeseen shifts in policing strategy. The public must be kept aware of how cops are coming for their privacy from above.

Beryl Lipton

Government Has Extremely Heavy Burden to Justify TikTok Ban, EFF Tells Appeals Court

4 weeks 1 day ago
New Law Subject to Strictest Scrutiny Because It Imposes Prior Restraint, Directly Restricts Free Speech, and Singles Out One Platform for Prohibition, Brief Argues

SAN FRANCISCO — The federal ban on TikTok must be put under the finest judicial microscope to determine its constitutionality, the Electronic Frontier Foundation (EFF) and others argued in a friend-of-the-court brief filed Wednesday in the U.S. Court of Appeals for the D.C. Circuit. 

The amicus brief says the Court must review the Protecting Americans from Foreign Adversary Controlled Applications Act — passed by Congress and signed by President Biden in April — with the most demanding legal scrutiny because it imposes a prior restraint that would make it impossible for users to speak, access information, and associate through TikTok. It also directly restricts protected speech and association, and deliberately singles out a particular medium for a blanket prohibition. This demanding First Amendment test must be used even when the government asserts national security concerns. 

The Court should see this law for what it is: “a sweeping ban on free expression that triggers the most exacting scrutiny under the First Amendment,” the brief argues, adding it will be extremely difficult for the government to justify this total ban. 

Joining EFF in this amicus brief are the Freedom of the Press Foundation, TechFreedom, Media Law Resource Center, Center for Democracy and Technology, First Amendment Coalition, and Freedom to Read Foundation. 

TikTok hosts a wide universe of expressive content from musical performances and comedy to politics and current events, the brief notes, and with more than 150 million users in the United States and 1.6 billion users worldwide, the platform hosts enormous national and international communities that most U.S. users cannot readily reach elsewhere. It plays an especially important and outsized role for minority communities seeking to foster solidarity online and to highlight issues vital to them. 

“The First Amendment protects not only TikTok’s US users, but TikTok itself, which posts its own content and makes editorial decisions about what user content to carry and how to curate it for each individual user,” the brief argues.  

Congress’s content-based justifications for the ban make it clear that the government is targeting TikTok because it finds speech that Americans receive from it to be harmful, and simply invoking national security without clearly demonstrating a threat doesn’t overcome the ban’s unconstitutionality, the brief argues. 

“Millions of Americans use TikTok every day to share and receive ideas, information, opinions, and entertainment from other users around the world, and that’s squarely within the protections of the First Amendment,” EFF Civil Liberties Director David Greene said. “By barring all speech on the platform before it can happen, the law effects the kind of prior restraint that the Supreme Court has rejected for the past century as unconstitutional in all but the rarest cases.” 

For the brief: https://www.eff.org/document/06-26-2024-eff-et-al-amicus-brief-tiktok-v-garland

For EFF’s stance on TikTok bans: https://www.eff.org/deeplinks/2023/03/government-hasnt-justified-tiktok-ban 

Contact: David Greene, Civil Liberties Director, davidg@eff.org
Josh Richman

The Global Suppression of Online LGBTQ+ Speech Continues

1 month ago

A global increase in anti-LGBTQ+ intolerance is having a significant impact on digital rights. As we wrote last year, censorship of LGBTQ+ websites and online content is on the rise. For many LGBTQ+ individuals the world over, the internet can be a safer space for exploring identity, finding community, and seeking support. But from anti-LGBTQ+ bills restricting free expression and privacy to content moderation decisions that disproportionately impact LGBTQ+ users, digital spaces that once seemed like safe havens are, for many, no longer so.

EFF's mission is to ensure that technology supports freedom, justice, and innovation for all people of the world, and that includes LGBTQ+ communities, which all too often face threats, censorship, and other risks when they go online. This Pride month—and the rest of the year—we’re highlighting some of those risks, and what we’re doing to help change online spaces for the better.

Worsening threats in the Americas

In the United States, where EFF is headquartered, recent gains in rights have been followed by an uptick in intolerance that has led to legislative efforts, mostly at the state level. In 2024 alone, 523 anti-LGBTQ+ bills have been proposed by state legislatures, many of which restrict freedom of expression. In addition to these bills, a drive in mostly conservative areas to ban books in school libraries—many of which contain LGBTQ themes—is creating an environment in which queer youth feel even more marginalized.

At the national level, an effort to protect children from online harms—the Kids Online Safety Act (KOSA)—risks alienating young people, particularly those from marginalized communities, by restricting their access to certain content on social media. EFF spoke with young people about KOSA, and found that many are concerned that they will lose access to help, education, friendship, and a sense of belonging that they have found online. At a time when many young people have just come out of several years of isolation during the pandemic and reliance on online communities for support, restricting their access could have devastating consequences.

TAKE ACTION

TELL CONGRESS: OPPOSE THE KIDS ONLINE SAFETY ACT

Similarly, age-verification bills being put forth by state legislatures often seek to prevent access to material deemed harmful to minors. If passed, these measures would restrict access to vital content, including education and resources that LGBTQ+ youth without local support often rely upon. These bills often contain vague and subjective definitions of “harm” and are all too often another strategy in the broader attack on free expression that includes book bans, censorship of reproductive health information, and attacks on LGBTQ+ youth.

Moving south of the border, in much of South and Central America, legal progress has been made with respect to rights, but violence against LGBTQ+ people is particularly high, and that violence often has online elements. In the Caribbean, where a number of countries have strict anti-LGBTQ+ laws on the books, often stemming from the colonial era, online spaces can be risky, and those who express their identities in them often face bullying and doxxing, which can lead to physical harm.

In many other places throughout the world, the situation is even worse. While LGBTQ+ rights have progressed considerably over the past decade in a number of democracies, the sense of freedom and ease that these hard-won freedoms created for many are suffering serious setbacks. And in more authoritarian countries where the internet may have once been a lifeline, crackdowns on expression have coincided with increases in user growth and often explicitly target LGBTQ+ speech.

In Europe, anti-LGBTQ+ violence at a record high

In recent years, legislative efforts aimed at curtailing LGBTQ+ rights have gained momentum in several European countries, largely the result of a rise in right-wing populism and conservatism. In Hungary, for instance, the Orban government has enacted laws that restrict LGBTQ+ rights under the guise of protecting children. In 2021, the country passed a law banning the portrayal or promotion of LGBTQ+ content to minors. In response, the European Commission launched legal cases against Hungary—as well as some regions in Poland—over LGBTQ+ discrimination, with Commission President Ursula von der Leyen labeling the law as "a shame" and asserting that it clearly discriminates against people based on their sexual orientation, contravening the EU's core values of equality and human dignity​.

In Russia, the government has implemented severe restrictions on LGBTQ+ content online. A law initially passed in 2013 banning the promotion of “non-traditional sexual relations” among minors was expanded in 2022 to apply to individuals of all ages, further criminalizing LGBTQ+ content. The law prohibits the mention or display of LGBTQ+ relationships in advertising, books, media, films, and on online platforms, and has created a hostile online environment. Media outlets that break the law can be fined or shut down by the government, while foreigners who break the law can be expelled from the country. 

Among the first victims of the amended law were seven migrant sex workers—all trans women—from Central Asia who were fined and deported in 2023 after they published their profiles on a dating website. Also in 2023, six online streaming platforms were penalized for airing movies with LGBTQ-related scenes. The films included “Bridget Jones: The Edge of Reason”, “Green Book”, and the Italian film “Perfect Strangers.”

Across the continent, anti-LGBTQ+ violence is at a record high, and queer communities are often the target of online threats. A 2022 report by the European Digital Media Observatory documented a significant increase in online disinformation campaigns targeting LGBTQ+ communities, often framing them as threats to traditional family values. 

Across Africa, LGBTQ+ rights under threat

In 30 of the 54 countries on the African continent, homosexuality is prohibited. Nevertheless, there is a growing movement to decriminalize LGBTQ+ identities and push toward achieving greater rights and equality. As in many places, the internet often serves as a safer space for community and organizing, and has therefore become a target for governments seeking to crack down on LGBTQ+ people.

In Tanzania, for instance, where consensual same-sex acts are prohibited under the country’s colonial-era Penal Code, authorities have increased digital censorship against LGBTQ+ content, blocking websites and social media platforms that provide support and information to the LGBTQ+ community. This crackdown is making it increasingly difficult for people to find safe spaces online. As a result of these restrictions, many online groups used by the LGBTQ+ community for networking and support have been forced to disband, driving individuals to riskier public spaces to meet and socialize.

In other countries across the continent, officials are weaponizing legal systems to crack down on LGBTQ+ people and their expression. According to Access Now, a proposed law in Kenya, the Family Protection Bill, seeks to ban a variety of actions, including public displays of affection, engagement in activities that seek to change public opinion on LGBTQ+ issues, and the use of the internet, media, social media platforms, and electronic devices to “promote homosexuality.” Furthermore, the prohibited acts would fall under the country’s Computer Misuse and Cybercrimes Act of 2018, giving law enforcement the power to monitor and intercept private communications during investigations, as provided by Section 36 of the National Intelligence Service Act, 2012. 

A draconian law passed in Uganda in 2023, the Anti-Homosexuality Act, introduced capital punishment for certain acts, while allowing for life imprisonment for others. The law further imposes a 20-year prison sentence for people convicted of “promoting homosexuality,” which includes the publication of LGBTQ+ content, as well as “the use of electronic devices such as the internet, mobile phones or films for the purpose of homosexuality or promoting homosexuality.”

In Ghana, if passed, the anti-LGBTQ+ Promotion of Proper Human Sexual Rights and Ghanaian Family Values Bill would introduce prison sentences for those who engage in LGBTQ+ sexual acts as well as those who promote LGBTQ+ rights. As we’ve previously written, the bill would ban all speech and activity, online and off, that even remotely supports LGBTQ+ rights. Though the bill passed through parliament in March, Ghana’s president has said he won’t sign it until the country’s Supreme Court rules on its constitutionality.

And in Egypt and Tunisia, authorities have integrated technology into their policing of LGBTQ+ people, according to a 2023 Human Rights Watch report. In Tunisia, where homosexuality is punishable by up to three years in prison, online harassment and doxxing are common, threatening the safety of LGBTQ+ individuals. Human Rights Watch has documented cases in which social media users, including alleged police officers, have publicly harassed activists, resulting in offline harm.

Egyptian security forces often monitor online LGBTQ+ activity and have used social media platforms as well as Grindr to target and arrest individuals. Although same-sex relations are not explicitly banned by law in the country, authorities use various morality provisions to effectively criminalize homosexual relations. More recently, prosecutors have utilized cybercrime and online morality laws to pursue harsher sentences.

In Asia, cybercrime laws threaten expression

LGBTQ+ rights in Asia vary widely. While homosexual relations are legal in a majority of countries, they are strictly banned in twenty, and same-sex marriage is only legal in three—Taiwan, Nepal, and Thailand. Online threats are also varied, ranging from harassment and self-censorship to the censoring of LGBTQ+ content—such as in Indonesia, Iran, China, Saudi Arabia, the UAE, and Malaysia, among other nations—as well as legal restrictions with often harsh penalties.

The use of cybercrime provisions to target LGBTQ+ expression is on the rise in a number of countries, particularly in the MENA region. In Jordan, the Cybercrime Law of 2023, passed last August, imposes restrictions on freedom of expression, particularly for LGBTQ+ individuals. Articles 13 and 14 of the law impose penalties for producing, distributing, or consuming “pornographic activities or works” and for using information networks to “facilitate, promote, incite, assist, or exhort prostitution and debauchery, or seduce another person, or expose public morals.” Jordan follows in the footsteps of neighboring Egypt, which instituted a similar law in 2018.

The LGBTQ+ movement in Bangladesh is impacted by the Cyber Security Act, quietly passed in 2023. Several provisions of the Act can be used to target LGBTQ+ sites: Section 8 enables the government to shut down websites, while Section 42 grants law enforcement agencies the power to search and seize a person’s hardware, social media accounts, and documents, both online and offline, without a warrant. And Section 25 criminalizes published content that tarnishes the image or reputation of the country.

The online struggle is global

In addition to national-level restrictions, LGBTQ+ individuals often face content suppression on social media platforms. While some of this occurs as the result of government requests, much of it is actually due to platforms’ own policies and practices. A recent GLAAD case study points to specific instances where content promoting or discussing LGBTQ+ issues is disproportionately flagged and removed, compared to non-LGBTQ+ content. The GLAAD Social Media Safety Index also provides numerous examples where platforms inconsistently enforce their policies. For instance, posts that feature LGBTQ+ couples or transgender individuals are sometimes taken down for alleged policy violations, while similar content featuring heterosexual or cisgender individuals remains untouched. This inconsistency suggests a bias in content moderation that EFF has previously documented and leads to the erasure of LGBTQ+ voices in online spaces.

Likewise, the community now faces threats at the global level, in the form of the impending UN Cybercrime Convention, currently in negotiations. As we’ve written, the Convention would expand cross-border surveillance powers, enabling nations to potentially exploit these powers to probe acts they controversially label as crimes based on subjective moral judgements rather than universal standards. This could jeopardize vulnerable groups, including the LGBTQ+ community.

EFF is pushing back to ensure that the Cybercrime Treaty’s scope is narrow and that human rights safeguards are a priority. You can read our written and oral interventions and follow our Deeplinks Blog for updates. Earlier this year, along with Access Now, we also submitted comments to the U.N. Independent Expert on protection against violence and discrimination based on sexual orientation and gender identity (IE SOGI) to inform the Independent Expert’s thematic report presented to the U.N. Human Rights Council at its fifty-sixth session.

But just as the struggle for LGBTQ+ rights and recognition is global, so too is the struggle for a safer and freer internet. EFF works year round to highlight that struggle and to ensure LGBTQ+ rights are protected online. We collaborate with allies around the world, and work to ensure that both states and companies protect and respect the rights of LGBTQ+ communities worldwide.

We also want to help LGBTQ+ communities stay safer online. As part of our Surveillance Self-Defense project, we offer a number of guides for safer online communications, including a guide specifically for LGBTQ+ youth.

EFF believes in preserving an internet that is free for everyone. While there are numerous harms online as in the offline world, digital spaces are often a lifeline for queer youth, particularly those living in repressive environments. The freedom of discovery, the sense of community, and the access to information that the internet has provided for so many over the years must be preserved. 



Jillian C. York

Hack of Age Verification Company Shows Privacy Danger of Social Media Laws

1 month ago

We’ve said it before: online age verification is incompatible with privacy. Companies responsible for storing or processing sensitive documents like drivers’ licenses are likely to encounter data breaches, potentially exposing not only personal data like users’ government-issued ID, but also information about the sites that they visit. 

This threat is not hypothetical. This morning, 404 Media reported that a major identity verification company, AU10TIX, left login credentials exposed online for more than a year, allowing access to this very sensitive user data. 

A researcher gained access to the company’s logging platform, “which in turn contained links to data related to specific people who had uploaded their identity documents,” including “the person’s name, date of birth, nationality, identification number, and the type of document uploaded such as a drivers’ license,” as well as images of those identity documents. Platforms reportedly using AU10TIX for identity verification include TikTok and X, formerly Twitter. 

Lawmakers pushing forward with dangerous age verification laws should stop and consider this report. Proposals like the federal Kids Online Safety Act and California’s Assembly Bill 3080 are moving further toward passage, with lawmakers in the House scheduled to vote in a key committee on KOSA this week, and California's Senate Judiciary committee set to discuss AB 3080 next week. Several other laws requiring age verification for accessing “adult” content and social media have already passed in states across the country. EFF and others are challenging some of these laws in court. 

In the final analysis, age verification systems are surveillance systems. Mandating them forces websites to require visitors to submit information such as government-issued identification to companies like AU10TIX. Hacks and data breaches of this sensitive information are not a hypothetical concern; it is simply a matter of when the data will be exposed, as this breach shows. 

Data breaches can lead to any number of dangers for users: phishing, blackmail, or identity theft, in addition to the loss of anonymity and privacy. Requiring users to upload government documents—some of the most sensitive user data—will hurt all users. 

According to the news report, there is no indication so far that the exposed AU10TIX data was accessed beyond what the researcher demonstrated. If age verification requirements are passed into law, users will likely find themselves forced to share their private information across networks of third-party companies if they want to continue accessing and sharing online content. Within a year, it wouldn’t be strange to have uploaded your ID to a half-dozen different platforms. 

No matter how vigilant you are, you cannot control what other companies do with your data. If age verification requirements become law, you’ll have to be lucky every time you are forced to share your private information. Hackers will just have to be lucky once. 

Jason Kelley