Cars (and Drivers): 2024 in Review

3 months ago

If you’ve purchased a car made in the last decade or so, it’s likely jam-packed with enough technology to make your brand new phone jealous. Modern cars have sensors, cameras, GPS for location tracking, and more, all collecting data—and it turns out in many cases, sharing it.

Cars Sure Are Sharing a Lot of Information

While we’ve been keeping an eye on the evolving state of car privacy for years, everything really took off after a New York Times report this past March found that the car maker G.M. was sharing information about drivers’ habits with insurance companies without consent.

It turned out a number of other car companies were doing the same, using deceptive design so people didn’t always realize they were opting into the program. We walked through how to see for yourself what data your car collects and shares. That said, cars, infotainment systems, and car makers’ apps are so unstandardized that it’s often very difficult for drivers to research data sharing, let alone opt out of it.

Which is why we were happy to see Senators Ron Wyden and Edward Markey send a letter to the Federal Trade Commission urging it to investigate these practices. The fact is: car makers should not sell our driving and location history to data brokers or insurance companies, and they shouldn’t make it as hard as they do to figure out what data gets shared and with whom.

Advocating for Better Bills to Protect Abuse Survivors

The amount of data modern cars collect is a serious privacy concern for all of us. But for people in an abusive relationship, tracking can be a nightmare.

This year, California considered three bills intended to help domestic abuse survivors endangered by vehicle tracking. Of those, we initially liked the approach behind two of them, S.B. 1394 and S.B. 1000. When introduced, both would have served the needs of survivors in a wide range of scenarios without inadvertently creating new avenues of stalking and harassment for the abuser to exploit. They both required car manufacturers to respond to a survivor's request to cut an abuser's remote access to a car's connected services within two business days. To make a request, a survivor had to prove the vehicle was theirs to use, even if their name was not on the loan or title.

But the third bill, A.B. 3139, took a different approach. Rather than have people submit requests first and cut access later, this bill required car manufacturers to terminate access immediately, requiring follow-up documentation only up to seven days later. Likewise, S.B. 1394 and S.B. 1000 were amended to adopt this "act first, ask questions later" framework. This approach is helpful for survivors in one scenario—a survivor who has no documentation of their abuse, and who needs to get away immediately in a car owned by their abuser. Unfortunately, it also opens up many new avenues of stalking, harassment, and abuse against survivors. These bills were ultimately combined into S.B. 1394, which retained some provisions we remain concerned about.

It’s Not Just the Car Itself

Because of everything else that comes with car ownership, a car is just one piece of the mobile privacy puzzle.

This year we fought against A.B. 3138 in California, which proposed adding GPS technology to digital license plates to make them easier to track. The bill passed, unfortunately, but location data privacy continues to be an important issue that we’ll fight for.

We wrote about a bulletin released by the U.S. Cybersecurity and Infrastructure Security Agency about infosec risks in one brand of automated license plate readers (ALPRs). Specifically, the bulletin outlined seven vulnerabilities in Motorola Solutions' Vigilant ALPRs, including missing encryption and insufficiently protected credentials. The sheer scale of this data collection is alarming: EFF found that just 80 agencies in California, using primarily Vigilant technology, collected more than 1.6 billion license plate scans (CSV) in 2022. This data can be used to track people in real time, identify their "pattern of life," and even identify their relations and associates.

Finally, in order to drive a car, you need a license, and increasingly states are offering digital IDs. We dug deep into California’s mobile ID app, wrote about the various issues with mobile IDs—which range from equity to privacy problems—and put together an FAQ to help you decide if you’d even benefit from setting up a mobile ID if your state offers one. Digital IDs are a major concern for us in the coming years, both due to the unanswered questions about their privacy and security, and their potential use for government-mandated age verification on the internet.

The privacy problems of cars are of increasing importance, which is why Congress and the states must pass comprehensive consumer data privacy legislation with strong data minimization rules and requirements for clear, opt-in consent. While we tend to think of data privacy laws as dealing with computers, phones, or IoT devices, they’re just as applicable, and increasingly necessary, for cars, too.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2024.

Thorin Klosowski

Behind the Diner—Digital Rights Bytes: 2024 in Review


Although it feels a bit weird to be writing a year in review post for a site that hasn’t even been live for three months, I thought it would be fun to give a behind-the-scenes look at the work we did this year to build EFF’s newest site, Digital Rights Bytes. 

Since each topic Digital Rights Bytes aims to tackle is in the form of a question, why not do this Q&A style? 

Q: WHAT IS DIGITAL RIGHTS BYTES?

Great question! At its core, Digital Rights Bytes is a place where you can get honest answers to the questions that have been bugging you about technology. 

The site was originally pitched as ‘EFF University’ (or EFFU, pun intended) to help folks who aren’t part of our core tech-savvy base get up-to-speed on technology issues that may be affecting their everyday lives. We really wanted Digital Rights Bytes to be a place where newbies could feel safe learning about internet freedom issues, get familiar with EFF’s work, and find out how to get involved, without feeling too intimidated. 

Q: WHY DOES THE SITE LOOK SO DIFFERENT FROM OTHER EFF WORK?

With our main goal of attracting new readers, it was crucial to brand Digital Rights Bytes differently from other EFF projects. We wanted Digital Rights Bytes to feel like a place where you and your friend might casually chat over milkshakes—while being served pancakes by a friendly robot. We took that concept and ran with it, going forward with a full diner theme for the site. I mean, imagine the counter banter you could have at the Digital Rights Bytes Diner!

Take a look at the Digital Rights Bytes counter!

As part of this concept, we thought it made sense for each topic to be framed as a question. Of course, at EFF, we get a ton of questions from supporters and other folks online about internet freedom issues, including from our own family and friends. We took some of the questions we see fairly often, then decided which would be the most important—and most interesting—to answer.  

The diner concept is why the site has a bright neon logo, pink and cyan colors, and a neat vintage-looking background on desktop. Even the gif that plays on the home screen of Digital Rights Bytes shows our animal characters chatting ‘round the diner (more on them soon!).

Q: WHY DID YOU MAKE DIGITAL RIGHTS BYTES?

Here’s the thing: technology continues to expand, evolve, and change—and it’s tough to keep up! We’ve all been the tech noob, trying to figure out why our devices behave the way they do, and it can be pretty overwhelming.  

So, we thought that we could help out with that! And what better way to help educate newcomers than by explaining these tech issues in short, byte-sized videos:

A clip from the device repair video.

It took some time to nail down the style for the videos on Digital Rights Bytes. But, after some trial and error, we landed on using animals as our lead characters: partly because they’re adorable, and partly because it helped emphasize the shadowy figures that are often trying to steal their data or make their tech worse for them. It’s often unclear who is trying to steal our data or rig tech to be worse for the user, so we thought this was fitting.

In addition to the videos, EFF issue experts wrote concise and easy-to-read pages further detailing each topic, with an emphasis on linking to other experts and including information on how you can get involved.

Q: HAS DIGITAL RIGHTS BYTES BEEN SUCCESSFUL?

You tell us! If you’re reading these Year In Review blog posts, you’re probably the designated “ask them every tech question in the world” person of your family. Why not send your family and friends over to Digital Rights Bytes and let us know if the site has been helpful to them! 

We’re also looking to expand the site and answer more common questions you and I might hear. If you have suggestions, you should let us know here or on social media! Just use the hashtag #DigitalRightsBytes and we’ll be sure to consider it. 

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2024.

Christian Romero

NSA Surveillance and Section 702 of FISA: 2024 in Review


Mass surveillance authority Section 702 of FISA, which allows the government to collect international communications, many of which happen to have one side in the United States, has been renewed several times since its creation with the passage of the 2008 FISA Amendments Act. This law has been an incessant threat to privacy for over a decade because the FBI operates on the “finders keepers” rule of surveillance: it thinks that because the NSA has “incidentally” collected the U.S. side of conversations, it is now free to sift through them without a warrant.

But 2024 became the year this mass surveillance authority was not only reauthorized by a lion’s share of both Democrats and Republicans—it was also the year the law got worse. 

After a tense fight, some temporary reauthorizations, and a looming expiration, Congress finally passed the Reforming Intelligence and Securing America Act (RISAA) in April 2024. RISAA not only reauthorized the mass surveillance capabilities of Section 702 without any of the necessary reforms that had been floated in previous bills, it also enhanced its powers by expanding what it can be used for and who has to adhere to the government’s requests for data.

Where Section 702 was enacted under the guise of targeting people not on U.S. soil to assist with national security investigations, there are no such narrow limits on the use of communications acquired under the mass surveillance law. Following the passage of RISAA, this private information can now be used to vet immigration and asylum seekers and conduct intelligence for broadly construed “counter narcotics” purposes.

The bill also included an expanded definition of “Electronic Communications Service Provider” or ECSP. Under Section 702, anyone who oversees the storage or transmission of electronic communications—be it emails, text messages, or other online data—must cooperate with the federal government’s requests to hand over data. Under the expanded definition of ECSP, there are intense and well-founded fears that anyone who hosts servers, websites, or provides internet to customers—or even just people who work in the same building as these providers—might be forced to become a tool of the surveillance state. As of December 2024, the fight is still on in Congress to clarify, narrow, and reform the definition of ECSP.

The one merciful change that occurred as a result of the 2024 smackdown over Section 702’s renewal was that it only lasts two years. That means in Spring 2026 we have to be ready to fight again to bring meaningful change, transparency, and restriction to Big Brother’s favorite law.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2024.

Matthew Guariglia

[Okinawa Report] Soil Shipments, Vanishing Mountains, and a Direct Typhoon Hit = Etsuko Urashima

The high-handedness of the Okinawa Defense Bureau, which is forcing through construction of the new base at Henoko, knows no bounds. With quarrying of soil from southern Okinawa, soil that contains the remains of those killed in the Battle of Okinawa, stalled by strong opposition from prefectural residents, the plan to procure landfill materials from Amami Oshima has resurfaced. In September, Etsuko Abe, representative of the National Liaison Council Opposing the Shipment of Soil to Henoko, and others inspected quarries and ports in four municipalities on Amami Oshima, heard residents describe the serious harm they are suffering, and submitted petitions to each municipality (photo). Amami Oshima has many quarries, and residents have long endured dust, noise, vibration, and ocean pollution from red-soil runoff…
JCJ

Global Age Verification Measures: 2024 in Review


EFF has spent this year urging governments around the world, from Canada to Australia, to abandon their reckless plans to introduce age verification for a variety of online content under the guise of protecting children online. Mandatory age verification tools are surveillance systems that threaten everyone’s rights to speech and privacy, and introduce more harm than they seek to combat.

Kids Experiencing Harm is Not Just an Online Phenomenon

In November, Australia’s Prime Minister, Anthony Albanese, claimed that legislation was needed to protect young people in the country from the supposed harmful effects of social media. Australia’s Parliament later passed the Online Safety Amendment (Social Media Minimum Age) Bill 2024, which bans children under the age of 16 from using social media and forces platforms to take undefined “reasonable steps” to verify users’ ages or face over $30 million in fines. This is similar to France’s ban last year on social media access for children under 15 without parental consent, and Norway has also pledged to pursue a similar ban.

No study shows such a harmful impact, and kids don’t need to fall into a wormhole of internet content to experience harm—there is a whole world outside the barriers of the internet that contributes to people’s experiences, and all evidence suggests that many young people experience positive outcomes from social media. Truthful news about what’s going on in the world, such as wars and climate change, is available both online and from a newspaper on the breakfast table or a billboard on the street. Young people may also be subject to harmful behaviors like bullying in the offline world, as well as online.

The internet is a valuable resource for both young people and adults who rely on the internet to find community and themselves. As we said about age verification measures in the U.S. this year, online services that want to host serious discussions about mental health issues, sexuality, gender identity, substance abuse, or a host of other issues, will all have to beg minors to leave and institute age verification tools to ensure that it happens. 

Limiting Access for Kids Limits Access for Everyone 

Through this wave of age verification bills, governments around the world are burdening internet users and forcing them to sacrifice their anonymity, privacy, and security simply to access lawful speech. For adults, this is true even if that speech constitutes sexual or explicit content. These laws are censorship laws, and rules banning sexual content usually hurt marginalized communities, and the groups that serve them, the most. History shows that over-censorship is inevitable.

This year, Canada also introduced an age verification measure, bill S-210, which seeks to prevent young people from encountering sexually explicit material by requiring all commercial internet services that “make available” explicit content to adopt age verification services. This was introduced to prevent harms like the “development of pornography addiction” and “the reinforcement of gender stereotypes and the development of attitudes favorable to harassment and violence…particularly against women.” But requiring people of all ages to show ID to get online won’t help women or young people. When these large services learn they are hosting or transmitting sexually explicit content, most will simply ban or remove it outright, using both automated tools and hasty human decision-making. This creates a legal risk not just for those who sell or intentionally distribute sexually explicit materials, but also for those who just transmit it–knowingly or not. 

Without Comprehensive Privacy Protections, These Bills Exacerbate Data Surveillance 

Under mandatory age verification requirements, users will have no way to be certain that the data they’re handing over is not going to be retained and used in unexpected ways, or even shared with unknown third parties. Millions of adult internet users would also be entirely blocked from accessing protected speech online because they are not in possession of the required form of ID.

Online age verification is not like flashing an ID card in person to buy particular physical items. In places that lack comprehensive data privacy legislation, the risk of surveillance is extensive. First, a person who submits identifying information online can never be sure if websites will keep that information, or how that information might be used or disclosed. Without requiring all parties who may have access to the data to delete that data, such as third-party intermediaries, data brokers, or advertisers, users are left highly vulnerable to data breaches and other security harms at companies responsible for storing or processing sensitive documents like drivers’ licenses. 

Second, and unlike in-person age-gates, the most common way for websites to comply with a potential verification system would be to require all users to upload and submit—not just momentarily display—a data-rich government-issued ID or other document with personal identifying information. In a brief to a U.S. court, EFF explained how this leads to a host of serious anonymity, privacy, and security concerns. People shouldn't have to disclose to the government what websites they're looking at—which could reveal sexual preferences or other extremely private information—in order to get information from that website. 

These proposals are coming to the U.S. as well. We analyzed various age verification methods in comments to the New York Attorney General. None of them are both accurate and privacy-protective. 

The Scramble to Find an Effective Age Verification Method Shows There Isn't One

The European Commission is also currently working on guidelines for the implementation of the child safety article of the Digital Services Act (Article 28) and may come up with criteria for effective age verification. In parallel, the Commission has asked for proposals for a 'mini EU ID wallet' to implement device-level age verification ahead of the expected rollout of digital identities across the EU in 2026. At the same time, smaller social media companies and dating platforms have for years been arguing that age verification should take place at the device or app-store level, and will likely support the Commission's plans. As we move into 2025, EFF will continue to follow these developments as the Commission’s apparent expectation that porn platforms adopt age verification to comply with their risk mitigation obligations under the DSA becomes clearer.

Mandatory age verification is the wrong approach to protecting young people online. In 2025, EFF will continue urging politicians around the globe to acknowledge these shortcomings, and to explore less invasive approaches to protecting all people from online harms.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2024.

Paige Collings

While the Court Fights Over AI and Copyright Continue, Congress and States Focus On Digital Replicas: 2024 in Review


The phrase “move fast and break things” carries pretty negative connotations in these days of (Big) techlash. So it’s surprising that state and federal policymakers are doing just that with the latest big issue in tech and the public consciousness: generative AI, or more specifically its uses to generate deepfakes.

Creators of all kinds are expressing a lot of anxiety around the use of generative artificial intelligence, some of it justified. The anxiety, combined with some people’s understandable sense of frustration that their works were used to develop a technology that they fear could displace them, has led to multiple lawsuits.

But while the courts sort it out, legislators are responding to heavy pressure to do something. And it seems their highest priority is to give new or expanded rights to protect celebrity personas–living or dead–and the many people and corporations that profit from them.

The broadest “fix” would be a federal law, and we’ve seen several proposals this year. The two most prominent are NO AI FRAUD (in the House of Representatives) and NO FAKES (in the Senate). The first, introduced in January 2024, purports to target abuse of generative AI to misappropriate a person’s image or voice, but the right it creates applies to an incredibly broad amount of digital content: any “likeness” and/or “voice replica” that is created or altered using digital technology, software, an algorithm, etc. There’s not much that wouldn’t fall into that category—from pictures of your kid, to recordings of political events, to docudramas, parodies, political cartoons, and more. It also characterizes the new right as a form of federal intellectual property. This linguistic flourish has the practical effect of putting intermediaries that host AI-generated content squarely in the litigation crosshairs because Section 230 immunity does not apply to federal IP claims. NO FAKES, introduced in April, is not significantly different.

There’s a host of problems with these bills, and you can read more about them here and here. 

A core problem is that these bills are modeled on the broadest state laws recognizing a right of publicity. A limited version of this right makes sense—you should be able to prevent a company from running an advertisement that falsely claims that you endorse its products—but the right of publicity has expanded well beyond its original boundaries, to potentially cover just about any speech that “evokes” a person’s identity, such as a phrase associated with a celebrity (like “Here’s Johnny”) or even a cartoonish robot dressed like a celebrity. It’s become a money-making machine that can be used to shut down all kinds of activities and expressive speech. Public figures have brought cases targeting songs, magazine features, and even computer games.

And states are taking swift action to further expand publicity rights. Take this year’s digital replica law in Tennessee, called the ELVIS Act because of course it is. Tennessee already gave celebrities (and their heirs) a property right in their name, photograph, or likeness. The new law extends that right to voices, expands the risk of liability to include anyone who distributes a likeness without permission, and limits some speech-protective exceptions.

Across the country, California couldn’t let Tennessee win the race for the most restrictive/protective rules for famous people (and their heirs). So it passed AB 1836, creating liability for anyone who uses a deceased personality’s name, voice, signature, photograph, or likeness, in any manner, without consent. There are a number of exceptions, which is better than nothing, but those exceptions are pretty confusing for people who don’t have lawyers to help sort them out.

These state laws are a done deal, so we’ll just have to see how they play out. At the federal level, however, we still have a chance to steer policymakers in the right direction. 

We get it–everyone should be able to prevent unfair and deceptive commercial exploitation of their personas. But expanded property rights are not the way to do it. If Congress really wants to protect performers and ordinary people from deceptive or exploitative uses of their images and voice, it should take a precise, careful and practical approach that avoids potential collateral damage to free expression, competition, and innovation. 

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2024.

Corynne McSherry

Electronic Frontier Alliance Fought and Taught Locally: 2024 in Review


The EFF-chaired Electronic Frontier Alliance (EFA) has had a big year! EFA is a loose network of local groups fighting for digital rights in the United States. With an ever-increasing roster of allies across the country, including significant growth on university campuses, EFA has also undergone a bit of a facelift. With the new branding comes more resources to support local organizing and popular education efforts around the country. 

If you’re a member of a local or state group in the United States that believes in digital rights, please learn more at our FAQ page. EFA groups include hackers, advocates, security educators, tech professionals, activists young and old, and beyond. If you think your group would make a good fit, please fill out an application here. The Alliance has scores of members, all of which did great work this year; this review highlights just a few.

A new look for EFA 

This past July, the organizing team completed the much-needed EFA rebrand project with a brand new website. Thanks to the work of EFF’s Engineering and Design team, organizers now have up-to-date merch, pamphlets, and useful organizer toolkits for alliance members with a range of levels of organizing experience. Whether your group wants to lead advocacy letters, needs basic press strategies, or is organizing on college campuses, we have resources to help. We also updated our allies directory to better showcase our local members and make it easier for activists to find groups and get involved, and we put together a Bluesky starter kit to make it easy to follow members onto the new platform. This is a major milestone in our effort to build useful resources for the network, which we will continue to maintain and expand in the years ahead.

More local groups heeded the call:

The alliance continued to grow, especially on college campuses, creating opportunities for some fun cross-campus collaborations in the year ahead. This year, nine local groups across eight states joined up: 

  • Stop Surveillance City, Seattle, WA: Stop Surveillance City is fighting against mass surveillance, criminalization and incarceration, and further divestment from people-centered policies. They advocate for investment in their communities and want to increase programs that address the root causes of violence. 
  • Cyber Security Club @FSU, Tallahassee, FL: The Cyber Security Club is a student group sharing resources to get students started in cybersecurity and competing in digital Capture the Flag (CTF) competitions. 
  • UF Student Infosec Team (UFSIT), Gainesville, FL: UFSIT is the cybersecurity club at the University of Florida. They are student-led and passionate about all things cybersecurity, and their goal is to provide a welcoming environment for students to learn more about all areas of information security, including penetration testing, reverse engineering, vulnerability research, digital forensics, and more. 
  • NICC, Newark, NJ: NICC is the New Jersey Institute of Technology’s official information & cybersecurity club. As a student-run organization, NICC started as a way to give NJIT students interested in cybersecurity, ethical hacking, and CTFs a group that would help them learn, grow, and hone their skills. 
  • DC919, Raleigh, NC: DEF CON Group 919 is a community group in the Raleigh/Durham area of North Carolina, providing a gathering place for hacking discussions, conference support, and workshop testing. 
  • Community Broadband PDX, Portland, OR: Their mission is to guide Portlanders to create a new option for fast internet access: publicly-owned and transparently operated, affordable, secure, fast, and reliable broadband infrastructure that is always available to every neighborhood and community. 
  • DC215, Philadelphia, PA: DC215 is another DEF CON group advancing knowledge and education with those interested in science, technology, and other areas of information security through project collaborations, group gatherings, and group activities to serve their city. 
  • Open Austin, Austin, TX: Open Austin's mission is to end disparities in Austin in access to technology. It envisions a city that respects and serves all people, by facilitating issues-based conversations between government and city residents, providing service to local community groups that leverage local expertise, and advocating for policy that utilizes technology to improve the community. 
  • Encode Justice Georgia: Encode Justice GA is the third Encode Justice chapter to join EFA, made up mostly of high school students learning the tools of organizing by focusing on issues like machine-learning algorithms and law enforcement surveillance. 
Alliance members are organizing for your rights: 

This year, we talked to the North Carolina chapter of Encode Justice, a network that includes over 1,000 high school and college students across over 40 U.S. states and 30 countries. A youth-led movement for safe, equitable AI, their mission is mobilizing communities for AI policies that are aligned with human values. The NC chapter has led educational workshops, policy memos, and legislative campaigns at both the state and city council levels, while lobbying officials and building coalitions with other North Carolinians.

Local groups continued to take on fights to defend constitutional protections against police surveillance overreach around the country. We caught up with our friends at the Surveillance Technology Oversight Project (S.T.O.P.) in New York, which litigates and advocates for privacy, working to push back against local government mass surveillance. STOP worked to pass the Public Oversight of Surveillance Technology Act in the New York City Council and used the law to uncover previously unknown NYPD surveillance contracts. This year they made significant strides in their campaign to ‘Ban the Scan’ (face recognition) in both the state assembly and the city council.

Another heavy hitter in the alliance, Lucy Parsons Labs, took the private-sector Atlanta Police Foundation to court to seek transparency over its functions on behalf of law enforcement agencies, arguing that those functions should be open to the same public records requests as the government agencies they are being used for.

Defending constitutional rights against encroachments by police agencies is an uphill battle, and our allies in San Diego’s TRUST Coalition were among those fighting to protect Community Control Over Police Surveillance requirements previously passed by their city council.

We checked in with CCTV Cambridge on their efforts to address digital equity with their Digital Navigator program, as well as highlighting them for Digital Inclusion Week 2024. CCTV Cambridge works across all demographics in their city. For example, they implemented a Youth Media Program where teens get paid while developing skills to become professional digital media artists. They also have a Foundational Technology program for the elderly and others who struggle with the increasing demands of technology in their lives.

This has been a productive year organizing for digital rights in the Pacific Northwest. We were able to catch up with several allies in Portland, Oregon, at the PDX People’s Digital Safety Fair on the campaign to bring high-speed broadband to their city, which is led by Community Broadband PDX and the Personal TelCo Project. With six active EFA members in Portland and three in neighboring Washington state, we were excited to watch the growing momentum for digital rights in the region.

Citizens Privacy Coalition crowdfunded a documentary on the impacts of surveillance in the Bay Area, called "Watch the Watchers." The film features EFF's Eva Galperin and addresses how to combat surveillance across policy, technological guardrails, and individual action. 

Allies brought knowledge to their neighbors: 

The Electronic Frontiers track at the sci-fi, fantasy, and comic book-oriented Dragon*Con in Atlanta celebrated its 25th anniversary, produced in coordination with EFA member Electronic Frontiers Georgia. The digital rights component of Dragon*Con had its largest number of panels yet on a wide variety of digital rights issues, from vehicle surveillance to clampdowns against First Amendment-protected activities. Members of EF-Georgia, EFF, and allied organizations presented on a variety of topics, including:

More of this year’s Dragon*Con panels can be found at EF-Georgia’s special Dragon*Con playlist.

EFF-Austin led the alliance in recorded in-person events, with monthly expert talks in Texas and meet-ups for people in their city interested in privacy, security, and skills in tech. They worked with new EFA member Open Austin to talk about how Austinites can get involved through civic-minded technology efforts. Other discussions included: 

Forging ahead into 2025: 

In complicated times, the EFA team is committed to building bridges for local groups and activists, because we build power when we work together, whether to fight for our digital rights, or to educate our neighbors on the issues and the technologies they face. In the coming year, a lot of EFA members will be focused on defending communities that are under attack, spreading awareness about the role data and digital communications play in broader struggles, and cultivating skills and knowledge among their neighbors.

To learn more about how the EFA works, please check out our FAQ page, and apply to join us. 

Past EFA member profiles:

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2024.

José Martinez

The Growing Intersection of Reproductive Rights and Digital Rights: 2024 in Review

3 months ago

Dear reader of our blog, surely by now you know the format: as we approach the end of the year, we look back on our work, count our wins, learn from our misses, and lay the groundwork for a better future. It's been an intense year in the fight for reproductive rights and its intersections with digital civil liberties. Going after cops illegally sharing location data, fighting the data broker industry, and building coalitions with the broader movement for reproductive justice—we've stayed busy.

The Fight Against Warrantless Access to Real-Time Location Tracking

The location data market is an unregulated nightmare industry that poses an existential threat to everyone's privacy, but especially those embroiled in the fight for reproductive rights. In a recent blog post, we wrote about the particular dangers posed by LocateX, a deeply troubling location tracking tool that allows users to see the precise whereabouts of individuals based on their smartphones' locations. Cops shouldn't be able to buy their way around having to get a warrant for real-time location tracking of anyone they please, regardless of the context. In regressive states that ban abortion, however, LocateX illustrates just how severe the threat can be for such a large population of people.

Building Coalition Within Digital Civil Liberties and Reproductive Justice

Part of our work in this movement is recognizing our lane: providing digital security tips, promoting the rights to privacy and free expression, and making connections with community leaders to support and elevate their work. This year we hosted a livestream panel featuring various next-level thinkers and reproductive justice movement leaders. Make sure to watch it if you missed it! Recognizing and highlighting our shared struggles, interests, and avenues for liberation is exactly how movements are fought for and won.

The Struggle to Stop Cops from Illegally Sharing ALPR data

It's been a multi-year battle to stop law enforcement agencies from illegally sharing out-of-state ALPR (automatic license plate reader) data. Thankfully this year we were able to celebrate a win: a grand jury in Sacramento moved to investigate two police agencies that had been illegally sharing this type of data. We're glad to declare victory, but those two agencies are far from the only problem. We hope this sets a precedent that cops aren't above the law, and we will continue to fight for just that. This win will help us build momentum to continue this fight into the coming year.

Sharing What We Know About Digital Surveillance Risks

We've learned a lot in the few years we've been researching the privacy and security risks facing this issue space, much of it gathered from conversations and trainings with on-the-ground movement workers. We gathered what we've learned from that work and distilled it into an accessible format for anyone who needs it. Behind the scenes, this research continues to inform the hands-on digital security trainings we provide to activists and movement workers.

As we proceed into an uncertain future where abortion access will continue to be a difficult struggle, we'll continue to do what we do best: standing vigilant for peoples' right to privacy, fighting bad Internet laws, protecting free speech online, and building coalition with others. Thank you for your support.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2024.

Daly Barnett

[Monthly Media Review: Publishing] Disinformation and Deepfakes, and the Hardship of People's Lives — Hiroshi Arayashiki

3 months ago
The Bungeishunju mook "Bungeishunju Opinion: 100 Issues for 2025" went to print before the results of the U.S. presidential election were known, so it was edited to read correctly whether Trump or Harris won. In the January issue of Sekai (Iwanami Shoten), the special feature "And Then America Left" includes a roundtable among Keiko Sakai, Seiko Mimaki, and Shin Kawashima, "Can We Stop the Wars? From the Edge of the 'International Order,'" which deftly sorts through the current international situation. Stopping war must surely be journalism's greatest goal as well. "Currently, headwinds are blowing against ruling parties worldwide..
JCJ