Triumphs, Trials, and Tangles From California's 2024 Legislative Session
California’s 2024 legislative session has officially adjourned, and it’s time to reflect on the wins and losses that have shaped Californians’ digital rights landscape this year.
EFF monitored nearly 100 bills in the state this session alone, addressing a broad range of issues related to privacy, free speech, and innovation. These include proposed standards for Artificial Intelligence (AI) systems used by state agencies, the intersection of AI and copyright, police surveillance practices, and various privacy concerns. While we have seen some significant victories, there are also alarming developments that raise concerns about the future of privacy protection in the state.
Celebrating Our Victories
This legislative session brought some wins for privacy advocates—most notably the defeat of four dangerous bills: A.B. 3080, A.B. 1814, S.B. 1076, and S.B. 1047. These bills posed serious threats to consumer privacy and would have undermined the progress we’ve made in previous years.
First, we commend the California Legislature for not advancing A.B. 3080, “The Parent’s Accountability and Child Protection Act” authored by Assemblymember Juan Alanis (Modesto). The bill would have created powerful incentives for “pornographic internet websites” to use age-verification mechanisms, yet it was never clear on what counts as “sexually explicit content.” Without clear guidelines, the bill would have further harmed the ability of all youth—particularly LGBTQ+ youth—to access legitimate content online. Different versions of bills requiring age verification have appeared in more than a dozen states. We understand Asm. Alanis’ concerns, but A.B. 3080 would have required broad, privacy-invasive data collection from internet users of all ages. We are grateful that it did not make it to the finish line.
Second, EFF worked with dozens of organizations to defeat A.B. 1814, a facial recognition bill authored by Assemblymember Phil Ting (San Francisco). The bill would have expanded the use of facial recognition software by police to “match” images from surveillance databases to possible suspects. Those matches could then be used as the basis for arrest warrants or search warrants. The bill merely said that these matches could not be the sole reason for a warrant to be issued—a standard that has already failed to stop false arrests in other states. Police departments and facial recognition companies alike currently maintain that police cannot justify an arrest using only algorithmic matches—so what would this bill really change? The bill only gave the appearance of doing something to address face recognition technology's harms, while allowing the practice to continue. California should not give law enforcement the green light to mine databases, particularly those where people contributed information without knowledge that it would be accessed by law enforcement. You can read more about this bill here, and we are glad to see the California Legislature reject this dangerous bill.
EFF also worked to oppose and defeat S.B. 1076, by Senator Scott Wilk (Lancaster). This bill would have weakened the California Delete Act (S.B. 362). Enacted last year, the Delete Act provides consumers with an easy “one-click” button, due to be available by January 1, 2026, to request the removal of their personal information held by data brokers registered in California. S.B. 1076 would have opened loopholes for data brokers to duck compliance. This would have hurt consumer rights and undone oversight of an opaque ecosystem of entities that collect and then sell the personal information they’ve amassed on individuals. S.B. 1076 would likely have created significant confusion around the development, implementation, and long-term usability of the delete mechanism established by the California Delete Act, particularly as the California Privacy Protection Agency works on regulations for it.
Lastly, EFF opposed S.B. 1047, the “Safe and Secure Innovation for Frontier Artificial Intelligence Models Act” authored by Senator Scott Wiener (San Francisco). This bill aimed to regulate AI models that might have "catastrophic" effects, such as attacks on critical infrastructure. Ultimately, we believe focusing on speculative, long-term, catastrophic outcomes from AI (like machines going rogue and taking over the world) pulls attention away from AI-enabled harms that are directly before us. EFF supported parts of the bill, like the creation of a public cloud-computing cluster (CalCompute). However, we also had concerns from the beginning that the bill imposed an abstract and confusing set of regulations on those developing AI systems and was built on a shaky self-certification mechanism. Those concerns remained in the final version of the bill as it passed the legislature.
Governor Newsom vetoed S.B. 1047; we encourage lawmakers concerned about the threats unchecked AI may pose to instead consider regulation that focuses on real-world harms.
Of course, this session wasn’t all sunshine and rainbows, and we had some big setbacks. Here are a few:
The Lost Promise of A.B. 3048
Throughout this session, EFF and our partners supported A.B. 3048, common-sense legislation that would have required browsers to let consumers exercise their protections under the California Consumer Privacy Act (CCPA). California is currently one of approximately a dozen states requiring businesses to honor consumer privacy requests made through opt-out preference signals in their browsers and devices. Yet large companies have often made it difficult for consumers to exercise those rights on their own. The bill would have properly balanced providing consumers with ways to exercise their privacy rights without creating burdensome requirements for developers or hindering innovation.
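For context on what such a signal looks like in practice: the most widely deployed opt-out preference signal today is Global Privacy Control (GPC), which participating browsers attach to every request as a "Sec-GPC: 1" HTTP header. Below is a minimal sketch, assuming a Node.js server, of how a site could detect and honor that signal; the response header and handler logic are hypothetical placeholders we made up for illustration, not anything mandated by A.B. 3048 or the CCPA.

```typescript
// Minimal sketch of honoring an opt-out preference signal (Global Privacy Control).
// Assumes Node.js; the response header and business logic are hypothetical
// placeholders, not anything specified by A.B. 3048 or the CCPA.
import { createServer, IncomingMessage, ServerResponse } from "node:http";

function userOptedOut(req: IncomingMessage): boolean {
  // Node lowercases incoming header names; GPC-enabled browsers send "Sec-GPC: 1".
  return req.headers["sec-gpc"] === "1";
}

const server = createServer((req: IncomingMessage, res: ServerResponse) => {
  if (userOptedOut(req)) {
    // A covered business would treat this as a do-not-sell/share request
    // and skip any data-broker or ad-tech sharing for this visitor.
    res.setHeader("X-Opt-Out-Honored", "1"); // hypothetical header, illustration only
  }
  res.end("ok");
});

server.listen(8080);
```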
Unfortunately, Governor Newsom chose to veto A.B. 3048. His veto letter cited the lack of support from mobile operators, arguing that because “No major mobile OS incorporates an option for an opt-out signal,” it is “best if design questions are first addressed by developers, rather than by regulators.” EFF believes technologists should be involved in the regulatory process and hopes to assist in that process. But Governor Newsom is wrong: we cannot wait for industry players to voluntarily support regulations that protect consumers. Proactive measures are essential to safeguard privacy rights.
This bill would have moved California in the right direction, making California the first state to require browsers to offer consumers the ability to exercise their rights.
Wrong Solutions to Real Problems
A big theme this legislative session was bills that claimed to address real problems but would have been ineffective or failed to respect privacy. These included bills intended to address young people’s safety online and deepfakes in elections.
While we defeated many misguided bills that were introduced to address young people’s access to the internet, S.B. 976, authored by Senator Nancy Skinner (Oakland), received Governor Newsom’s signature and takes effect on January 1, 2027. This proposal aims to regulate the "addictive" features of social media platforms, but it instead compromises the privacy of consumers in the state. The bill is also likely preempted by federal law and raises considerable First Amendment and privacy concerns. S.B. 976 is unlikely to protect children online. Instead, it will harm all online speakers by burdening free speech, and it will diminish online privacy by incentivizing companies to collect more personal information.
It is no secret that deepfakes can be incredibly convincing, and that can have scary consequences, especially during an election year. Two bills that attempted to address this issue are A.B. 2655 and A.B. 2839. Authored by Assemblymember Marc Berman (Palo Alto), A.B. 2655 requires online platforms to develop and implement procedures to block, take down, and separately label digitally manipulated content about candidates and other election-related subjects that creates a false portrayal of those subjects. We believe A.B. 2655 likely violates the First Amendment and will lead to over-censorship of online speech. The bill is also preempted by Section 230, a federal law that provides partial immunity to online intermediaries for causes of action based on user-generated content published on their platforms.
Similarly, A.B. 2839, authored by Assemblymember Gail Pellerin (Santa Cruz), not only bans the distribution of materially deceptive or altered election-related content, but also burdens mere distributors (internet websites, newspapers, etc.) who are unconnected to the creation of that content, regardless of whether they know of the prohibited manipulation. By extending liability beyond direct publishers to republishers, A.B. 2839 burdens republishers of content in a manner that courts have found unconstitutional.
There are ways to address the harms of deepfakes without stifling innovation and free speech. We recognize the complex issues raised by potentially harmful, artificially generated election content. But A.B. 2655 and A.B. 2839, as written and passed, likely violate the First Amendment and run afoul of federal law. In fact, less than a month after they were signed, a federal judge put A.B. 2839’s enforcement on pause (via a preliminary injunction) on First Amendment grounds.
Privacy Risks in State Databases
We also saw a troubling trend in the legislature this year that we will be making a priority as we look to 2025. Several bills emerged this session that, in different ways, threatened to weaken privacy protections within state databases. Specifically, A.B. 518 and A.B. 2723, which received Governor Newsom’s signature, are a step backward for data privacy.
A.B. 518 authorizes numerous agencies in California to share, without restriction or consent, personal information with the state Department of Social Services (DSS), exempting this sharing from all state privacy laws. This includes county-level agencies, and people whose information is shared would have no way of knowing or opting out. A.B. 518 is incredibly broad, allowing the sharing of health information, immigration status, education records, employment records, tax records, utility information, children’s information, and even sealed juvenile records—with no requirement that DSS keep this personal information confidential, and no restrictions on what DSS can do with the information.
Meanwhile, A.B. 2723 assigns a governing board to the new “Cradle to Career (CTC)” longitudinal education database, intended to synthesize student information collected from across the state to enable comprehensive research and analysis. Parents and children provide this information to their schools, but this project means that their information will be used in ways they never expected or consented to. Even worse, as written, this project would be exempt from the following privacy safeguards of the Information Practices Act of 1977 (IPA), which, with respect to state agencies, would otherwise guarantee California parents and students:
- the right for subjects whose information is kept in the data system to receive notice their data is in the system;
- the right to consent or, more meaningfully, to withhold consent;
- and the right to request correction of erroneous information.
By signing A.B. 2723, Gov. Newsom stripped California parents and students of the right even to know that this is happening, let alone to agree to this data processing in the first place.
Moreover, while both of these bills allowed state agencies to trample on Californians’ IPA rights, those IPA rights do not even apply to the county-level agencies affected by A.B. 518 or the local public schools and school districts affected by A.B. 2723—pointing to the need for more guardrails around unfettered data sharing on the local level.
A Call for Comprehensive Local Protections
A.B. 2723 and A.B. 518 reveal a crucial missing piece in Californians’ privacy rights: the privacy rights guaranteed to individuals through California’s IPA do not protect them from the ways local agencies collect, share, and process data. The absence of robust privacy protections at the local government level is an ongoing issue that must be addressed.
Now is the time to push for stronger privacy protections, hold our lawmakers accountable, and ensure that California remains a leader in the fight for digital privacy. As always, we want to acknowledge how much your support has helped our advocacy in California this year. Your voices are invaluable, and they truly make a difference.
Let’s not settle for half-measures or weak solutions. Our privacy is worth the fight.
No Matter What the Bank Says, It's YOUR Money, YOUR Data, and YOUR Choice
The Consumer Financial Protection Bureau (CFPB) has just finalized a rule that makes it easy and safe for you to figure out which bank will give you the best deal, and to switch to that bank with just a couple of clicks.
We love this kind of thing: the coolest thing about a digital world is how easy it is to switch from one product or service to another—in theory. Digital tools are so flexible that anyone who wants your business can write a program to import your data into a new service and forward any messages or interactions that show up at the old one.
That's the theory. But in practice, companies have figured out how to use law—IP law, cybersecurity law, contract law, trade secrecy law—to literally criminalize this kind of marvelous digital flexibility, so that it can end up being even harder to switch away from a digital service than it is to hop around among traditional, analog ones.
Companies love lock-in. The harder it is to quit a product or service, the worse a company can treat you without risking your business. Economists call the difficulties you face in leaving one service for another the "switching costs" and businesses go to great lengths to raise the switching costs they can impose on you if you have the temerity to be a disloyal customer.
So long as it's easier to coerce your loyalty than it is to earn it, companies win and their customers lose. That's where the new CFPB rule comes in.
Under this rule, you can authorize a third party—another bank, a comparison shopping site, a broker, or just your bookkeeping software—to request your account data from your bank. The bank has to give the third party all the data you've authorized. This data can include your transaction history and all the data needed to set up your payees and recurring transactions somewhere else.
That means that—for example—you can authorize a comparison shopping site to access some of your bank details, like how much you pay in overdraft fees and service charges, how much you earn in interest, and what your loans and credit cards are costing you. The service can use this data to figure out which bank will cost you the least and pay you the most.
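To make the comparison concrete, here is a hypothetical sketch of the arithmetic such a service might run once you have authorized it to see your fees, interest, and loan costs. The field names and dollar figures are invented for illustration; they are not drawn from the rule or from any real bank's API.

```typescript
// Hypothetical sketch: ranking banks by their net yearly value to one customer.
// All field names and figures below are invented for illustration.
interface BankOffer {
  name: string;
  yearlyFees: number;     // overdraft and service charges, USD per year
  interestEarned: number; // interest paid on your balances, USD per year
  loanCosts: number;      // interest charged on loans and cards, USD per year
}

// Net value: what the bank pays you minus what it charges you.
function netYearlyValue(offer: BankOffer): number {
  return offer.interestEarned - offer.yearlyFees - offer.loanCosts;
}

const offers: BankOffer[] = [
  { name: "Old Bank", yearlyFees: 240, interestEarned: 12, loanCosts: 900 },
  { name: "New Bank", yearlyFees: 0, interestEarned: 180, loanCosts: 650 },
];

// Sort best-first: the bank that costs the least and pays the most wins.
offers.sort((a, b) => netYearlyValue(b) - netYearlyValue(a));
for (const o of offers) {
  console.log(`${o.name}: $${netYearlyValue(o)} per year`);
}
```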
Then, once you've opened an account with your new best bank, you can direct it to request all your data from your old bank, and with a few clicks, get fully set up in your new financial home. All your payees transfer over, all your regular payments, all the transaction history you'll rely on at tax time. "Painless" is an admittedly weird adjective to apply to household finances, but this comes pretty darned close.
Americans lose a lot of money to banking fees and low interest rates. How much? Well, CFPB economists, using a very conservative methodology, estimate that this rule will make the American public at least $677 million better off, every year.
Now, that $677 million has to come from somewhere, and it does: it comes from the banks that are currently charging sky-high fees and paying rock-bottom interest. The largest of these banks are suing the CFPB in a bid to block the rule from taking effect.
These banks claim that they are doing this to protect us, their depositors, from a torrent of fraud that would be unleashed if we were allowed to give third parties access to our own financial data. Clearly, this is the only reason a giant bank would want to make it harder for us to change to a competitor (it can't possibly have anything to do with the $677 million we stand to save by switching).
We've heard arguments like these before. While EFF takes a back seat to no one when it comes to defending user security (we practically invented this), we reject the idea that user security is improved when corporations lock us in (and leading security experts agree with us).
This is not to say that a bad data-sharing interoperability rule wouldn't be, you know, bad. A rule that lacked the proper safeguards could indeed enable a wave of fraud and identity theft the likes of which we've never seen.
Thankfully, this is a good interoperability rule! We liked it when it was first proposed, and it got even better through the rulemaking process.
First, the CFPB had the wisdom to know that a federal finance agency probably wasn't the best—or only—group of people to design a data-interchange standard. Rather than telling the banks exactly how they should transmit data when requested by their customers, the CFPB instead said, "These are the data you need to share and these are the characteristics of a good standards body. So long as you use a standard from a good standards body that shares this data, you're in compliance with the rule." This is an approach we've advocated for years, and it's the first time we've seen it in the wild.
The CFPB also instructs the banks to fail safe: any time a bank gets a request to share your data that it thinks might be fraudulent, it has the right to block the process until it can get more information and confirm that everything is on the up-and-up.
The rule also regulates the third parties that can get your data, establishing stringent criteria for which kinds of entities qualify. It limits how they can use your data (strictly for the purposes you authorize), what they must do with it once those purposes are fulfilled (delete it forever), and what else they are allowed to do with it (nothing). There's also a mini "click-to-cancel" rule that guarantees you can instantly revoke any third party's access to your data, for any reason.
The CFPB has had the authority to make a rule like this since its founding in 2010, with the passage of the Consumer Financial Protection Act (CFPA). Back when the CFPA was working its way through Congress, the banks howled that they were being forced to give up "their" data to their competitors.
But it's not their data. It's your data. The decision about who you share it with belongs to you, and you alone.