Ninth Circuit Gets It: Interoperability Isn’t an Automatic First Step to Liability
A federal appeals court just gave software developers, and users, an early holiday present, holding that software updates aren't necessarily "derivative," for purposes of copyright law, just because they are designed to interoperate with the software they update.
This sounds kind of obscure, so let’s cut through the legalese. Lots of developers build software designed to interoperate with preexisting works. This kind of interoperability is crucial to innovation, particularly in a world where a small number of companies control so many essential tools and platforms. If users want to be able to repair, improve, and secure their devices, they must be able to rely on third parties to help. Trouble is, Big Tech companies want to be able to control (and charge for) every possible use of the devices and software they “sell” you – and they won’t hesitate to use the law to enforce that control.
Courts shouldn't assist, but unfortunately a federal district court did just that in the latest iteration of Oracle v. Rimini. Rimini provides support to improve the use and security of Oracle products, so customers don't have to depend entirely on Oracle itself. Oracle doesn't want this kind of competition, so it sued Rimini for copyright infringement, arguing that a software update Rimini developed was a "derivative work" because it was intended to interoperate with Oracle's software, even though the update didn't use any of Oracle's copyrightable code. Derivative works are typically things like a movie based on a novel, or a translation of that novel. Here, the only "derivative" aspect was that Rimini's code was designed to interact with Oracle's code.
The district court sided with Oracle, setting a dangerous precedent. If a work is derivative, it may infringe the copyright in the preexisting work from which it, well, derives. For decades, software developers have relied, correctly, on the settled view that a work is not derivative under copyright law unless it is substantially similar to a preexisting work in both ideas and expression. Thanks to that rule, software developers can build innovative new tools that interact with preexisting works, including tools that improve privacy and security, without fear that the companies that hold rights in those preexisting works would have an automatic copyright claim to those innovations.
Rimini appealed to the Ninth Circuit on multiple grounds. EFF, along with a diverse group of stakeholders representing consumers, small businesses, software developers, security researchers, and the independent repair community, filed an amicus brief in support, explaining that the district court ruling on interoperability was not just bad policy, but also bad law.
The Ninth Circuit agreed:
In effect, the district court adopted an “interoperability” test for derivative works—if a product can only interoperate with a preexisting copyrighted work, then it must be derivative. But neither the text of the Copyright Act nor our precedent supports this interoperability test for derivative works.
The court goes on to give a primer on the legal definition of a derivative work, but the key point is this: a work is derivative only if it "substantially incorporates the other work."
Copyright already reaches far too broadly, giving rightsholders extraordinary power over how we use everything from music to phones to televisions. This holiday season, we’re raising a glass to the judges who sensibly reined that power in.
Customs & Border Protection Fails Baseline Privacy Requirements for Surveillance Technology
U.S. Customs and Border Protection (CBP) has failed to address six out of six main privacy protections for three of its border surveillance programs—surveillance towers, aerostats, and unattended ground sensors—according to a new assessment by the Government Accountability Office (GAO).
In the report, GAO compared the policies for these technologies against six of the key Fair Information Practice Principles that agencies are supposed to use when evaluating systems and processes that may impact privacy, as dictated by both Office of Management and Budget guidance and the Department of Homeland Security's own rules.
These include:
- Data collection. "DHS should collect only PII [Personally Identifiable Information] that is directly relevant and necessary to accomplish the specified purpose(s)."
- Purpose specification. "DHS should specifically articulate the purpose(s) for which the PII is intended to be used."
- Information sharing. "Sharing PII outside the department should be for a purpose compatible with the purpose for which the information was collected."
- Data security. "DHS should protect PII through appropriate security safeguards against risks such as loss, unauthorized access or use, destruction, modification, or unintended or inappropriate disclosure."
- Data retention. "DHS should only retain PII for as long as is necessary to fulfill the specified purpose(s)."
- Accountability. "DHS should be accountable for complying with these principles, including by auditing the actual use of PII to demonstrate compliance with these principles and all applicable privacy protection requirements."
These baseline privacy elements for the three border surveillance technologies were not addressed in any "technology policies, standard operating procedures, directives, or other documents that direct a user in how they are to use a Technology," according to GAO's review.
CBP operates hundreds of surveillance towers along both the northern and southern borders, some of which are capable of capturing video more than seven miles away. The agency has six large aerostats (essentially tethered blimps) that use radar along the southern border, with others stationed in the Florida Keys and Puerto Rico. The agency also operates a series of smaller aerostats that stream video in the Rio Grande Valley of Texas, with the newest one installed this fall in southeastern New Mexico. And the report notes deficiencies with CBP's linear ground detection system, a network of seismic sensors and cameras that are triggered by movement or footsteps.
The GAO report underlines EFF's concerns that the privacy of people who live and work in the borderlands is violated when federal agencies deploy militarized, high-tech programs to confront unauthorized border crossings. The rights of border communities are too often treated as acceptable collateral damage in pursuit of border security.
CBP defended its practices by saying that it does, to some extent, address the FIPPs in its Privacy Impact Assessments, documents written for public consumption. GAO rejected this claim, saying that these assessments do not adequately instruct agency staff on how to protect privacy when deploying the technologies and using the data that has been collected.
In its recommendations, the GAO calls on the CBP Commissioner to "require each detection, observation, and monitoring technology policy to address the privacy protections in the Fair Information Practice Principles." But EFF calls on Congress to hold CBP to account and stop approving massive spending on border security technologies that the agency continues to operate irresponsibly.