Watanabe Tsushin (2/6): The realm does not belong to any one person
Notice: Incident report web form maintenance (2025/02/06) has been completed
Announcement of the 57th meeting of the Working Group on Drug-Resistant Bacteria [to be held February 17]
Announcement of the 40th meeting of the Fourth Expert Committee on Pesticides (closed to the public) [to be held February 14]
Announcement of the 34th meeting of the First Expert Committee on Pesticides (closed to the public) [to be held February 13]
Closing the Gap in Encryption on Mobile
It’s time to expand encryption on Android and iPhone. With governments around the world mounting constant attacks on users’ digital rights and access to the internet, taking glaring and potentially dangerous targets off people’s backs when they use their mobile phones is more important than ever.
So far, end-to-end encrypted apps like Signal, WhatsApp, and iMessage have made strides in keeping messages private on mobile devices. Encryption on the web has been widely adopted; we even declared in 2021 that “HTTPS Is Actually Everywhere.” Most web traffic is encrypted, and for a website to have a reputable presence in browsers, it has to meet certain requirements that major browsers enforce today. Mechanisms like Certificate Transparency, Cross-Origin Resource Sharing (CORS) rules, and HTTPS enforcement help prevent malicious activity from reaching users every day.
Yet mobile has always been a different and ever-expanding context. You access the internet on mobile devices through more than just the web browser. Mobile applications have far more room to spawn network requests without the user ever knowing where and when a request was sent: there is no URL bar for the user to see and check where a request is going. In some cases, apps have been known to “roll their own” cryptography rather than use standard encryption practices.
While there is much to discuss about the privacy issues of TikTok and other social media apps, for now let’s focus on encryption. In 2020, security researcher Baptiste Robert found that TikTok used its own “custom encryption,” dubbed “ttEncrypt.” Later research showed this was a weak encryption algorithm compared to simply using HTTPS. TikTok eventually replaced ttEncrypt with HTTPS, but this is one example of the many practices mobile applications can engage in with little regulation, transparency, or control by the user.
Android has made some strides to protect users’ traffic in apps, like allowing you to set private DNS. Yet Android app developers can still set a flag that permits cleartext (unencrypted) requests. Android owners should be able to block requests from apps that engage in this practice. While security settings can be difficult for users to configure themselves, this would be a valuable setting to provide, especially since users are already being bombarded on their devices with prompts to turn on features they never asked for or wanted. Blocking this flag can’t capture all cleartext traffic, since apps can control network access “below” HTTPS in the network stack, but it would be a good first step against the many apps that still use unencrypted HTTP requests.
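To make the mechanism concrete: the flag in question is part of Android’s network security configuration. A minimal sketch of a config that disallows unencrypted HTTP for all of an app’s traffic (element and attribute names are from Android’s documented network security config schema) looks like this:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/xml/network_security_config.xml: disallow cleartext (HTTP) traffic
     for every destination the app talks to -->
<network-security-config>
    <base-config cleartextTrafficPermitted="false" />
</network-security-config>
```

The catch is that this file is controlled by the app developer (referenced from the app’s manifest via `android:networkSecurityConfig`), not by the phone’s owner, which is exactly why a user-facing setting to block cleartext requests would be valuable.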
As for iOS, Apple introduced a feature called iCloud Private Relay. In their words, “iCloud Private Relay is designed to protect your privacy by ensuring that when you browse the web in Safari, no single party — not even Apple — can see both who you are and what sites you're visiting.” This helps shield your IP address from the websites you visit, making it a useful alternative for people who use VPNs for IP masking. In several countries engaging in internet censorship and digital surveillance, using a VPN can put a target on you, so it’s more pertinent than ever to be able to browse privately without setting off alarms. But Private Relay is behind an iCloud+ subscription and only available in Safari. It would be better to make it free and expand Private Relay across more of iOS, especially apps.
There are nuances to why Private Relay isn’t like a traditional VPN. The “first hop” exposes the IP address to Apple and your Internet Service Provider; however, neither party can see the website names requested. Apple is vague about the “second relay,” stating, “The second internet relay is operated by third-party partners who are some of the largest content delivery networks (CDNs) in the world.” Cloudflare is confirmed as one such third party, and its explanation goes further, noting that Private Relay is built on TLS 1.3, QUIC, and MASQUE.
The combination of protocols used in Private Relay could be approximated on Android with Cloudflare’s 1.1.1.1 app, which would be the “closest” technical match and would apply globally rather than just in the browser. A more favorable outcome would be deploying this technology on mobile in a way that doesn’t rely on a single company to distribute modern encryption. Android’s Private DNS setting allows a choice of providers, but that covers only the encrypted DNS part of the request.
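As a small illustration of what “just the encrypted DNS part” looks like, here is a sketch of a DNS-over-HTTPS lookup against Cloudflare’s public 1.1.1.1 resolver. The endpoint and `accept` header come from Cloudflare’s documented DoH JSON API; the actual network call is left commented out, since this is a sketch rather than a production resolver:

```python
import urllib.request

# Build a query URL for Cloudflare's DNS-over-HTTPS JSON API. The DNS
# question travels inside an HTTPS request, so an on-path observer sees
# only a TLS connection to the resolver, not the name being looked up.
def doh_query_url(name: str, rtype: str = "A") -> str:
    return f"https://cloudflare-dns.com/dns-query?name={name}&type={rtype}"

req = urllib.request.Request(
    doh_query_url("example.com"),
    headers={"accept": "application/dns-json"},
)
# answer = urllib.request.urlopen(req)  # would perform the encrypted lookup
```

Note what this does and doesn’t hide: the DNS question is encrypted, but the subsequent connection to the resolved server still reveals the destination IP, which is the gap relay designs like Private Relay try to close.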
VPNs are another tool that can be used to mask an IP address and circumvent censorship, especially in cases where someone distrusts their Internet Service Provider (ISP). But using VPNs for this sole purpose should start to become obsolete as modern encryption protocols are deployed to protect the user. Better encryption practices across mobile platforms would lessen the need for people to flock to potentially nefarious VPN apps that put users in danger. Android just announced a new badge program that attempts to address this issue by getting VPNs to adhere to Play Store security guidelines and Mobile Application Security Assessment (MASA) Level 2 validation. While this attempt is noted, when mass censorship is applied, users may not always go to the most reputable VPN, or may not even be able to access reputable VPNs, because Google and Apple comply with app store takedowns. So widening encryption beyond VPN usage is essential. Blocking cleartext requests by apps, allowing users to restrict an app’s network access, and expanding Apple’s Private Relay would be steps in the right direction.
There are many other privacy leaks apps can engage in that expose what you are doing. In the case of apps acting badly by either rolling their own, unverified cryptography or using HTTP, users should be able to block those apps’ network access. Just because mobile privacy is a complex problem doesn’t mean that complexity should stop progress. We can have a more private internet on our phones. “Encrypt all the things!” includes the devices we use the most to access the web and communicate with each other every day.
JCA-NET seminar announcements: consultation session on leaving X, and more
Joint statement from civil society for the AI Action Summit
Tansa launches the series “Hostage Justice: Why Were Labor Unions Targeted?”
Paraguay’s Broadband Providers Continue to Struggle to Attain Best Practices at Protecting Users’ Data
Paraguay’s five leading broadband service providers made some strides in making their privacy policies more accessible to the public, but continue to fall short in their commitments to transparency, due process in sharing metadata with authorities, and promoting human rights—all of which limits their users’ privacy rights, according to the new edition of TEDIC’s ¿Quién Defiende Tus Datos? (“Who Defends Your Data?”).
The report shows that, in general, providers operating as subsidiaries of foreign companies are making more progress in committing to user privacy than national internet providers. But the overall performance of the country’s providers continues to lag behind their counterparts in the region.
As in its four previous reports about Paraguay, TEDIC evaluated Claro, Personal, and Tigo, which are subsidiaries, and national providers Copaco and Vox.
The companies were evaluated on seven criteria: whether they provide clear and comprehensive information about how they collect, share, and store user data; require judicial authorization to disclose metadata and communication content to authorities; notify users whose data is turned over to the government; publicly take a stance to support rights protections; publish transparency reports; provide guidelines for security forces and other government bodies on how to request user information; and make their websites accessible to people with disabilities.
Tigo performed best, demonstrating 73% overall compliance with the criteria, while Vox came in last, receiving credit for complying with only 5% of the requirements.
Paraguay’s full study is available in Spanish. The following table summarizes the report’s evaluations.
Privacy, Judicial Authorization Policies Lag

The report shows that Claro, Personal, and Tigo provide relatively detailed information on data collection and processing practices, but none clearly describe data retention periods, a crucial aspect of data protection. Copaco, despite having a privacy policy, limits its scope to data collected in its applications, neglecting to address data processing practices for its services, such as internet and telephone. Vox has no publicly available privacy policy.
On the plus side, three out of the five providers in the report met all criteria in the privacy policy category. No company disclosed its policies about data collection when TEDIC’s reports began in 2017. The progress, though slow, is notable given that Paraguay doesn’t have a comprehensive data protection law—it is one of the few Latin American countries without one. There is a bill pending in Paraguay’s Parliament, but it has not yet been approved.
All five providers require a court order before handing over user information, but the report concludes that their policies don’t cover communications metadata. This is despite international human rights standards applicable to surveillance—established in the Inter-American Court of Human Rights rulings Escher v. Brazil (2009) and CAJAR v. Colombia (2023)—which hold that metadata should be protected under the same privacy guarantees as communications content.
None of the five ISPs has a policy of notifying users when their data is requested by the authorities. This lack of transparency, already identified in all previous editions of QDTD, raises significant concerns about user rights and due process protections in Paraguay.
While no provider has made a strong commitment to publicly promote human rights, Tigo met three of the four requirements for full credit in this category and Claro received half credit, owing to the policies of their parent companies rather than direct commitments by their local units. Tigo and Claro are also the companies with the most security campaigns for their users, as identified throughout the editions of ¿Quién Defiende Tus Datos?
Claro and Tigo also provide some transparency about government requests for user data, but these reports are only accessible on their parent company websites and, even then, the regional transparency reports do not always provide detailed country-level breakdowns, making it difficult to assess the specific practices and compliance rates of their national subsidiaries.
Victory! EFF Helps Defeat Meritless Lawsuit Against Journalist
Jack Poulson is a reporter, and when a confidential source sent him the police report of a tech CEO’s arrest for felony domestic violence, he did what journalists do: reported the news.
The CEO, Maury Blackman, didn’t like that. So he sued Poulson—along with Amazon Web Services, Substack, and Poulson’s nonprofit, Tech Inquiry—to try to force Poulson to take down his articles about the arrest. Blackman argued that a court order sealing the arrest allowed him to censor the internet—despite decades of Supreme Court and California Court of Appeal precedent to the contrary.
This is a classic SLAPP: strategic lawsuit against public participation. Fortunately, California’s anti-SLAPP statute provides a way for defendants to swiftly defeat baseless claims designed to chill their free speech.
The court granted Poulson’s motion to strike Blackman’s complaint under the anti-SLAPP statute on Tuesday.
In its order, the court agreed that the First Amendment protects Poulson’s right to publish and report on the incident report.
This is an important ruling.
Under Bartnicki v. Vopper, the First Amendment protects journalists who report on truthful matters of public concern, even when the information they are reporting on was obtained illegally by someone else. Without it, reporters would face liability when they report on information provided by whistleblowers that companies or the government wants to keep secret.
Those principles were upheld here: Although courts have the power to seal records in appropriate cases, if and when someone provides a copy of a sealed record to a reporter, the reporter shouldn’t be forced to ignore the newsworthy information in that record. Instead, they should be allowed to do what journalists do: report the news.
And thanks to the First Amendment, a journalist who hasn’t done anything illegal to obtain the information has the right to publish it.
The court agreed that Poulson’s First Amendment defense defeated all of Blackman’s claims. As the court said:
"This court is persuaded that the First Amendment’s protections for the publication of truthful speech concerning matters of public interest vitiate Blackman’s merits showing…in this case there is no evidence that Poulson and the other defendants knew the arrest was sealed before Poulson reported on it, and all defendants’ actions in not taking down the arrest information after Blackman informed them of the sealing order was not so wrongful or unlawful that they are not protected."
The court also agreed that CEOs like Blackman cannot rewrite history by obtaining court orders that seal unflattering information—like an arrest for felony domestic violence. Blackman argued that, because, under California law, sealed arrests are “deemed” not to have occurred for certain legal purposes, reporting that he had been arrested was somehow false—and actionable. It isn’t.
The court agreed with Poulson: statutory language that alleviates some of the consequences of an arrest “cannot alter how past events unfolded.”
Simply put, no one can use the legal system to rewrite history.
EFF is thrilled that the court agrees.
DDoSed by Policy: Website Takedowns and Keeping Information Alive
Who needs a DDoS (Denial of Service) attack when you have a new president? As of February 2nd, thousands of web pages and datasets have been removed from U.S. government agencies following a series of executive orders. The impacts span from the Department of Veterans Affairs and the Centers for Disease Control and Prevention all the way to programs like Head Start.
Government workers had just two days to carry out sweeping takedowns and rewrites due to a memo from the Office of Personnel Management. The memo cites a recent executive order attacking trans people and further stigmatizing them by forbidding words used to accurately describe sex and gender. The result was government-mandated censorship erasing these identities from a broad swath of websites, resources, and scientific research, regardless of context. This flurry of confusion comes on the heels of another executive order threatening CDC research by denying funding for government programs that promoted diversity, equity, and inclusion or climate justice. What we’re left with is an anti-science, anti-speech, and just plain dangerous fit of panic with untold impacts on the most vulnerable communities.
The good news is technologists, academics, librarians, and open access organizations rushed to action to preserve and archive the information once contained on these sites. While the memo’s deadline has passed, these efforts are ongoing and you can still help.
New administrations often revise government pages to reflect new policies, though the old pages are usually archived, not erased. These takedowns are alarming because they go beyond the usual changes in power and could deprive the public of vital information, ranging from life-saving medical research to the deadly impacts of climate change.
To help mitigate the damage, institutions like the Internet Archive provide essential tools to fight these memory holes, such as their “End of Term” archives, which include public-facing websites (.gov, .mil, etc.) in the Legislative, Executive, and Judicial branches of the government. But anyone can use the Wayback Machine for other sites and pages: if you have something that needs archiving, you can easily submit it. Submitted links are backed up and can be compared to previous versions of the site. Even if you do not have direct access to a website’s full backup or database, saving the content of a page can often be enough to restore it later. While the Wayback archive is surprisingly extensive, some sites or changes still slip through the cracks, so it is always worth submitting them to be sure the archive is complete.
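Submitting a page can even be scripted. The Wayback Machine’s public “Save Page Now” endpoint triggers a capture when you request `https://web.archive.org/save/<url>`; the sketch below builds such a request, with the network call itself commented out:

```python
import urllib.request

# Build the Wayback Machine "Save Page Now" URL for a target page.
# Requesting this URL asks the archive to capture a fresh snapshot.
def save_page_now_url(target: str) -> str:
    return "https://web.archive.org/save/" + target

capture = save_page_now_url("https://data.cdc.gov/")
# urllib.request.urlopen(capture)  # issuing this request triggers a capture
```

This is the same operation the Wayback Machine’s web form performs, which makes it easy to loop over a list of at-risk pages and archive each one.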
Academics are also in a unique position to protect established science and the historical record preserved in this public data. The Library Innovation Lab at Harvard Law School, for example, has been preserving websites for courts and law journals, including hundreds of thousands of valuable datasets from data.gov, government git repositories, and more. The initiative is also building new open-source tools so that others can make verifiable backups.
The impact of these executive orders goes beyond public-facing website content. The CDC, impacted by both executive orders, also hosts vital scientific research data. For anyone at the CDC interested in backing up vital research that isn’t public-facing, there are other roadmaps as well. Sci-Hub, a project to provide free and unrestricted access to all scientific knowledge, containing 85 million scientific articles, was kept alive by individuals downloading and seeding 850 torrents containing Sci-Hub’s 77 TB library. A community of “data hoarders”—independent archivists who declare a “rescue target” and build a “rescue team” of storage and seeders—is also archiving public datasets, like those formerly available at data.cdc.gov, which were not saved in the Internet Archive’s End of Term archive.
Dedicating time and storage to salvaging, uploading, and later rehosting critical data to keep it from going dark is not for everyone, but it is an important way to fight back against these kinds of takedowns.
Maintaining Support for Open Information

This widespread deletion of information is one of the reasons EFF is particularly concerned with government-mandated censorship in any context: it can be extremely difficult to know how exactly to comply, and it’s often easier to broadly remove huge swaths of information than to risk punishment. By rooting out inconvenient truths and inconvenient identities, untold harm is done to the people most removed from power, and everyone’s well-being is diminished.
Proponents of open information have won hard-fought censorship battles in the past, and those victories helped create the tools and infrastructure needed to protect us in this moment. The global, collaborative efforts afforded by digital technology mean the internet rarely forgets, thanks to the tireless work of institutions, communities, and individuals in the face of powerful and erratic censors.
We appreciate those who have stepped in. These groups need constant support, especially our allies who have had their work threatened, and so EFF will continue to advocate for both their efforts and for policies which protect progress, research, and open information.