Speaking Freely: Ethan Zuckerman

2 months ago

Ethan Zuckerman is a professor at the University of Massachusetts at Amherst, where he teaches Public Policy, Communication and Information. He is starting a new research center called the Institute for Digital Public Infrastructure. Over the years, he’s been a tech startup guy (with Tripod.com), a non-profit founder (Geekcorps.org) and co-founder (Globalvoices.org), and throughout it all, a blogger.

This interview has been edited for length and clarity.

York: What does free speech or free expression mean to you? 

It is such a complicated question. It sounds really easy, and then it gets really complicated really quickly. I think freedom of expression is this idea that we want to hear what people think and feel and believe, and we want them to say those things as freely as possible. But we also recognize at the same time that what one person says has a real effect on what other people are able to say or feel comfortable saying. So there’s a naive version of freedom of expression which sort of says, “I’m going to say whatever I want all the time.” And it doesn’t do a good job of recognizing that we are in community. And that the ways in which I say things may make it possible or not possible for other people to say things. 

So I would say that freedom of expression is one of these things that, on the surface, looks super simple. You want to create spaces for people to say what they want to say and speak their truths no matter how uncomfortable they are. But then you go one level further than that and you start realizing, oh, okay, what I’m going to do is create spaces that are possible for some people to speak and not for other people to speak. And then you start thinking about how you create a multiplicity of spaces and how those spaces interact with one another. So it’s one of these fractally complicated questions. The first cut at it is super simple. And then once you get a little bit into it it gets incredibly complicated. 

York: Let’s dig into that complexity a bit. You and I have known each other since about 2008, and the online atmosphere has changed dramatically in that time. Back then, I would say, we were both pretty excited about how the internet was able to bring people together across borders, across affinities, etc. What are some of the changes you’ve seen, and how do you think we can preserve a sense of free expression online while also countering some of these downsides or harms? 

Let’s start with the context you and I met in. You and I both were very involved in early years with Global Voices. I’m one of the co-founders along with Rebecca MacKinnon and a whole crew of remarkable people who started this online community as a way of trying to amplify voices that we don’t hear from very often. A lot of my career on the internet has been about trying to figure out whether we can use technology to help amplify voices of people in parts of the world where most of us haven’t traveled, places that we seldom hear from, places that don’t always get attention in the news and such. So Rebecca and I, at the beginning of the 2000s, got really interested in ways that people were using blogs and new forms of technology to report on what was going on. And for me it was places like Sub-Saharan Africa. Rebecca was interested in places like North Korea and sort of getting a picture of what was going on in some of those places, through the lens, often, of Chinese business people who were traveling to those places. 

And we started meeting bloggers who were writing from Iraq, which was under US attack at that point. Who were writing from countries like Madagascar, which had a lot going on politically, but almost no one knew about it or was hearing about it. So you and I started working in this context of, can we amplify these voices? Can we help people speak freely and have an audience? Because that’s one of these interesting problems— you can speak freely if you’re anonymous and on an onion site, etc, but no one’s going to hear you. So can we help people not just speak freely, but can we help find an audience associated with it? And some of the work that I was doing when you and I first met was around things like anonymous blogging with WordPress and Tor. And literally building guides to help people who are whistleblowers in closed societies speak online. 

You and I were also involved with the Berkman Center at Harvard, and we were both working on questions of censorship. One of the things that’s so interesting for me—to sort of go back in history—is to think about how censorship has changed online. Who those opponents to speech are. We started with the assumption that it was going to be the government of Saudi Arabia, or the government of Tunisia, or the government of China, who was going to block certain types of speech at the national level. You know, “You can’t say this. You’re going to be taken down, or, at worst, arrested for saying this.” We then pivoted, to a certain extent, to worries about censorship by companies, by platforms. And you did enormous amounts of work on this! You were at war with Facebook, now Meta, over their work on the female-presenting nipple. Now looking at the different ways which companies might decide that something was allowable speech or unallowable speech based on standards that had nothing to do with what their users thought, but really what the platforms’ decisions were. 

Somewhere in the late 20-teens, I think the battlefield shifted a little bit. And I think there are still countries censoring the internet, there are still platforms censoring the internet, but we got much better at censorship by each other. And, for me, this begins in a serious way with Gamergate. Where you have people—women, critics of the gaming industry—talking about feminist counter-narratives in video games. And the reaction from certain members of an online community is so hostile and so abusive, there’s so much violent misogyny aimed at people like Anita Sarkeesian and other leaders in this field, that it’s another form of silencing speech. Basically the consequences for some people speaking are now so high, like the amount of abuse you’re going to suffer, whether it’s swatting, whether it’s people releasing a video game to beat you up—and that’s what happened to Anita—it doesn’t silence you in the same way that, like, the Great Firewall or having your blog taken down might silence you. But the consequences for speech get so high that they really shift and change the speech environment. And part of what’s so tricky about this is some of the people who are using speech to silence speech talk about their right to free speech and how free speech protects their ability to do this. And in some sense, they’re right. In another sense, they’re very wrong. They’re using speech to raise the consequences for other people’s speech and make it incredibly difficult for certain types of speech to take place. 

So I feel like we’ve gone from these very easy enemies—it’s very easy to be pissed off at the Saudis or the Chinese, it’s really satisfying to be pissed off at Facebook or any of the other platforms. But once we start getting to the point where we’re sort of like, hey, your understanding of free speech is creating an environment where it’s very hard or it’s very dangerous for others to speak, that’s where it gets super complicated. And so I would say I’ve gone from a firm supporter of free speech online, to this sort of complicated multilayered, “Wow, there’s a lot to think about in this” that I sort of gave you based on your opening question. 

York: Let’s unpack that a bit, because it’s complicated for me as well. I mean, over the years my views have also shifted. But right now we are seeing an uptick in attempts to censor legitimate speech: the various bills that we’re seeing across the African continent against LGBTQ+ speech, Saudi Arabia is always an evergreen example, Sudan just shut down the internet again, Israel shut down the internet in Palestine, Iran still has some sort of ongoing shutdown, etc etc, I mean name a country and there’s probably something ongoing. And, of course, including the US with the Kids Online Safety Act (KOSA), which will absolutely have a negative impact on free expression for a lot of people. And of course we’re also seeing abortion-related speech being chilled in the US. So, with all of those examples, how do we separate the questions of how we deal with this idea of crowding or censoring each other’s speech from the very real, persistent threats to speech that we’re seeing? 

I think it is totally worthwhile to mention that actors in this situation have different levels of power. So when you look at something like the Kids Online Safety Act (KOSA), you see the real danger of essentially leaving what is prohibited speech up to individual state attorneys general. And we are seeing different American state attorneys general essentially say we are going to use this to combat “transgenderism,” we’re going to use this to combat—what they see as—the “LGBTQ agenda”, but a lot of the rest of us see as humanity and people having the ability to express their authentic selves. When you have a state essentially saying, “We’re going to censor content accessible to people under 18,” first of all, I don’t think it will pass Supreme Court muster. I think even under the crazy US Supreme Court at the moment, that’s actually going to get challenged successfully. 

When I talk about this progression from state censorship to platform censorship to individual censorship, there is a decreasing amount of power. States have guns, they can arrest you. There’s a lot of things Facebook can do to you, but they can’t, at this point, arrest you. They do have enormous power in terms of large swaths of the online environment, and we need to hold that sort of power accountable as well. But these things have to be an “and”, not an “or.” 

And, at the same time, as we are deeply concerned about state power and we’re deeply concerned about platform power, we also have to recognize that changes to a speech environment can make it incredibly difficult for people to participate or not participate. So one of the examples of this, in many ways, is changes to Twitter under Elon Musk. Where technical changes as well as moderation changes have made this a less safe space for a lot of people. And under the heading of free speech, you now have an environment where it is a whole lot easier to be harassed and intimidated to the point where it may not be easy to be on the platform anymore. Particularly if you are, say, a Muslim woman coming from India, for instance. This is a subject I’m spending a lot of time looking at with my friend and student Ifat Gazia: how Hindutva activists are using Twitter to gang up on Kashmiri women and create circumstances where it’s incredibly unsafe and unpleasant for them to be speaking, where anything they say will turn into misogynistic trolling as well as attempts to get them kicked off the platform. And so, what’s become a free speech environment for Hindu nationalism turns out to make that a much less safe environment for the position that Kashmir should be independent or that Muslims should be equal Indian citizens. And so, this then takes us to this point of saying we want either the State or the platform to help us create a level playing field, help us create a space in which people can speak. But then suddenly we have both the State and the platform coming in and saying, “you can say this, and not say this.” And that’s why it gets so complicated so fast. 

York: There are many challenges to anonymous speech happening around the world. One example that comes to mind is the UK’s Online Safety Act, which digs into it a bit. We also both have written about the importance of anonymity for protecting vulnerable communities online. Have your views on anonymity or pseudonymity changed over the years? 

One of the things that was so interesting about early blogging was that we started seeing whistleblowers. We started seeing people who had information from within governments finding ways to express what was going on, within their states and within their countries. And I think to a certain extent, kind of leading up to the rise of WikiLeaks, there was this sort of idea that anonymity was almost a mark of authenticity. If you had to be anonymous perhaps it was because you were really close to the truth. Many of us took leaks very seriously. We took this idea that this was a leak, this was the unofficial narrative, we should pay an enormous amount of attention to it. I think, like most things in a changing media environment, the notion of leaking and the notion of protected anonymity has gotten weaponized to a certain extent. I think, you know, WikiLeaks is its own complicated narrative where things which were insider documents within, say, Kenya, early on in WikiLeaks’ history, sort of turned into giant document dumps with the idea that there must be something in here somewhere that’s going to turn out to be important. And, often, there was something in there, and there was also a lot of chaff in there. I think people learned how to use leaking as a strategy. And now, anytime you want people to pay attention to a set of documents, you say, I’m going to go ahead and “leak” them. 

At the same time, we’ve also seen people weaponize anonymity. And a story that you and I are both profoundly familiar with is Gay Girl in Damascus. Where you had someone using anonymity to claim that she was a lesbian living in a conservative community and talking about her experiences there. But of course it turned out to be a middle-aged male Scotsman who had taken on this identity in the hopes of being taken more seriously. Because, of course, everyone knows that middle-aged white men never get a voice in online dialogues, he had to make himself into a queer, Syrian woman to have a voice in that dialogue. Of course, the real amusing part of that, and what we found out in unwinding that situation, was that he was in a relationship with another fake lesbian who was another dude pretending to be a lesbian to have a voice online. So there’s this way in which we went from this very sort of naive, “it’s anonymous, therefore it’s probably a very powerful source,” to, “it’s anonymous, it’s probably yet another troll.” 

I think the answer is anonymity is really complicated. Some people really do need anonymity. And it’s really important to construct ways in which people can speak freely. But anyone who has ever worked with whistleblowers—and I have—will tell you that finding a way to actually put your name to something gives it vastly more power. So I think anonymity remains important, we’ve got to find ways to defend and protect it. I think we’re starting to find that the sort of Mark Zuckerberg idea, “you get rid of anonymity and the web will be wonderful”, is complete crap. There’s many communities that end up being very healthy with persistent pseudonyms or even anonymity. It has more to do with the space and the norms associated with it. But anonymity is neither the one size fits all solution to making whistleblowing safe, nor is it the “oh no, if you let anonymity in your community will collapse.” Like everything in this space, it turns out to be complicated and nuanced. And both more and less important than we tend to think. 

York: Tell me about an early experience that shaped your views on free expression. 

The story of Hao Wu is the story I want to tell here. When I think about freedom of expression online, I find myself thinking a lot about his story. Hao Wu is a documentary filmmaker. At this point, a very accomplished documentary filmmaker. He has made some very successful films, including one called The People’s Republic of Desire about Chinese live-streaming, which has gotten a great deal of celebration. He has a new film out called 76 Days about the lockdown of Wuhan. But I got to know him very indirectly, and it was from the fact that he was making a film in China about the phenomenon of underground Christian churches. And he got arrested and held for five months, and we knew about him through the Global Voices community because he had been an active blogger. We’d been paying attention to some of the work he was doing and suddenly he’d gone silent. 

I ended up working with Rebecca MacKinnon, who speaks Chinese and was in touch with all the folks involved, and I was doing the websites and such, building a free Hao Wu blog. And using that, and sort of platforming his sister, as a chance to advocate for his release. And what was so fascinating about this was Rebecca and I spent months writing about and talking about what was going on, and encouraging his sister to speak out, but she—completely understandably—was terrified about the consequences for her own life and her own career and family. At a certain point she was willing to write online and speak out. But that experience taught me that something which feels very straightforward and easy from your perspective, miles and miles away from the political situation (here’s this young man who is a filmmaker and a blogger and clearly a smart, interesting person; he should be able to speak freely; of course we’re going to advocate for his release) looks very different up close. Talking to his family, we saw the genuine terror his sister felt: that her life could be entirely transformed, and transformed negatively, by advocating for something as simple as her brother’s release. 

It’s interesting, I think about our mutual friend Alaa Abd El-Fattah, who has spent most of his adult life in Egyptian prisons, getting detained again and again and again. His family, his former partner, and many of his friends have spent years and years and years advocating for him. This whole process of advocating for someone’s ability to speak, advocating for someone’s ability to take political action, advocating for someone’s ability to make art—the closer you get to the situation, the harder it gets. Because the closer you are to the situation, the more likely that the injustice that you’re advocating to have overturned is one that you’re experiencing as well. And it’s really interesting. I think it makes it very easy to advocate from a distance, and often much harder to advocate when you’re much closer to a situation. I think any time we find ourselves yelling about something on the other side of the world, it’s a good moment to check and ask: are the people directly affected by this yelling too? Are they not yelling because the danger is so high? Or are they not yelling because maybe we misunderstand, and are advocating for something that seems right and seems obvious but is actually much more complicated than we might otherwise think? 

York: Your lab is advocating for what you call a pluraverse. So you recognize that all these major platforms are going to continue to exist, people are going to continue to use them, but as we’re seeing a multitude of mostly decentralized platforms crop up, how do we see the future of moderation on those platforms? 

It’s interesting, I spend a ton of my time these days going out and advocating for a pluraverse vision of the internet. And a lot of my work is about both setting up small internet communities with very specific foci and thinking about an architecture that allows for a very broad range of experiences. One thing I’ve found in all this is that small platforms often have much more restrictive rules than you would expect, and often for the better. And I’ll give a very tangible example. 

I am a large person. I am, for the first time in a long time, south of 300 pounds. But I have hovered between 290 and 310 pounds for most of my adult life. And I started running about six months ago. I was inspired by a guy named Martinus Evans, who ran his first marathon at 380 pounds, and started a running club called the Slow AF Running Club, which has a very active online community and advocates for fitness and running at any size. And so I now log on to this group probably three or four times a week to log my runs, get encouragement, etc. I had to write an essay to join this community. I had to sign on to an incredible set of rules, including no weight talk, no weight loss talk, no body talk. All sorts of things. And you might say, I have freedom of speech! I have freedom of expression! Well, I’m choosing to set that aside so that I can be a member of this community and get support in particular ways. And in a pluraverse, if I want to talk about weight loss or bodies or something like that I can do it somewhere else! But to be a part of this extremely healthy online community that’s really helping me out a lot, I have to sort of agree and put certain things in a box. 

And this is what I end up referring to as “small rooms.” Small rooms have a purpose. They have a community. They might have a very tight set of speech regulations. And they’re great—for that specific conversation. They’re not good for broader conversations. If I want to advocate for body positivity. If I want to advocate for healthy at any weight, any number of other things, I’m going to need to step into a bigger room. I’m going to need to go to Twitter or Facebook or something like that. And there the rules are going to be very different. They’re going to be much broader. They’re going to encourage people to come back and say, “Shut up you fat fuck.” And that is in fact what happens when you encounter some of these things on a space like Reddit. So this world of small rooms and big rooms is a world in which you might find yourself advocating for very tight speech restrictions if the community chooses them on specific platforms. And you might be advocating for very broad open rules in the large rooms with the notion that there’s always going to be conflict and there’s a need for moderation. 

Here is one of the problems that always comes up in these spaces. What happens if the community wants to have really terrible rules? What if the community is KiwiFarms and the rules are we’re going to find trans people and we’re going to harass them, preferably to death? What if that tiny room is Stormfront and we’re going to party like it’s 1939? We’re going to go right back to white nationalism and Christian nationalism and anti-Jewish and anti-Muslim hatred? And things get really tricky when the group wants to trade Child Sexual Abuse Material (CSAM), because they certainly do. Or they want to create nonconsensual sexual imagery? What if it’s a group that wants to make images of Taylor Swift doing lots of things that she has never done or certainly has not circulated photos of? 

So I’ve been trying to think about this architecturally, and the way I want to handle it is to have the friendly neighborhood algorithm shop. The friendly neighborhood algorithm shop lets you do two things. It lets you view social media on a client that you control, through a set of algorithms that you care about. So you can go in and say, “I don’t want any politics today,” or “I want politics, but only highly-verified news,” or “frankly, today give me nothing but puppies.” I think you should have the ability to choose algorithms that are going to filter your media, and choose to use them that way. But I also think the friendly neighborhood algorithm shop needs to serve platforms. And I think some platforms may say, “Hey, we’re going to have this set of rules and we’re going to enforce them algorithmically, and here are the ones we’re going to enforce by hand.” And I think certain algorithms are probably going to become de rigueur. 
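The "algorithm shop" described above can be sketched as a set of user-chosen filters composed client-side over a raw feed. This is a purely illustrative sketch, not any real platform's API; the post fields (`topics`, `verified`) and the filter names are invented for the example.

```python
# Hypothetical sketch: the user picks filters from an "algorithm shop"
# and the client applies them locally to the raw feed.

def no_politics(post):
    # Assumes posts carry topic labels supplied upstream.
    return "politics" not in post["topics"]

def verified_news_only(post):
    # Non-news posts pass; news posts must be verified.
    return "news" not in post["topics"] or post.get("verified", False)

def only_puppies(post):
    return "puppies" in post["topics"]

def apply_filters(feed, filters):
    """Return only the posts that pass every chosen filter."""
    return [post for post in feed if all(f(post) for f in filters)]

feed = [
    {"id": 1, "topics": ["politics"], "verified": False},
    {"id": 2, "topics": ["news", "politics"], "verified": True},
    {"id": 3, "topics": ["puppies"]},
]

# "Frankly, today give me nothing but puppies":
print([p["id"] for p in apply_filters(feed, [only_puppies])])  # [3]
```

The same mechanism serves both sides of the shop: a user stacks filters for a personal view, while a platform could run a mandatory filter set over everything it hosts.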

I think having a check for known CSAM is probably a bare minimum for running a responsible platform these days. And making the sorts of tools that Facebook and others have created to scan large sets of images for known CSAM available to even small platform operators is probably a very helpful thing to do. I don’t think you’re going to require someone to do this for a Mastodon node, but I think it’s going to be harder and harder to run a Mastodon node if you don’t have some of those basic protections in place. Now this gets real hard really quickly. It gets real hard because we know that some other databases out there—including databases of extremist and terrorist content—are not reviewable. We are concerned that those databases may be blocking content that is legitimate political expression, and we need to figure out ways to audit them and make sure that they’re used correctly. We are also, around CSAM specifically, starting to experience a wave of people generating novel CSAM that may not involve an actual child but is instead a recombination of images to create new scenarios. I’ve got to be honest with you, I don’t know what we’re going to do there. I don’t know how we anticipate it and block it; I don’t even know the legal status of blocking some of that imagery where there is not an actual child harmed. 
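Mechanically, the known-image check described above amounts to comparing a fingerprint of each upload against a shared database of fingerprints of known prohibited material. Real deployments use perceptual hashes (such as Microsoft's PhotoDNA or Meta's PDQ) so that resized or re-encoded copies still match; the sketch below substitutes a plain SHA-256 digest purely to show the flow, and the database contents are invented.

```python
import hashlib

# Hypothetical shared database of fingerprints of known prohibited images.
# Real systems use perceptual hashes rather than cryptographic ones, so
# that trivially altered copies of an image still match.
known_hashes = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def fingerprint(image_bytes):
    # Stand-in for a perceptual hash; exact-match only.
    return hashlib.sha256(image_bytes).hexdigest()

def screen_upload(image_bytes):
    """Return True if the upload matches the known-image database."""
    return fingerprint(image_bytes) in known_hashes

print(screen_upload(b"known-bad-image-bytes"))  # matched: block and report
print(screen_upload(b"an-ordinary-photo"))      # no match: allow
```

Note that this design only catches material already in the database, which is exactly why the novel, generated imagery discussed above falls outside it.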

So these aren’t complete solutions. But I think getting to the point where we’re running a lot of different communities, we have an algorithmic toolkit that’s available to try to do some of that moderation that we want around the community, and there is an expectation that you’re doing that work. And if you’re not, it may be harder and harder to keep that community up and running and have people interact and interoperate with you. I think that’s where I find myself doing a lot of thinking and a lot of advocacy these days. 

We did a piece a few months ago called “The Three Legged Stool,” which is our manifesto for how to do a pluraverse internet and also have moderation and governability. It’s this sort of idea that you want to have quite a bit of control through what we call the loyal client, but you also want the platforms to have the ability to use these sorts of tools. So you’ve got folks out there who are basically saying, “Oh no, Mastodon is going to become a cesspit of CSAM.” And, you know, there’s some evidence of that. We’re starting to see some pockets of that. The truth is, I don’t think Mastodon is where it’s mostly happening. I think it’s mostly on much more closed channels. But something we’ve seen from day one is that when you have the ability to do user-generated content, you’re going to get pornography, and some of that pornography is going to go beyond acceptable bounds. And you’re going to end up with that line between pornography and other forms of imagery that are legally prohibited. So there’s gotta be some architectural solution, and I think at some point, running a node without having thought about those technical and architectural solutions is going to start feeling deeply irresponsible. And I think there may be ways in which not only does it end up being irresponsible, but people may end up refusing services to you if you’re not putting those basic protections into place. 

York: Do you have a free speech or free expression hero? 

Oh, that’s interesting. I mean I think this one is probably one that a lot of people are going to say, but it’s Maria Ressa. The place where free expression feels absolutely the most important to defend, to me, is in holding power to account. And what Maria was doing with Rappler in the Philippines was trying to hold an increasingly autocratic government responsible for its actions. And in the process she found herself facing very serious consequences—imprisonment, loss of employment, those sorts of things—and managed to find a way to turn that fight into something that called an enormous amount of attention to the Duterte government and opened global conversations about how important it is to protect journalistic freedom of expression. So I’m not saying that journalistic freedom of expression is the only freedom of expression that’s important, I think enormous swaths of freedom of expression are important, but I think it’s particularly important. And I think freedom of expression in the face of real power and real consequences is particularly worth lauding and praising. And I think Maria has done something very interesting, which is that she has implicated a whole bunch of other actors, not just the Philippines government, but also Facebook and also the sort of economic model of surveillance capitalism. And she encouraged people to think about how all of these are playing into freedom of expression conversations. So I think that ability to take a struggle where the consequences for you are very personal and very individual and turn it into a global conversation is incredibly powerful.

Jillian C. York


Podcast Episode: Chronicling Online Communities

2 months 1 week ago

From Napster to YouTube, some of the most important and controversial uses of the internet have been about building community: connecting people all over the world who share similar interests, tastes, views, and concerns. Big corporations try to co-opt and control these communities, and politicians often promote scary narratives about technology’s dangerous influences, but users have pushed back against monopoly and rhetoric to find new ways to connect with each other.


(You can also find this episode on the Internet Archive and on YouTube.)

Alex Winter is a leading documentarian of the evolution of internet communities. He joins EFF’s Cindy Cohn and Jason Kelley to discuss the harms of behavioral advertising, what algorithms can and can’t be blamed for, and promoting the kind of digital literacy that can bring about a better internet—and a better world—for all of us. 

In this episode you’ll learn about: 

  • Debunking the monopolistic myth that communicating and sharing data is theft. 
  • Demystifying artificial intelligence so that it’s no longer a “black box” impervious to improvement. 
  • Decentralizing and democratizing the internet so more diverse people can push technology, online communities, and our world forward. 
  • Finding a nuanced balance between free speech and harm mitigation in social media. 
  • Breaking corporations’ addiction to advertising revenue derived from promoting disinformation. 

Alex Winter is a director, writer and actor who has worked across film, television and theater. Best known on screen for “Bill & Ted’s Excellent Adventure” (1989) and its sequels as well as “The Lost Boys” (1987), “Destroy All Neighbors” (2024) and other films, he has directed documentaries including “Downloaded” (2013) about the Napster revolution; “Deep Web” (2015) about the online black market Silk Road and the trial of its creator Ross Ulbricht; “Trust Machine” (2018) about the rise of bitcoin and the blockchain; and “The YouTube Effect” (2022). He also has directed critically acclaimed documentaries about musician Frank Zappa and about the Panama Papers, the biggest global corruption scandal in history and the journalists who worked in secret and at great risk to break the story.   

Resources: 

What do you think of “How to Fix the Internet?” Share your feedback here

Transcript

ALEX WINTER
I think that people keep trying to separate the Internet from any other social community or just society, period. And I think that's very dangerous because I think that it allows them to be complacent and to allow these companies to get more powerful and to have more control, and they're disseminating all of our information. Like, that's where all of our news comes from, how anyone understands what's going on on the planet.

And I think that's the problem, is I don't think we can afford to separate those things. We have to understand that it's part of society and deal with making a better world, which means we have to make a better internet.

CINDY COHN
That’s Alex Winter. He’s a documentary filmmaker who is also a deep geek.  He’s made a series of films that chronicle the pressing issues in our digital age.  But you may also know him as William S. Preston, Esquire - aka Bill of the Bill and Ted movies. 

I’m Cindy Cohn, the executive director of the Electronic Frontier Foundation.

JASON KELLEY
And I’m Jason Kelley, EFF’s Activism Director. This is our podcast series, How to Fix the Internet. 

CINDY COHN
On this show, we’re trying to fix the internet – or at least trying to envision what the world could look like if we get things right online. You know, at EFF we spend a lot of time pointing out the way things could go wrong – and then of course jumping in to fight when they DO go wrong. But this show is about envisioning – and hopefully helping create – a better future.

JASON KELLEY
Our guest today, Alex Winter, is an actor and director and producer who has been working in show business for most of his life. But as Cindy mentioned, in the past decade or so he has become a sort of chronicler of our digital age with his documentary films. In 2013, Downloaded covered the rise and fall, and lasting impact, of Napster. 2015’s Deep Web – 

CINDY COHN
Where I was proud to be a talking head, by the way. 

JASON KELLEY
– is about the dark web and the trial of Ross Ulbricht who created the darknet market the Silk Road. And 2018’s Trust Machine was about blockchain and the evolution of cryptocurrency. And then most recently, The YouTube Effect looks at the history of the video site and its potentially dangerous but also beneficial impact on the world. That’s not to mention his documentaries on The Panama Papers and Frank Zappa. 

CINDY COHN
Like I said in the intro, looking back on the documentaries you’ve made over the past decade or so, I was struck with the thought that you’ve really become this chronicler of our digital age – you know, capturing some of the biggest online issues, or even shining a light a bit on some of the corners of the internet that people like me might live in, but others might not see so much. Where does that impulse come from for you?

ALEX WINTER
I think partly my age. I came up, obviously, before the digital revolution took root, and was doing a lot of work around the early days of CGI and had a lot of friends in that space. I got my first computer probably in ‘82 when I was in college, and got my first Mac in ‘83, got online by ‘84, dial-up era and was very taken with the nascent online communities at that time, the BBS and Usenet era. I was very active in those spaces. And I'm not at all a hacker, I was an artist and I was more invested in the spaces in that way, which a lot of artists were in the eighties and into the nineties, even before the web.

So I was just very taken with the birth of internet based communities and the fact that it was such a democratized space, and I mean that, you know, literally – that it was such an interesting mix of people from around the world who felt free to speak about whatever topics they were interested in. There were these incredible people from around the world who were talking about politics and art and everything in an extremely robust way.

But it really seemed clear to me that this was the beginning of something, and so my interest from the doc side has always been charting the internet in terms of community, and what the impact of that community is on different things, either political or whatever. And that's why my first doc was about Napster, because, you know, fast forward to 1998, which for many people is ancient history, but for us was the future.

And you're still in a modem dial-up era, and you now have an online community that has over a hundred million people on it in real time around the world who could search each other's hard drives and communicate. What made me want to make docs, I think, was that Napster was the beginning of realizing this disparity between the media's, or the news', or the public's perception of what the internet was and what my experience was.

Where Shawn Fanning was kind of being tarred as this pirate and criminal. And while there were obviously ethical considerations with Napster in terms of the distribution of music, that was not my experience. My experience was this incredibly robust community, and that had extreme validity and significance on a sort of human scale.

And that's, I think, what really prompted me to start telling stories in this space. I think if anyone's interested in doing anything, including what you all do there, it's because you feel like someone else isn't saying what you want to be said, right? And so you're like, well, I better say it because no one else is saying it. So I think that was the inspiration for me to spend more time in this space telling stories here.

CINDY COHN
That's great. I mean, I do, and the stuff I hear in this is that, you know, first of all, the internet kind of erased distance so you could talk to people all over the world from this device in your home or in one place. And that people were really building community. 

And I also hear this, in terms of Napster, this huge disconnect between the kind of business model view of music, and music fans’ views of music. One of the most amazing things for me was realizing that I could find somebody who had a couple of songs that I really liked and then look at everything else they liked. And it challenged this idea that only kind of professional music critics who have a platform can suggest music to you – it literally felt like a dam broke, and it opened up a world of music. It sounds like that was your experience as well.

ALEX WINTER
It was, and I think that really aptly describes the almost addictive fascination that people had with Napster, and the confusion, even retrospectively, that that addiction came from theft, from this desire to steal in large quantities. I mean, obviously you had kids in college dorm rooms pulling down gigabytes of music, but the pull, the attraction to Napster was exactly what you just said – like, I would find friends in Japan and Africa and Eastern Europe who had some weird Coltrane bootleg that I'd never heard, and then I was like, oh, what else do they have? And then here's what I have, and I have a very eclectic music collection.

Then you start talking about art, then you start talking about politics, because it was a very robust forum. So everyone was talking to each other. So it really was community, and I think that gets lost, because the narrative wants to remain the narrative, in terms of gatekeepers, in terms of how capitalism works. And that power dynamic was so completely threatened by Napster that, you know, the wheels immediately cranked into gear to sort of create a narrative that was, if you use this, you're just a terrible human being.

And of course what it created was the beginning of this kind of online rebellion, where people who before probably didn't think of themselves as technical, or even that interested in technology, were saying, well, I'm not this thing that you're saying I am, and now I'm really going to rebel against you. Now I'm really going to dive into this space. And I think that it actually created more people sort of entering online communities and building online communities, because they didn't feel like they were understood or being adequately represented.

And that led all the way to the Arab Spring and Occupy, and so many other things that came up after that.

JASON KELLEY
The communities angle that you're talking about is probably really, I think, useful to our audience. Because I think they probably find themselves – I certainly find myself – in a lot of the kinds of communities that you've covered. Which often makes me think, like, how is this guy inside my head?

How do you think about the sort of communities that you need to, or want to chronicle. I know you mentioned this disconnect between the way the media covers it and the actual community. But like, I'm wondering, what do you see now? Are there communities that you've missed the boat on covering?

Or things that you want to cover at this moment that just aren't getting the attention that you think they should?

ALEX WINTER
I honestly just follow the things that interest me the most. Look, I don't see myself as, you know, in brackets, a chronicler of anything – I have a more modest view of myself. So I really just respond to the things that I find interesting, on two tracks, one being things that I'm personally being impacted by.

So I'm not really like an outsider viewing, like, what will I cover next or what topics should I address, but what's really impacting me personally, I was hugely invested in Napster. I mean, I was going into my office on weekends and powering every single computer up all weekend onto Napster for the better part of a year. I mean, Fanning laughed at me when I met him, but -

CINDY COHN  
Luckily, the statute of limitations may have run on that, that's good.

ALEX WINTER
Yeah, exactly. 

JASON KELLEY  
Yeah, I'm sure you're not alone.

ALEX WINTER
Yeah, but I mean as I told Don Ienner when I did the movie I was like I was like dude I'd already bought all this music like nine times over on vinyl, on cassette, on CD. I think I even had elcasets at one point. So the record industry still owes me money as far as I’m concerned.

CINDY COHN
I agree.

ALEX WINTER
But no, it was really a personal investment. Even, you know, my interest in the blockchain and Bitcoin, which I have mixed feelings about, I really tried to cover that almost more from a political angle. I was interested, same with Deep Web in a way, in how the sort of counter-narratives were building online, and how people were trying to create systems and spaces online once online became corporatized, which it really did as soon as the web appeared – what did people do in response to the corporatization of these spaces?

And that's why I was covering Lauri Love's case in England, and eventually Barrett Brown's case, and then the Silk Road, which I was mostly interested in for the same reason as Napster, which was, who were these people, what were they talking about, what drew them to this space? Because it was a very clunky, clumsy way to buy drugs, if that was really what you wanted to do, and Bitcoin is a terrible tool for crime, as everyone now, I think, knows, but didn't so well back then.

So what was really compelling people – and a lot of that was, again, Silk Road was very much like the sort of alt and rec world of the early Usenet days. A lot of divergent voices and politics, and things like that.

So YouTube is different, because Gale Anne Hurd, the producer, had approached me and asked me if I wanted to tackle this with her. And I'd been looking at Google, largely. And that was why I had a personal interest. And I've got three boys, all of whom came up in the YouTube generations. They all moved off of regular TV and onto their laptops at a certain point in their childhood, and just were on YouTube for everything.

So I wanted to look at the corporatization of the internet, at the societal impact of the fact that our largest online community, which is YouTube, is owned by arguably the largest corporation on the planet, which is also a monopoly, which is also a black box.

And what does that mean? What are the societal implications of that? So that was the kind of motive there, but it still was looking at it as a community largely.

CINDY COHN
So the conceit of the show is that we're trying to fix the internet, and I want to know – you've done a lot to shine these stories in different directions, but what does it look like if we get it right? What are the things that we will see if we build the kind of online communities that are better than, I think, the ones that are getting the most attention now?

ALEX WINTER
I think that, you know, I've spent the last two years since I made the film and up until very recently on the road, trying to answer that question for myself, really, because I don't believe I have the answer that I need to bestow upon the world. I have a lot of questions, yeah. I do have an opinion. 

But right now, I mean, I generally feel like many people do that we slept – I mean, you all didn't, but many people slept on the last 20 years, right? And so there's a kind of reckoning now because we let these corporations get away with murder, literally and figuratively. And I think that we're in a phase of debunking various myths, and I think that's going to take some time before we can actually even do the work to make the internet better. 

But I think, you know, a large thesis that I had in making The YouTube Effect was to kind of debunk the theory of the rabbit hole and the algorithm as being some kind of all-encompassing evil. Because I think, sort of like we're seeing in AI now with this rhetoric that AI is going to kill everybody, those are very agenda-based narratives. They convince the public that this is all beyond them, and they should just go back to their homes, and keep buying things and eating food, and ignore these thorny areas of which they have no expertise, and leave it to the experts.

And of course, that means the status quo is upheld. The corporations keep doing whatever they want and they have no oversight, which is what they want. Every time Sam Altman says AI is going to kill the world, he's just saying, OpenAI is a black box, please leave us alone and let us make lots of money, and go away. And that's all that means. So I think that we have to start looking at the internet and technology as being run by people. There aren't even that many people running it – there's only a handful of people running the whole damn thing, for the most part. They have agendas, they have motives, they have political affiliations, they have capitalist orientations.

So I think really being able to start looking at the internet in a much more specific way, I know that you all have been doing this for a long time, most people do not. So I think more of that, more calling people on the carpet, more specificity. 

The other thing that we're seeing, and again, I'm preaching to the choir here with EFF, but like any time the public or the government or the media wakes up to something that they're behind, their inclination of how to fix it is way wrong, right?

And so that's the other place that we're at right now, like with KOSA and the DSA and the Section 230 reform discussions, and they're bananas. And you feel like you're screaming into a chasm, right? Because if you say these things, people treat you like you're some kind of lunatic. Like, what do you mean you don't want to turn off Section 230? That would solve everything! I'm like, it wouldn't, it would just break the internet! So I feel a little, you know, like a Cassandra – you do feel like you're yowling into a void.

And so I do think that it's going to take a minute to fix the internet. But I think we'll get there. I think the new generations are smarter, the stakes are higher for them. You know, kids in school… Well, I don't think the internet or social media is necessarily bad for kids, like, full stop. There's a lot of propaganda there, but I think that, you know, they don't want harms. They want a safer environment for themselves. They don't want to stop using these platforms. They just want them to work better.

But what's happened in the last couple of years, which I think is a good thing, is that people are breaking off and forming their own communities again. Even kids – even my teenagers started doing it during COVID. Even on Discord, they would create their own servers that no one could get on but them. There was no danger of, like, being infiltrated by crazy people. All their friends were there. They could bring other friends in, they could talk about whatever issues they wanted to talk about. So there's a kind of return to a fractured or fragmented or smaller set of communities.

And I think if the internet continues to go that way, that's a good thing, right? That you don't have to be on TikTok or YouTube or whatever to find your people. And I think for grownups, the silver lining of what happened with Twitter – with, you know, Elon Musk buying it and immediately turning it into a Nazi crash pad – is that the average adult realized they didn't have to be there either, right? That they don't have to just use one place, that the internet is filled with little communities that they could go to to talk to their friends.

So I think we're back in this kind of Wild West like we almost were pre-web and at the beginning of the web and I think that's good.  But I do think there's an enormous amount of misinformation and some very bad policy all over the world that is going to cause a lot of harm.

CINDY COHN
I mean, that's kind of my challenge to you is once we've realized that things are broken, how do we evaluate all the people who are coming in and claiming that they have the fix? And you know, in The YouTube effect, you talked to Carrie Goldberg. She has a lot of passion.

I think she's wrong about the answer. She's, I think, done a very good job illuminating some of the problems, especially for specific communities, people facing domestic violence and doxing and things like that. But she's rushed to a really dangerous answer for the internet overall. 

So I guess my challenge is, how do we help people think critically about not just the problems, but the potential issues with solutions? You know, the TikTok bans are something that's going on across the country now, and it feels like the Napster days, right?

ALEX WINTER
Yeah, totally.

CINDY COHN
People have focused on a particular issue and used it to try to say, Oh, we're just going to ban this. And all the people who use this technology for all the things that are not even remotely related to the problem are going to be impacted by this “ban-first” strategy.

ALEX WINTER
Yeah. I mean, it's media literacy. It's digital literacy. One of the most despairing things for me making docs in this space is how much prejudice there is to making docs in this space. You know, a huge swath of people have preconceptions about the internet. Obviously the far right has their agenda, which is just to silence everybody they don't agree with, right? I mean, the left can do the same thing, but the right is very good at it.

The left – or, you know, center to left – where they make mistakes is that they're ignorant about how these technologies work, and so their solutions are wrong. We see that over and over. They have really good intentions, but the solutions are wrong, and they don't actually make sense for how these technologies work. We're seeing that in AI. That was an area that I was trying to do as much work as I could in during the Hollywood strike, to educate people about AI, because they were so completely misinformed and their fixes were not fixes. They were not effective and they would not be legally binding. And it was despairing only because it's kind of frowned upon to say anything about technology other than don't use it.

CINDY COHN
Yeah.

ALEX WINTER
Right? Like, even other documentaries – the thesis is like, well, just, you know, tell your kids they can't be online, tell them to read more literature.

Right? And it just drives me crazy because I'm like, I'm a progressive lefty and my kids are all online and guess what? They still read books and like, play music and go outside. So it's this kind of very binary black or white attitude towards technology that like, ‘Oh, it's just bad. Why can't we go back to the days?’

CINDY COHN
And I think there's a false sense that if we just could turn back the clock pre internet, everything was perfect. Right? My friend Cory Doctorow talks about this, like how we need to build the great new world, not the good old world. And I think that's true even for, you know, Internet oldies like you and me who are thinking about maybe the 80s and 90s.

Like, I think we need to embrace where we are now and then build the better world forward. Now, I agree with you strongly about decentralization in smaller communities. As somebody who cares about free speech and privacy, I don't see a way to solve the free speech and privacy problems of the giant platforms.

We're not going to get better dictators. We need to get rid of the dictators and make a lot more smaller, not necessarily smaller, but different spaces, differently governed spaces. But I agree with you that there is this rush to kind of turn back the clock and I think we should try to turn it forward. And again, I kind of want to push you a little bit. What does the turning it forward world look like?

ALEX WINTER
I mean, I have really strong opinions about that. I mean, thankfully, my kids are very tech savvy, like any kid. And I pay attention to what they're doing, and I find it fascinating. And the thing about thinking backwards is that it's a losing proposition. Because the world will leave you behind.

Because the world's not going to go backwards. And the world is only going to go forward. And so you either have a say in what that looks like, or you don't. 

I think two things have to happen. One is media literacy and a sort of weakening of this narrative that it's all bad, so that more people, intelligent people, are getting involved in the future. I think that will help adults get immersed into new technologies and new communities and what's going on. I think at the same time that we have to be working harder to attack the tech monopolies. 

I think being involved, as opposed to being abstinent, is really, really important. And I think more of that will happen with new generations, because then your eyes and your ears are open, and you'll find new communities and the like. But at the same time, we have to work much harder – this idea that we're allowing big tech to police themselves is just ludicrous, and that's still the world that we're in, and it just drives me crazy. You know, they have one agenda, which is profit – and power – and they don't care about anything else.

And I think that's the danger of AI. I mean, it's not that we're all gonna die by robots. It's just that this sort of capitalist machine is gonna roll along unchecked. That's the problem, and it will eat labor, and it will eat other companies, and that's the problem.

CINDY COHN  
I mean, I think that's one of the tricky parts about, you know, the Sam Altman shift, right, from don't regulate us to please regulate us. Behind that please regulate us is, you know, and we'll tell you what the regulations look like, because we're the only ones, these giant gurus who can understand enough about it to figure out how to regulate us.

And I just think it's important to recognize that it's a pivot, but I think you could get tricked into thinking that's actually better. And I don't actually think it is.

ALEX WINTER
It’s a 100 percent agenda based. I mean, it's not only not better, it's completely self serving. And I think that as long as we are following these people as opposed to leading them, we're going to have a problem.

CINDY COHN:
Absolutely.

JASON KELLEY
Let’s pause for just a moment to say thank you to our sponsor. “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

And now back to our conversation with Alex Winter about YouTube.

ALEX WINTER
There's a lot of information there that's of extreme value – medical, artistic, historical, political. In the film, we go to great lengths to show that Caleb Cain, who got kind of pulled into and radicalized by the proliferation of far-right, even neo-Nazi and nationalist, white supremacist content – which still proliferates on YouTube, because it really is not algorithm oriented, it's business and incentive based – how he himself was unindoctrinated by ContraPoints, by Natalie Wynn's channel.

And you have to understand that, you know, more teenagers watch YouTube than Netflix. Like, it is everything. It is, by an order of magnitude, so much more of how they spend their time consuming media than anything else. And they're watching their friends talk, they're watching political speakers talk, they're watching – you know, my son, who's young, his various interests from photography to weightlifting to whatever, all of that's coming from YouTube. All of it.

And they're pretty good at discerning the crap, although now a lot of the studies show you have to be generally predisposed to this kind of content to really go down into the sort of darker areas, and those younger people can be.

You know, I often say that the greatest solution for people who end up getting radicalized on YouTube is more YouTube. Right? It's to find the people on YouTube who are doing good. And I think that's one of the big misunderstandings about disinfo – you can consume good sources. You just have to find them. And people are actually better at discerning truth from lies if that's really what they want to do, as opposed to, like, I just want to get awash in QAnon or whatever.

I think YouTube started not necessarily with pure intentions, but I think that they did start with some good intentions in terms of intentionally democratizing the landscape and voices, and allowing in people from marginalized groups and under autocratic governments. They allowed and promoted that content, and they created the age of the democratized influencer.

That was intentional. And I would argue that they did a better job of that than my industry did. And I think my industry followed their lead. I think the diversity initiatives in Hollywood came after, because Hollywood, like everyone else, is driven by money only, and they were like, oh my God, there are these giant trans and African and Chinese influencers that have huge audiences, we should start allowing more people to have a voice in our business too, because we'll make money off of them. But I think that now YouTube has grown so big and so far beyond them, and it's making them so much money, and they're so incentivized to promote disinformation, propaganda, sort of violent content, because it just makes so much money for them on the ad side, that it's sort of a runaway train at this point.

CINDY COHN
One of the things that EFF has taken a stand on is banning behavioral advertising. And I think one of the things you did in The YouTube Effect is kind of take a hard look at how big a role the algorithm is actually playing. And I think the movie kind of points out that it's not as big a role as people who want an easy answer to the problem are saying.

We've been thinking about this from the privacy perspective, and we decided that behavioral advertising was behind so many of the problems we had, and I wondered how you think about that, because that is the kind of tracking and targeting that feeds some of those algorithms, but it does a lot more.

ALEX WINTER
Yeah, I think that, for all the hue and cry that they can't moderate their content, there's absolutely no doubt that they can. And I think that we're beginning – again, this is an area that EFF specifically specializes in. But I think in terms of the area of free speech, what constitutes free speech, as opposed to what they could actually be doing to mitigate harms, is very nuanced.

And it serves them to say that it is not. That it's not nuanced and it's either, either they're going to be shackling free speech or they should be left alone to do whatever they want, which is make money off of advertising, a lot of which is harmful. So I think getting into the weeds on that is extremely important.

You know, a recent example was just how they stopped deplatforming all the Stop the Steal content, which they were doing very successfully. Just flat-out, you know, 2020 election propaganda – and that gets people hurt. I mean, it can get people killed, and it's really not hard to do. But they make more money if they allow this kind of rampant, aggressive, propagandized advertising, as well as content, on their platform.

I just think that we have to be looking at advertising and how it functions in a very granular way, because the whole thesis of The YouTube Effect, such as we had one, is that this is not about an algorithm, it's about a business model.

These are business incentives. It's no different – I've been saying this everywhere – it's exactly the same as the Hearst and Pulitzer wars of the late 1800s. It's the same. It's just, we want to make money. We know what attracts eyeballs. We want to advertise and make money from ad revenue from pumping out this garbage, because people eat it up. It's really similar to that. That doesn't require an algorithm.

CINDY COHN
My dream is Alex Winter makes a movie that helps us evaluate all the things that people who are worried about the internet are jumping in to say that we ought to do, and helps give people that kind of evaluative power. Because we do see, over and over again, this rush to go to censorship, which, you know, is problematic for free expression, but also just won't work – this kind of gliding over the idea that privacy has anything to do with online harms and that standing up for privacy will do anything.

I just feel like sometimes, this literacy place needs to be both about the problems and about critically thinking about the things that are being put forward as solutions.

ALEX WINTER
Yeah, I mean, I've been writing a lot about that for the last two years. I've written, I think, I don't know, countless op-eds. And there are way smarter people than me, like you all and Cory Doctorow, writing about this like crazy. And I think all of that is having an impact. I think the building blocks of proper internet literacy are being set.

CINDY COHN
Well I appreciate that you've got three kids who are, you know, healthy and happy using the internet because I think those stories get overlooked as well. Not that there aren't real harms. It's just that there's this baby with the bathwater kind of approach that we find in policymaking.

ALEX WINTER
Yeah, completely. So I think that people feel like their arms are being twisted. That they have to say these hyper-negative things, or fall in line with these narratives. You know, a movie requires characters, right? And I would need a court case or something to follow to find the way in, and I've always got my eyes on that. But I do think we're at a kind of critical point.

It's really funny. When I made this film, I'm friends with a lot of different film critics. I've just been around a long time and I like, you know, reading good film criticism. And one of them, who I respect greatly, was like, I don't want to review your movie, because I really didn't like it and I don't want to give you a really bad review.

And I said, well, why didn't you like it? He said, because I just didn't like your perspective. And I was like, well, what didn't you like about my perspective? He said, well, you just weren't hard enough on YouTube. You didn't just come right out and say they're terrible and no one should be using it.

And I was like, you're the problem. And there's so much of that. I feel like there's a bias that is going to take time to overcome, no matter what anyone says or whatever film anyone makes. We just have to keep chipping away at it.

JASON KELLEY
Well, it's a shame we didn't get a chance to talk to him about Frank Zappa. But what we did talk to him about was probably more interesting to our audience. The thing that stood out to me was the way he sees these technologies and sort of focuses his documentaries on the communities that they facilitate.

And that was, I think, a useful way to think about, you know, everything from the deep web to blockchain to YouTube to Napster. He sees these as building communities, and those communities are not necessarily good or bad, but they have some really positive elements. That led him to this really interesting idea of a future of smaller communities, which I think we all agree with.

Does that sound sort of like what you pulled away from the conversation, Cindy?

CINDY COHN
I think that's right. And I also think he was really smart at noticing the difference between what it was like to be inside some of those communities and how they got portrayed in broader society. When corporate interests, who were the copyright interests, saw what was happening on Napster, they very quickly put together a narrative that everybody was pirates. That was very different from how it felt to be inside that community and have access to all of that information. And that disconnect is what happens when the people who control our broader societal conversation are often corporate interests with their own commercial interests at heart.

And what it's like to be inside the communities is what connected the Silk Road story with the Napster story. In some ways YouTube is interesting because it's actually gigantic; it's not a little corner of the internet. And yet I think he's trying to lift up both the issues we see in YouTube that are problematic and all the other things inside YouTube that are not, which, as he pointed out in the story about Caleb Cain, can even be part of the solution to pulling people out of the harms.

So I really appreciate this focus. I think it really hearkens back to, you know, one of the coolest things about the internet when it first came along was this idea that we could build communities free of distance and outside of the corporate spaces.

JASON KELLEY
Yeah. And the point you're making about his recognition of who gets to decide what's to blame, I think, leads us right to the conversation around YouTube: it's easy to blame the algorithm when what's actually driving a lot of the problems we see with the site are corporate interests and engagement with the kind of content that gets people riled up and also makes a lot of money.

And I just love that he's able to parse out these nuances in a way that surprisingly few people do across media and journalism, and certainly, unfortunately, in government.

CINDY COHN
Yeah, and I think that, you know, it's fun to have a conversation with somebody who gets the problems at this level. He name-checked issues that EFF has been working on for a long time, whether that's KOSA or Section 230 or algorithmic issues, and how wrongheaded many of the proposed solutions are.

I appreciate that it drives him crazy, the way it drives me crazy, that once you've articulated the harms, people seem to rush toward solutions, or at least are pushed toward solutions, that don't get us out of this corporate control but in some ways put us deeper into it.

And he's already seeing that in the AI push for regulation. I think he's exactly right about that. I don't know if I convinced him to make his next movie about all of these solutions and how to evaluate them. I'll have to keep trying. That may not be where he gets his inspiration.

JASON KELLEY
We'll see. I mean, if nothing else, EFF is in many of the documentaries he has made, and my guess is that we will continue to be a voice of reason in the ones he makes in the future.

CINDY COHN
I really appreciate that Alex has taken his skills and talents and platforms to really lift up the kind of ordinary people who are finding community online and help us find ways to keep that part, and even lift it up as we move into the future.

JASON KELLEY
Thanks for joining us for this episode of How to Fix the Internet.

If you have feedback or suggestions, we'd love to hear from you. Visit eff.org/podcast and click on listener feedback. While you're there, you can become a member, donate, maybe pick up some merch, and just see what's happening in digital rights this week and every week.

We’ve got a newsletter, EFFector, as well as social media accounts on many, many, many platforms you can follow.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators. 

In this episode you heard Perspectives by J.Lang featuring Sackjo22 and Admiral Bob 

You can find their names and links to their music in our episode notes, or on our website at eff.org/podcast.

Our theme music is by Nat Keefe of BeatMower with Reed Mathis

How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.

I hope you’ll join us again soon. I’m Jason Kelley.

CINDY COHN
And I’m Cindy Cohn.

Josh Richman

[B] I'd Like to See Ohtani in Center, Too: An ESPN Reporter Points to the Dodgers' Defensive Weak Spot

2 months 1 week ago
In a web article dated May 21, ESPN reporter Bradford Doolittle invited readers to "imagine Ohtani as a center fielder," discussing the possibility of Shohei Ohtani playing center field to cover the Dodgers' biggest weakness, the center field position. Ohtani cannot play the outfield until his right elbow injury heals, but when he joined the Dodgers, manager Dave Roberts suggested he "might be able to play the outfield in the postseason," hinting at outfield appearances in the World Series and beyond. (市橋嗣郎)
日刊ベリタ


[B] Kenji Nozoe's "The Forced Relocation of Koreans in Akita Prefecture, Part 8": The Hanawa Mine, Straddling the Iwate Border (Hanawa, Kazuno City)

2 months 1 week ago
The Hanawa Mine, whose claims straddle Akita and Iwate prefectures, is an old mine dating back to the feudal era. It was temporarily idled, but after its management passed to Nippon Mining in the Showa era, it prospered as an iron sulfide mine. From 1940 (Showa 15) onward, under the wartime regime, many Korean forced laborers were sent there through "government mediation and conscription." It is clear that this was forced relocation carried out by the state. (大野和興)
日刊ベリタ

Shots Fired: Congressional Letter Questions DHS Funding of ShotSpotter

2 months 1 week ago

There is a growing pile of evidence that cities should drop ShotSpotter, the notorious surveillance system that purportedly uses acoustic sensors to detect gunshots, due to its inaccuracies and the danger it creates in communities where it's installed. In yet another blow to the product and the surveillance company behind it—SoundThinking—Congress members have sent a letter calling on the Department of Homeland Security to investigate how it provides funding to local police to deploy the product.

The seven-page letter, from Senators Ed Markey, Ron Wyden, and Elizabeth Warren and Representative Ayanna Pressley, begins by questioning the "accuracy and effectiveness" of ShotSpotter, and then outlines some of the latest evidence of its abysmal performance, including multiple studies showing false positive rates—i.e., incorrectly classifying non-gunshot sounds as gunshots—at 70% or higher. In addition to its ineffectiveness, the Congress members voiced serious concerns about ShotSpotter's contribution to discrimination, civil rights violations, and poor policing practices, due to the installation of most ShotSpotter sensors in overwhelmingly "Black, Brown and Latin[e] communities" at the request of local law enforcement. Together, the inefficacy of the technology and these placements can result in the deployment of police, with guns drawn, to what they expect to be a dangerous situation, increasing the chances of all-too-common police violence against civilians in the area.

In light of the grave concerns raised by the use of ShotSpotter, the lawmakers are demanding that DHS investigate its funding, and whether it’s an appropriate use of taxpayer dollars. We agree: DHS should investigate, and should end its program of offering grants to local law enforcement agencies to contract with SoundThinking. 

The letter can be read in its entirety here.

Hannah Zhao