Speaking Freely: Prasanth Sugathan
Interviewer: David Greene
This interview has been edited for length and clarity.
Prasanth Sugathan is Legal Director at the Software Freedom Law Center, India (SFLC.in). Prasanth is a lawyer with years of practice in the fields of technology law, intellectual property law, administrative law, and constitutional law. An engineer turned lawyer, he has worked closely with the Free Software community in India. He has appeared in many landmark cases before various Tribunals, High Courts, and the Supreme Court of India. He has also deposed before Parliamentary Committees on issues related to the Information Technology Act and Net Neutrality.
David Greene: Why don’t you go ahead and introduce yourself.
Sugathan: I am Prasanth Sugathan, I am the Legal Director at the Software Freedom Law Center, India. We are a nonprofit organization based out of New Delhi, started in the year 2010. So we’ve been working at this for 14 years now, working mostly in the area of protecting rights of citizens in the digital space in India. We do strategic litigation, policy work, trainings, and capacity building. Those are the areas that we work in.
Greene: What was your career path? How did you end up at SFLC?
That’s an interesting story. I am an engineer by training. Then I was interested in free software. I had a startup at one point and I did a law degree along with it. I got interested in free software and got into it full time. Because of this involvement with the free software community, the first time I think I got involved in something related to policy was when there was discussion around software patents, when the patent office came out with a patent manual and there was this discussion about how it could affect the free software community and startups. So that was one discussion I followed, I wrote about it, and one thing led to another and I was called to speak at a seminar in New Delhi. That’s where I met Eben and Mishi from the Software Freedom Law Center. That was before SFLC India was started, but then once Mishi started the organization I joined as a Counsel. It’s been a long relationship.
Greene: Just in a personal sense, what does freedom of expression mean to you?
Apart from being a fundamental right, as evident in all the human rights agreements we have, and in the Indian Constitution, freedom of expression is the most basic requirement for a democratic nation. I mean without free speech you cannot have a proper exchange of ideas, which is most important for a democracy. For any citizen to speak what they feel, to communicate their ideas, I think that is most important. As of now the internet is a medium which allows you to do that. So there definitely should be minimum restrictions from the government and other agencies in relation to the free exchange of ideas on this medium.
Greene: Have you had any personal experiences with censorship that have sort of informed or influenced how you feel about free expression?
When SFLC.IN was started in 2010 our major idea was to support the free software community. But how we got involved in the debates on free speech and privacy on the internet was when, in 2011, the IT Rules were introduced by the government as a draft for discussion and finally notified. This was on regulation of intermediaries, these online platforms. This was secondary legislation based on the Information Technology Act (IT Act) in India, which is the parent law. So when these discussions happened we got involved in it and then one thing led to another. For example, there was a provision in the IT Act called Section 66-A which criminalized the sending of offensive messages through a computer or other communication devices. It was, ostensibly, introduced to protect women. And the irony was that two women were arrested under this law. That was the first arrest that happened, and it was a case of two women being arrested for the comments that they made about a leader who had died.
This got us working on trying to talk to parliamentarians, trying to talk to other people about how we could maybe change this law. So there were various instances of content being taken down and people being arrested, and it was always done under Section 66-A of the IT Act. We challenged the IT Rules before the Supreme Court. In a judgment in a 2015 case called Shreya Singhal v. Union of India the Supreme Court read down the rules relating to intermediary liability. Under the rules, the platforms could be asked to take down the content. They didn’t have much of an option; if they didn’t do that, they would lose their safe harbour protection. The Court said it can only be on actual knowledge, and what actual knowledge means is someone getting a court order asking them to take down the content, or a direction from the government. These are the only two cases when content could be taken down.
Greene: You’ve lived in India your whole life. Has there ever been a point in your life when you felt your freedom of expression was restricted?
Currently we are going through such a phase, where you’re careful about what you’re speaking about. There is a lot of concern about what is happening in India currently. This is something we can see mostly impacting people who are associated with civil society. When they are voicing their opinions there is now a kind of fear about how the government sees it, whether they will take any action against you for what you say, and how this could affect your organization. Because when you’re affiliated with an organization it’s not just about yourself. You also need to be careful about how anything that you say could affect the organization and your colleagues. We’ve had many instances of nonprofit organizations and journalists being targeted. So there is a kind of chilling effect when you really don’t want to say something you would otherwise say strongly. There is always a toning down of what you want to say.
Greene: Are there any situations where you think it’s appropriate for governments to regulate online speech?
You don’t have an absolute right to free speech under India’s Constitution. There can be restrictions as stated under Article 19(2) of the Constitution. There can be reasonable restrictions by the government, for instance, for something that could lead to violence or something which could lead to a riot between communities. So mostly if you look at hate speech on the net which could lead to a violent situation or riots between communities, that could be a case where maybe the government could intervene. And I would even say those are cases where platforms should intervene. We have seen a lot of hate speech on the net during India’s current elections as there have been different phases of elections going on for close to two months. We have seen that happening with not just political leaders but with many supporters of political parties publishing content on various platforms that isn’t really in the nature of hate speech but which could potentially create situations where you have at least two communities fighting each other. It’s definitely not a desirable situation. Those are the cases where maybe platforms themselves could regulate or maybe the government needs to regulate. In this case, for example, when it is related to elections, the Election Commission also has its role, but in many cases we don’t see that happening.
Greene: Okay, let’s go back to hate speech for a minute because that’s always been a very difficult problem. Is that a difficult problem in India? Is hate speech well-defined? Do you think the current rules serve society well or are there problems with it?
I wouldn’t say it’s well-defined, but even in the current law there are provisions that address it. So anything which could lead to violence or which could lead to animosity between two communities will fall in the realm of hate speech. It’s not defined as such, but then that is where your free speech rights could be restricted. That definitely could fall under the definition of hate speech.
Greene: And do you think that definition works well?
I mean the definition is not the problem. It’s essentially a question of how it is implemented. It’s a question of how the government or its agency implements it. It’s a question of how platforms are taking care of it. These are two issues where there’s more that needs to be done.
Greene: You also talked about misinformation in terms of elections. How do we reconcile freedom of expression concerns with concerns for preventing misinformation?
I would definitely say it’s a gray area. I mean how do you really balance this? But I don’t think it’s a problem which cannot be addressed. Definitely there’s a lot for civil society to do, a lot for the private sector to do. Especially, for example, when hate speech is reported to the platforms. It should be dealt with quickly, but that is where we’re seeing the starkest difference in how platforms act on such reporting in the Global North versus what happens in the Global South. Platforms need to up their act when it comes to handling such situations and handling such content.
Greene: Okay, let’s talk about the platforms then. How do you feel about censorship or restrictions on freedom of expression by the platforms?
Things have changed a lot as to how these platforms work. Now the platforms decide what kind of content gets to your feed and how the algorithms work to promote content which is more viral. In many cases we have seen how misinformation and hate speech goes viral. And content that is debunking the misinformation which is kind of providing the real facts, that doesn’t go as far. The content that debunks misinformation doesn’t go viral or come up in your feed that fast. So that definitely is a problem, the way platforms are dealing with it. In many cases it might be economically beneficial for them to make sure that content which is viral and which puts forth misinformation reaches more eyes.
Greene: Do you think that the platforms that are most commonly used in India—and I know there’s no TikTok in India—serve free speech interests or not?
When the Information Technology Rules were introduced and when the discussions happened, I would say civil society supported the platforms, essentially saying these platforms ensured we can enjoy our free speech rights, people can enjoy their free speech rights and express themselves freely. How the situation changed over a period of time is interesting. Definitely these platforms are still important for us to express these rights. But when it comes to, let’s say, content being regulated, some platforms do push back when the government asks them to take down the content, but we have not seen that much. So whether they’re really the messiahs for free speech, I doubt. Over the years, we have seen that it is most often the case that when the government tells them to do something, it is in their interest to do what the government says. There has not been much pushback except for maybe Twitter challenging it in the court. There have not been many instances where these platforms supported users.
Greene: So we’ve talked about hate speech and misinformation, are there other types of content or categories of online speech that are either problematic in India now or at least that regulators are looking at that you think the government might try to do something with?
One major concern which the government is trying to regulate is deepfakes, with even the Prime Minister speaking about it. So suddenly that is something of a priority for the government to regulate. So that’s definitely a problem, especially when it comes to public figures and particularly women who are in politics who often have their images manipulated. In India we see that at election time. Even politicians who have been in the field for a long time, their images have been misused and morphed images have been circulated. So that’s definitely something that the platforms need to act on. For example, you cannot have the luxury of, let’s say, taking 48 hours to decide what to do when something like that is posted. This is something which platforms have to deal with as early as possible. We do understand there’s a lot of content and a lot of reporting happening, but in some cases, at least, there should be some prioritization of reports related to non-consensual sexual imagery, so that the priority goes up.
Greene: As an engineer, how do you feel about deepfake tech? Should the regulatory concerns be qualitatively different than for other kinds of false information?
When it comes to deepfakes, I would say the problem is that it has become more mainstream. It has become very easy for a person to use these tools that have become more accessible. Earlier you needed to have specialized knowledge, especially when it came to something like editing videos. Now it’s become much easier. These tools are made easily available. The major difference now is how easy it is to access these applications. There cannot be a case of fully regulating or fully controlling a technology. It’s not essentially a problem with the technology, because there would be a lot of ethical use cases. Just because something is used for a harmful purpose doesn’t mean that you completely block the technology. There is definitely a case for regulating AI and regulating deepfakes, but that doesn’t mean you put a complete stop to it.
Greene: How do you feel about TikTok being banned in India?
I think that’s less a question of technology or regulation and more of a geopolitical issue. I don’t think it has anything to do with the technology or even the transfer of data for that matter. I think it was just a geopolitical issue related to India–China relations. The relations have kind of soured with the border disputes and other things; I think that was the trigger for the TikTok ban.
Greene: What is your most significant legal victory from a human rights perspective and why?
The victory that we had in the fight against the 2011 Rules, where the portions related to intermediary liability were read down by the Supreme Court. That was important because when it came to platforms and when it came to people expressing their critical views online, all of this could have been taken down very easily. So that was definitely a case of free speech rights being affected without much recourse. So that was a major victory.
Greene: Okay, now we ask everyone this question. Who is your free speech hero and why?
I can’t think of one person, but I think of, for example, when the country went through a bleak period in the 1970s and the government declared a national state of emergency. During that time we had journalists and politicians who fought for free speech rights with respect to the news media. At that time even writing something in the publications was difficult. We had many cases of journalists who were fighting this, people who had gone to jail for writing something, who had gone to jail for opposing the government or publicly criticizing the government. So I don’t think of just one person, but we have seen journalists and political leaders fighting back during that state of emergency. I would say those are the heroes who could fight the government, who could fight law enforcement. Then there was the case of Justice H.R. Khanna, a judge who stood up for citizens’ rights and gave his dissenting opinion against the majority view, which cost him the position of Chief Justice. Maybe I would say he’s a hero, a person who was clear about constitutional values and principles.