EngageMedia is publishing transcripts of the keynote panels from the Asia-Pacific Digital Rights Forum held on January 12 and 13, 2023.
Read the transcript for the Day 1 keynote panel below. The transcript has been edited for clarity.
Chat Garcia Ramilo
Chat Garcia Ramilo is the current executive director of the Association for Progressive Communications, an international network of civil society organisations dedicated to empowering and supporting people working for peace, human rights, development, and protection of the environment through the strategic use of information and communications technologies.
[There are] three things I wanted to focus on in the first 10 minutes. One is the impact of digital authoritarianism on our exercise of rights. The second is on access, use of the internet, and also [the] consumption and production of digital technologies. The third point is on governance. I think it’s very important as we look at digital rights in the region and definitely globally – looking at how the internet and digital technologies are governed. It’s very relevant at this point.
So I think the one thing that you’ve mentioned, and I totally agree, and this is also what we’ve learned with our members, is the increased control and restriction of expression, and that has impacted our rights. Expression – whether it’s freedom of expression, opinion, or religion, et cetera – really impacts our freedoms and our exercise of our rights. And this has been done in different ways.
One definitely is through technology. Surveillance is a big issue in the region; the deployment of surveillance technology has been happening in many countries, [including] Thailand. The second is through policies and laws that have really enabled censorship in different ways, allowing for greater control of people’s data and privacy, as we’ve seen in tracking technologies, et cetera.
I think the other thing that has happened in the region is the impunity that has been allowed, whether for the tech sector or for non-state actors. And this has really resulted in many of the issues that we experience in all our countries – misinformation, disinformation, fomenting hate – that have resulted in real-life harm for many. And recently, as we’ve seen in the Philippines, and [as] we’ll probably see in many different countries, [it] is really impacting elections. As many have said, that has really eroded democracy in many of our countries.
I think the other thing is abuse and harm online, because of the impunity and the lack of safety nets that the states require from companies, especially, you know, the tech companies, platforms, et cetera.
The other thing I wanted to say here is that the impact, of course, is not the same. I think it’s most severe for those who already have been historically marginalised and discriminated against. These are communities of people – whether they’re women, gender-diverse people, religious minorities, ethnic minorities. We’ve seen that time and again, and I think there are many examples, and those of you who are in this panel session will know and would have experienced many of these incidents.
The last thing I want to mention here is that, as CSOs and as human rights defenders and digital activists who actually work in this field, [we have seen that] governments have also targeted organisations and human rights defenders who have been very active in defending digital rights. [This is done] through laws that restrict the activities of organisations, like in India, or through financial restrictions.
So more and more, there is a real need for collaboration and support. I think one of the things that we really need to see and talk about is how our own work, our own resources, and our own activists are being impacted, because at the end of the day, if we cannot do our work, then it has a big impact on how we can support the advocacy and the movement for rights in our countries. So that’s the first point.
The second point is in relation to access to the internet as well as consumption and production of digital technologies and infrastructure. I think from my perspective, it’s becoming more important for us to engage infrastructure much more directly because this really impacts the exercise of our rights. Because of the pandemic, there has been an expansion, [an] explosion, of digitalisation. It’s become a basic need for many. It’s impacted education, health, entitlements, the whole gamut of services that we as people need and are entitled to. And part of the problem here is that there’s still a big gap: there’s still a lack of access, and a lack of meaningful access. You may say that there’s already access, but when we’re talking about education, health, government services, et cetera, there’s really a need for more meaningful access and universal, affordable access.
I think one of the issues here is there’s been a lot of public-private partnerships, but part of the problem is that many of these [are] quite opaque. These public-private partnerships [are] not transparent. We need more transparency.
A lot of the infrastructure is really controlled by [the] private sector, which is needed. But at the same time, I think there is also the need for us to look at how there can be much more public access, discussion, or development of public ownership – to see the internet and digital technologies as part of the commons. At the moment, much of this is privatised, in the same way as many [other public services] are privatised. I think when we’re thinking of technologies and infrastructure, [the sector] is really not regulated in the public interest. There needs to be much more regulation in the public interest and much more protection of the commons and the public good.
Which brings me to the next point around governance. Governance here is important because one, as I said earlier, with the explosion of digital transformation, there’s also been a proliferation of processes and venues for the discussion of governance and understanding of governance. And sometimes, it’s not easy for organisations like us to participate or to make sense of all of these because of resources, because there are some processes that are not as accessible as others, et cetera. So there’s inaccessibility. There’s also, I think especially in local governance, the danger of governance and laws that really could impact rights negatively.
The third thing is that there’s also fragmentation of the spaces and fragmentation of governance of the internet where it’s sort of being seen as binaries, or one over the other. One thing to really look at here is the need, again, for coming back to [the] kinds of principles and values that we need to remember or to advocate for when it comes to governance. Secondly, how then are we able to participate and really provide input and advocate for policies? [For] laws that truly protect our rights as well as ensure that there is meaningful access.
If you’re looking at our digital rights in the region and to some extent globally, [these] are the three things, right? One is how it impacts our rights, looking at our rights. Number two is really the infrastructure, understanding the infrastructure, understanding digitalisation. And the third point is really governance and how we as advocates and activists help engage and really look at the kinds of public policies, values, and principles that will really promote public interest and public good.
Helani Galpaya is the Chief Executive Officer of LIRNEasia, a think tank that catalyses policy change through research to improve people’s lives in the Asia-Pacific by facilitating their use of hard and soft infrastructures through the use of knowledge, information and technology.
I’ll amplify some of the points Chat made because we’re in agreement and maybe add a few data points to some others.
The first challenge – it’s a minor challenge, but still a challenge – is what do digital rights mean? And I think we still have the issue of not having a universally accepted definition. I think that’s perfectly okay. I mean, [there are] minimal definitions – you know, people talk about the right to express yourself and access information. A more structured approach, which is certainly very helpful in our work, is to see digital rights as human rights applied to the online space.
What are those human rights? Again, taking a more structured approach, we go back to the two big covenants, the International Covenant on Civil and Political Rights (ICCPR) and the International Covenant on Economic, Social, and Cultural Rights (ICESCR). The first one gets a lot of discussion because privacy sits there.
I think the second one is as important, and this is why I like the points Chat was making. On access and use, whichever definition you use, we can I think agree that digital access is a pre-condition. Meaningful access that is affordable at a decent quality, with the right type of skill to use, is a pre-condition for exercising your rights in the digital space. Access itself can be a right, you know, like in Finland. But we do nationally representative surveys in multiple countries, and access is not to be taken for granted. When we looked at 23 different countries, we saw in South Asia – and I’ll talk primarily about South Asia – we saw that only 30% of the people above the age of 15 have ever used the internet, and that was four years ago.
And worse, even among that 30% of the population who are online, there’s a huge gap between men and women. Women in Bangladesh [are] 62% less likely than men to be online. Women in India, 57% less likely. So the use of the digital space that men and women have is just statistically different, and when women are online, it is qualitatively different when it comes to harassment, for example. There is the need to identify as a different gender in order to be safe.
If you look at survey after survey on harassment online, you’ll see a statistic that, oh, men generally report being harassed more than women. However, men are more frequently online. So when you control for the fact that they spend more time online, and you control for income and education and all the other factors between men and women, women are far more likely to be harassed in multitudes of ways, and then leave the digital space or use it in suboptimal ways.
So the exercise of their freedoms and rights is fundamentally different. One of the few advantages of the COVID pandemic is it really increased demand. So for example, in India, access and use went from 30% of the population to 50%. That’s wonderful, but still, that means in one of the largest countries in the world, 50% of the people above the age of 15 have never used the internet.
That matters because, as Chat was saying, it really impacts everything else you can do online and offline, like education. So again, going back to more formal definitions of rights – the ICESCR, the economic, social and cultural rights covenant – Article 13 talks about the right to education. It was tested during the two years of COVID.
In 2021, what we saw is really the worst kind of natural experiment of having internet access and not. For example, we can look at households with children of schooling age in India. 64% of these households were connected to the internet, 36% were not. In the households that were connected to the internet, 31% received some form of education. So not all, but still a high number. But in the households that were not connected to the internet, only 8% received the same education. [There are] similar numbers in other countries. So internet really mattered.
It’s not just about having access, but what people do when they gain access. And again, going to Articles 6, 7, and 8 of the ICESCR, which talk about the right to work under just and favourable conditions, with a right to form and join trade unions, for example. And we can look at whether these rights are experienced by the millions in South Asia who are engaged in digital gig work or platform-mediated work: both the physical kind, like taxi driving, but also pure digital work, where you do all your work online and sell graphic designs.
We’ve talked to gig workers in Myanmar, India, Sri Lanka, and we’ve talked to many who are earning and having a better life because of gig work. But we also talk to many who have not been paid by their clients or the buyer of their work for what they have done. And they don’t have any recourse to recover those lost wages. They certainly don’t have the incentive or the ability – if you’re talking about pure digital work – to form unions often, because workers are spread across and the platforms are designed in such a way – the whole economy works in such a way – that the work goes to the lowest hourly bidder, for example. So it pits one worker against another and sort of takes away the incentives for collective action, which is really problematic when it comes to labour rights and the exercise of those rights.
The other point I want to come to is sort of the whole issue of misinformation, disinformation, malinformation, hate speech, et cetera, you know, now broadly termed as the information disorder. This is a real challenge and you see a lot of the laws and attempts by states to try and curb this issue. Nation-states have a legitimate interest in reducing the damage to social cohesion, to democracy, to elections, as a result of the information disorder, particularly misinformation.
We’ve been looking at India, Sri Lanka, and Bangladesh – at how content is handled when it comes to the regulation of misinformation and of content on social [media] generally. So the big trend that we see, for example, in India and in many countries, [is that] there’s no ability to really regulate and ask the companies to take down [content], because the companies are in a completely different jurisdiction – there are not many employees you can put in jail, like in the old-fashioned telecom company ways. There are no sunk assets where they’ve made huge investments, like in the telecom space. So what you see is this blanket of cybersecurity, national security, and public order being used as the reason, and sometimes as the excuse, for taking content down.
That’s happening in India. In Sri Lanka, they’re not even doing that, because they can’t have any sway over the companies – the platforms – it seems. So what they do are three types of actions to curb the information on content platforms and to curb speech: they shut down the internet totally or block certain platforms and applications; they arrest people who have posted certain content; or they charge people who have posted certain content.
So this in itself is not problematic – if the content is illegal and has to be taken down, all of these are legitimate actions. But the problem is that in non-democratic and strongman states, the implementation – like the charging of people under various laws and the ordering of content takedowns – is arbitrary. Those who are critical of government, those from minority ethnicities, have their content taken down, while the same content posted by other people is never objected to. Similarly, when it comes to arrests or charging people, there’s highly selective implementation. One or two prominent arrests not only curb the right to speech of those arrested, but also send a strong signal to others. And so you actually don’t need to arrest everybody.
So what the governments would like to do is to have content taken down by the platforms. They try – platforms are often summoned to prime ministers’ offices – but most governments don’t have the jurisdiction. So the policy reaction is to bring in anti-fake news laws, so they have some control and can say: here’s a piece of legislation, I want that content taken down if this platform wants to operate in my country. The most copied example in the region I come from is possibly Singapore’s fake news law. European examples like Germany’s NetzDG are viewed favourably by some people as well. These laws impose responsibility on platforms to take down specified content within specified periods of time or face significant fines.
Again, the problem with these fake news laws is that they often don’t define what fake news is, don’t specify who can determine what fake news is, or get passed through non-democratic processes. I think in Malaysia, there was at one point a fake news law passed while Parliament was not even sitting.
Many countries in South Asia used COVID-19 as a way of getting fake news rules passed without due process or consultation. So, problematically, non-transparent and non-democratic governments will use these laws to curb speech they don’t like. My point here is that what might work in Singapore or Germany – what might work in a functioning democracy with checks and balances – will be used very differently, and the same law [will have] very different impacts in some of the Asian countries that my organisation works in. And the state is not always a benevolent actor. It’s not just the private companies that are at times problematic.
Privacy is my last point. Article 17 [of the ICCPR] is often talked about, but my point is not to discuss why privacy is one of the most valued concerns – the lack of privacy, algorithmic decision-making, increased datafication and digitalisation, all the data we put out there, the many violations of privacy. My point is to talk about the policy response to it. One of the most commonly-demanded policy solutions to the challenges of digital data privacy is the enactment of personal data protection laws. Many governments are coming up with them. Most of these are modelled on the GDPR, the General Data Protection Regulation. It’s amazing in what it aspires to do. And Sri Lanka, for example, has South Asia’s first personal data protection law, which passed last year and is going to be enforced later this year.
The implementation of a law that’s basically a cut-and-paste of a GDPR-type law requires data protection authorities that have sufficient skill – they need legal people and technical people to investigate violations. [It] needs sufficient resources to pay for investigations and decision-making, and autonomy to make independent decisions and issue rulings. All three of these conditions we cannot take for granted in resource-constrained and poorly-governed countries. The evidence is that even in the EU, data protection authorities are plagued by significant delays in decision-making, long drawn-out court proceedings with the companies they’re trying to regulate, and ineffective small penalties being issued on big tech firms.
So while we aspire for GDPR-like laws in South Asia, and while they’re a good idea in theory, in practice we need to see if the implementation will actually lead to digital rights, particularly on privacy, in a meaningful way.
I’ll end by saying that a lot of the rights frameworks we have are based on, let’s say, the government violating your rights – for example, the state’s ability to curtail your speech. But as Chat pointed out, there’s so much private sector provision that we really do need to rethink public services and how we regulate them. And this really is a multi-stakeholder problem. We can’t be talking just to governments. We need all three sectors – public, civil society, and private – at the table when we talk about these issues. And it’s an issue of global governance, not necessarily only at the nation-state level.
Vitit Muntarbhorn KBE is a Professor Emeritus at the Faculty of Law, Chulalongkorn University, Bangkok. He has helped the UN in various positions, including UN Special Rapporteur, Independent Expert and member of UN Commissions of Inquiry on human rights. He is currently UN Special Rapporteur on Cambodia, monitoring for the UN Human Rights Council, Geneva.
Thank you for the invitation and congratulations on the conference, which is very timely and very, very important. What I’d like to do in the next 10 minutes is to offer you five contexts and then 10 considerations to simplify the discussions.
First of all, I think the digital experience is one that everybody is going through particularly as highlighted by COVID, and I think we can all say that we’ve experienced plus and minus, and that is a situation of ambivalence in regard to digital rights, which are the issues of today.
I experienced the plus by learning to teach online, but I also experienced the minus of having to deal with online scams all the time.
Secondly, let’s not forget that when we talk about digital, we’re also talking about power relations, particularly data power. We’re trying to build a community that acts as a check and balance against abuse of power. And the power relations are also in an ambivalent situation. Why? Because half of Asia is not democratic – certainly not around here – or at least [only] pretending to be democratic; they are very often semi- and demi-democracies, and even democratic governments at times use undemocratic means. But we are dealing with a situation of cloistered power relations, in terms of power being in the hands of the few. And as has been rightly pointed out, while we look at openings in terms of the state becoming more democratic, we’re also dealing with the monopolisation of power through data by big platforms, big data. In other words, we’re dealing with the other power – business power – which also raises issues of accountability.
Thirdly, I think from a rights perspective… We have so many norms already, as have been raised, in terms of civil, political, economic, social, and cultural rights – the covenants, the Universal Declaration, et cetera. You’re not lacking in wording. And most Asia-Pacific countries are parties to at least two or three of them, and everyone is a party to the Convention on the Rights of the Child, for example, apart from the United States, which is on the other side of the Pacific.
But all the other countries are parties, and there are guidances on digitalisation. But the normative perspective of human rights through these treaties consists of very general norms. And when we deal with digitalisation today, we’re having to deal with very technical issues of not only a rights kind, but of security and safety kinds, which means that we have to look a bit broader, to a series of other guidances of a more technical kind, which are not necessarily found in depth in the human rights instruments.
In other words, we have to look to, maybe, guidelines on artificial intelligence or transparency of algorithms, or where we go with automation. All of these are not concretised so much, in terms of details, in human rights instruments. So we have to complement the human rights instruments through and with other guidances from regional entities, et cetera. And of course, very importantly, in the multilateral perspective from the UN – and the UN is also ultimately heading to a compact on digital [cooperation] in the near future.
Fourthly, the issue of stakeholdership is very important. And of course, we like to advocate inclusiveness, inclusivity, and COVID has highlighted the lack of inclusion in terms of online gaps, et cetera. And yet the human rights phenomenon and the rights perspective in terms of the traditional caucus of people and stakeholders is a bit limited when we look at digitalisation today. In other words, we have to deal with scientists, engineers, AI people, ethicists, et cetera, who are not necessarily familiar with some of our work. And we need that complementarity in terms of alliance building for the future. Why? Because as we all know, in the near future or now, we are having to deal with non-humans. And our children and next generations will have to deal with non-sentients, inanimate actors who are not humans. And so that’s a challenge in terms of stakeholdership as well as accountability.
And the fifth initial context is to enable us to look not just to law, which tends to be the option of many countries, but also to other entry points. It’s more than law and sometimes it’s not hard law in terms of the legislation that you want, but you want soft law in terms of guidelines on safety at work or safety on digital.
Today, I think what’s very interesting also is what we call self-regulation in terms of industry adopting various guidances. But does self-regulation really work is one of the issues today, and when there’s content moderation used by platforms, et cetera, is that really balanced or not, and then also the possibility of co-regulation.
Now, to the considerations. Number one, what is the meaning of digital rights? The narrow meaning is, of course, freedom of speech and respect for privacy. But I think what’s been rightly said is about the broader perspective of economic [and] social rights as well as civil and political rights.
Secondly, freedom of expression is critically important. But freedom of expression is not absolute and we need to understand that the limits on freedom of expression also have limits, particularly the three guidances from the UN system – legality, necessity, and proportionality – if you are going to limit the right to freedom of expression. And we are having to deal with too many laws on this front in terms of online constraints and censorship and the like.
Thirdly on privacy, there are personal data protection laws in many of these countries. But the question is do we really implement the respect for consent in real terms at the national level? Why? Because there is always the exception of national security. So we have to try and make that transparent in terms of real protection rather than the advocacy of national security to override human security or digital security for ourselves.
Fourthly, the issue of hate speech has been raised. I’m not going to say much here, but the solution is not necessarily more law. There are too many laws already. The solution is many entry points like liberal education, counter speech, contractual obligations through platforms and the like, as well as empathetic regulation and self-regulation.
Fifth, to deal with algorithmic transparency and explainability of AI, and that needs some guidance in terms of technicality.
Sixth, the very important issue of protection of vulnerabilities – children, women, et cetera, and particularly today, now the issue of targeting of children for adverts and abusive consumerism.
Seventh, digital security. Eighth, workers’ rights in terms of digital respect for gig workers, as well as the right to disconnect. Ninth, green technology and the like. And then tenth, the access issue that’s been raised today.
And if you really want an example of an exposition of digital rights, the latest [is] the European Union’s adoption of the Declaration on Digital Rights [and Principles]. But again, not to forget that we already have the normative framework in terms of human rights – but let’s broaden that in terms of stakeholdership and the technicalities that offer us real protection in terms of safety and human security.
What must changemakers do to really meaningfully address all of those issues that you have outlined?
Chat Garcia Ramilo: What can we do? I think there are many layers to what we can do. There is something we can do in terms of how we use technology, and that’s important. There are many in this community who will help us with digital security. We have people who have been involved in the process of advocacy. And I think it’s important to be able to really connect, because that really demonstrates the expanse and the pace of development that is happening; to be able to really understand and make a difference, we need to connect within this region and also globally. The policy process is multi-stakeholder. It has to be multilateral, it needs to involve different stakeholders, and there’s also the technical community aside from the private sector.
This is not a new issue but it is a more expanded issue […] I do think really being involved and reaching out is quite important.
The other thing I want to bring in, which was not really addressed, is how we also connect to the biggest existential issue of our day, which is climate change and environmental sustainability. That, I think, is one issue that we need to connect with. I would like to invite you all to read one of the things APC has done this year, [which was] to review what has changed in terms of advocacy for digital rights through our Global Information Society [Watch] publication. And there are thematic discussions on these various issues that also shed light on some of the issues that we’ve been discussing and the strategies that have been raised here today.
Helani Galpaya: I would say three things. One is to use technology in sophisticated ways and to use peer networks to build coalitions to keep yourself safe, and to use technology to fight the problem and to fight the good fight.
Number two would be to create the evidence base to really understand what works and to stop reinventing the wheel. So, for example, people are talking about digital and media literacy as one of the ways to counter misinformation. I think we all intuitively agree that that’s possibly very true, but we have very, very little evidence and certainly not systematic evidence. So we just need to gather that evidence, make it widely available, so that other people can improve their programming. So we do actually need to be somewhat evidence-based, not just value-based, because otherwise we’re reinventing the wheel continuously.
Third, as activists, I would say: think a lot more broadly about the forums you should engage in in order to influence change. I think we tend to go where we are comfortable – if you’re a researcher, you go to the more researchy ones; if you’re an activist, you go to where other activists are. That’s very useful. Many of us go to the Internet Governance Forum, and that’s great, but a lot more policy action is happening in the corridors of many other situations, like trade discussions, various other UN forums, and so on. We need to be there and ready to engage in those processes, otherwise it’s going to be a bit too late. So we need to think very broadly about where our points of influence are.
Vitit Muntarbhorn: I’ll give you a very quick 10-point toolbox.
Number one, good laws versus bad laws. I think we have too many bad laws in the Asia-Pacific region at the moment, but I’d like to see better laws which are compliant with human rights. And at this point in time, we want good data protection laws, personal data protection laws with constraints on the national security invocation, which is often exorbitant and excessive.
Secondly, we like good policies, and all countries have some development policies. They may need to integrate a lot of what we’re talking about much more into their national policies – in terms of empathy, digital security, digital access, digital protection, et cetera. And that is linked today, particularly, with the opportunities offered by the Sustainable Development Goals, which are the global policy on development until 2030.
Thirdly, good programming that’s protective. I work very much on the advocacy of digital security programming for NGOs and human rights defenders because there is a creeping jurisdiction of surveillance, particularly via some states and via some platforms also impinging on public space, democratic space.
Fourthly, good practices – maybe some court cases here and there, or other forms of advocacy. And we see a lot of this in Europe in terms of interventions by the European Commission, et cetera. I think ASEAN could be a bit more active in this region if it really wants to be human-centred.
Fifth, good mechanisms and personnel – the emergence of national digital authorities of an independent kind. We don’t really see this in Asia so much, but we’re seeing it a bit more in Europe. I’m not touting Europe necessarily; I’m just saying that there are some experiences we can learn from.
Sixth, good data and monitoring, to make things transparent, so to speak – in terms of gender sensitivity and various vulnerabilities, such as the targeting of children now being carried out by algorithms in terms of consumerism.
Seventh, education and capacity building, particularly digital education and capacity building.
Eighth, remedies if there are violations, and this is one of the critical issues of the business and human rights perspective, which is now very much advocated by the UN in terms of the state and the business sector having to offer some effective remedies. And very importantly, on the data-related issues today.
Ninth, space for development and networking, particularly with different stakeholders that we’re talking about here.
And then tenth, resources – not just money. I think we’re talking about digital resources, particularly access and the demonopolisation of control of those resources, together with the political will and the social will that comes through pro bono actors such as human rights defenders.