
Skoll World Forum Interviews: Spotlight on Disinformation

October 7, 2024

By Nighat Dad, Digital Rights Foundation, and Imran Ahmed, Center for Countering Digital Hate

Disinformation, conspiracy theories, hate speech, and other forms of abuse perpetuated online threaten to create increasingly impenetrable divides in our society. The need to tackle these multifaceted challenges grows more urgent by the day, especially in this “year of elections” as nearly half the global population casts ballots in dozens of consequential elections around the world.

Despite the problem’s magnitude, the Skoll Foundation is optimistic that social innovators can disrupt the systems that enable disinformation to take hold and poison our civic discourse. This week, as the EU DisinfoLab annual conference kicks off in Latvia, we are pleased to highlight two groups tackling disinformation from distinct yet complementary angles.

First, there’s the accountability angle. The Center for Countering Digital Hate aims to hold social media and other platforms economically accountable for harms created by the spread of disinformation. It is pursuing legislation that would create transparency requirements and accountability mechanisms to rein in “turbocharged” disinformation pathways.

Meanwhile, Digital Rights Foundation (DRF) reminds us that even as online platforms exacerbate the problem, they also serve as a lifeline for marginalized groups, including women, LGBTQ+ people, and many others. Due to overt discrimination, harassment, and abuse in physical spaces, “some communities around the world have nowhere else to go,” says Founder and Executive Director Nighat Dad. “They have only these online spaces.” Thus, DRF takes a gender lens to online safety and seeks to reclaim digital spaces.

Watch the interviews or read the full transcripts below to learn more about how these innovators are working to make the internet a less divisive place for us all.

Imran Ahmed is the founder and CEO of the Center for Countering Digital Hate. He is an authority on social and psychological malignancies on social media, such as identity-based hate, extremism, disinformation, and conspiracy theories. He regularly appears in the media and in documentaries as an expert on how bad actors use digital spaces to harm others and benefit themselves, as well as how and why bad platforms allow them to do so. He advises politicians around the world on policy and legislation. Imran was inspired to start the Center after seeing the rise of antisemitism on the left in the United Kingdom and the murder of his colleague, Jo Cox MP, by a white supremacist who had been radicalized in part online, during the EU Referendum in 2016.

Nighat Dad is the founder and Executive Director of the Digital Rights Foundation. She is a member of the UN Secretary-General’s High-Level Advisory Body on AI (HLAB) and a founding member of Meta’s Oversight Board. Her organization works at the cutting edge of the intersection of human rights and technology. Nighat has won numerous accolades, including the Dutch Human Rights Award, and has been a TED Fellow since 2017. She is a staunch advocate for digital rights and women’s rights for the global majority.


Transcript of Imran Ahmed Interview


Host:
Welcome to Role Models for Change, a series of conversations with social entrepreneurs and other innovators working on the front lines of some of the world’s most pressing problems.

Peter Yeung:
Hello, Imran, and thank you very much for joining us today at the Skoll World Forum.

Imran Ahmed:
It’s my pleasure.

Peter Yeung:
Yeah, I’m really interested to hear more about your work today, but just to begin with, can you introduce yourself and explain basically the work that you’re doing?

Imran Ahmed:
So my name is Imran Ahmed. I’m chief executive and founder of the Center for Countering Digital Hate. We were set up about four years ago to try to find ways to reduce the prevalence of hate and disinformation on social media platforms, which currently find it profitable to push that content to billions of users, by creating costs for inaction. So how do we create a market and regulatory solution to the market and regulatory failure that has led to the information crisis that we’re in today?

Peter Yeung:
Yeah, there’s an interesting term that you use, “information crisis,” and I suppose, how widespread is this problem now? And to any sort of normal citizen of England, of America, how likely is it and how much is it impacting them?

Imran Ahmed:
So let me explain information crisis in words that normal people would understand. Lies and hate, both of which are wrong at a simple moral level. Now, there have always been lies and hate in our society, but the problem is that they’re being turbocharged by social media platforms, which actually find that content to be sticky and addictive, whether it’s conspiracy theories that lead people down rabbit holes or it’s hate that makes people feel anxious and therefore makes them respond, engage, get into arguments online. What the platforms have worked out is that it keeps people on platform. It’s the ultimate addictive, sticky content.
But what that does over time, when we’re confronted with it every day, is start to resocialize our perceptions of the world around us. And that’s the problem we’re in. The primary lens that we use to find information, to maintain our relationships, to transact business, to establish our social mores, even to establish the corpus of information that we call facts, is a distortive lens, distorted for commercial purposes. And I think that’s the problem we’re in: the main means by which we get that information is now a highly distortive environment.

Peter Yeung:
And I suppose then, at what point were you inspired? Was there a moment that happened in particular, or why exactly did you begin to start addressing this problem?

Imran Ahmed:
So I started CCDH seven years ago. I was working for the Labour Party at the time, and three things happened that really shook me to the core. One was the rise of antisemitism in the party that I was serving, which was outwith the traditions of that party, an anti-racist party that believes in tolerance. The second was the Brexit referendum, in which I was advising our leader on the referendum, Alan Johnson. And I could see the scale of anti-Muslim, anti-Black, and conspiracist lies that were being spread in digital spaces and how they were affecting the way that people were interacting with us on the doorstep. And then it was the murder of my colleague Jo Cox, a mother of two, who was murdered by a man who’d been simmered in lies and hate online. And what became clear to me in that moment was that conventional institutions had failed to understand that the main way in which we share information, find out about the world around us, negotiate our values, our norms of attitude and behavior, even decide what facts are, had shifted to online spaces.
And those spaces were run to rules that we simply didn’t comprehend, and that actually advantaged hate and disinformation over tolerance and good information. And we had to deal with that problem, which was not just affecting one particular political party; it was across the spectrum in the UK. And in 2016, in that summer, I could look overseas and see what was happening in America with the rise of Trump, with Orban, Bolsonaro, Duterte, things happening simultaneously around the world. And it isn’t contingent circumstances that drive that, it’s not one person; that’s a fundamental shift in something. So if you saw icebergs breaking off and sort of melting all over the place, you’d initially think, gosh, what’s wrong with that iceberg? But eventually you’d realize it’s because the temperature’s increasing. The same is true of our information ecosystem. The temperature has slowly been increasing, and these icebergs melting and breaking off were Brexit, the rise of Trump, the murder of my colleague, and then, as we launched publicly, we realized so much more.

Peter Yeung:
And yeah, I suppose, not to mix too many metaphors, but with those rising temperatures it’s perhaps the frog in the pot that hasn’t realized; at least, society and institutions have been slow to react to these changes.

Imran Ahmed:
When you increase the amount of hate and lies in the world, it has this effect of making us more suspicious of other people, of the nonsense that they might believe. There’s a sort of, I try to avoid using this term because it sort of identifies me as a former pseudo-intellectual, but it’s Hobbesian. It’s creating a system in which we can’t trust our neighbor. And when we can’t trust our neighbor, we start to think, well, how do we defend ourselves against our neighbor? And that’s increasing the amount of polarization, the negative partisanship, the brittleness of our society. It’s the ease with which it fractures in unpredictable and potentially devastating ways. And I think that what it’s leading to is a politics that is incapable of sustaining the values that are required for democracy. And my fear is that we don’t have much time left, because eventually the frog will be boiled.

Peter Yeung:
And I just suppose, so we’ve obviously established here, that is an enormous problem. And how have you really gone about trying to solve this and reduce this mass surface of disinformation and hate?

Imran Ahmed:
We’ve had three phases as an organization, and keep in mind we’re very young, but the three phases were one, to help people to understand that online harms have an offline effect. And when I started this work, I got told, you are mad. This is online stuff, this isn’t real. But I don’t think anyone believes that anymore. The second thing we had to do was persuade people that tech platforms, social media platforms weren’t just cool tech guys from San Francisco who wanted the world to be a better place and believed in liberal values, but they were rapaciously greedy and were willing to sell out our democracy, the safety of our children, their mental health, for money, and my God, how much money. Mark Zuckerberg’s younger than I am. He’s made a hundred billion dollars from what is probably one of the world’s greatest business models ever.
And the third thing we had to do was then build the political consensus to create costs for that hate and disinformation. If our contention is right, that hate and disinformation are profitable, how do you make them less profitable? Now we do that through two primary mechanisms. One is that we give information to advertisers, who are the main source of revenue (98% of the revenues of Meta, for example, come from advertising), to help them make moral decisions about where they place their advertising. And we’re really successful at doing that.
So successful that Elon Musk even sued us, because we were so effective in identifying the rise in hate on his platform, and he lost a hundred million dollars of advertising as a result. The second way we do that is through regulation. And that regulation is where we’re able to balance freedom of speech, which is vital, with transparency and accountability for those platforms, such that where they create systemic harms in a negligent way that causes damage to people, those people are able to seek restitutive justice through the civil justice system. And those are the two ways in which we’re making hate and lies less profitable.

Peter Yeung:
And I suppose, yeah, I obviously was intending to touch on the Musk debacles. As such, I’d just be interested to know, I suppose, were you surprised with how things unfolded? And I suppose, what’s the current situation and fallout of that?

Imran Ahmed:
Well, look, I mean with Elon Musk, when he took over the platform, he put up the bat signal, did he not, to hate actors and to disinformation spreaders saying, welcome back on the platform, I like what you say. He indicated that through his own behavior, he let tens of thousands of people who’d been banned by the platform back on, including people like Andrew Anglin, America’s leading neo-Nazi. And then they stopped enforcing the rules. They sacked hundreds and hundreds of their content moderators, the people who were keeping the platform safe, and we quantified what that meant in real terms. So we found that there was a 202% increase in the use of the N-word, so a tripling in the use of the N-word in the week after he took over X, compared to the year before on a daily basis. And when we did that research, obviously it was sensational data.
It went on the front page of the New York Times, and advertisers took action, unsurprisingly, because a lot of the senior executives there are African Americans. And they were like, well, no, this isn’t tolerable. So he lost a hundred million dollars in advertising, and he then sued us, not for defamation, because our data was good. He sued us for the act of doing research, and for tortious interference with his advertising business. The problem is that you can’t sue people for doing research. And so that lawsuit, which actually cost us half a million dollars to defend, was tossed out by a judge at the first opportunity, and it was designated a SLAPP suit. And I’m really proud of that, because I’m the first person that’s been able to prove in a court of law that Elon Musk is not a free speech advocate. He’s a thin-skinned censor who’s willing to shut down speech using strategic litigation because he can’t take being criticized. And I think that was worth the half a million it cost us.

Peter Yeung:
Certainly not mincing your words there.

Imran Ahmed:
No, no.

Peter Yeung:
I suppose it’d be interesting to know, more broadly, when it comes to dealing with these big social media tech conglomerates, how you’ve found that. Has it generally been quite a combative situation? Has there been, to some extent, cooperation from their side?

Imran Ahmed:
No, I mean, the companies, like I said, it’s one of the most successful business models in history. Why would they change what they do? And so the question for us is, how do we hold them accountable? Now look, in politics 101, how do you stop mad or bad leaders? Checks and balances, right? And so we are a check and a balance. We provide evidence and then people can take action based on it, which actually reduces the profitability of hate and disinformation. But the problem is you are taking on the world’s biggest companies, and you need the scale to be able to do that, the resources to be able to defend yourself when they inevitably fight back. As in the fight against big tobacco and the fight against big oil, we are often fighting against big oligopolistic industries.
The tech accountability sector is tiny, and I’m really, really pleased that Skoll is funding this sector and funding CCDH and work like ours because it’s vital to be able to take them on. But I do think that we are in trouble because they are spending an enormous amount of money, that their reaction to our reaction to them is to massively increase their spending on lobbying, massively increase their spending on PR to put up all the defenses that you would expect them to put up. And for us to overcome them, to be able to adapt to their defenses is going to take skill, resource, and people power.

Peter Yeung:
And presumably also the input and support of political institutions as well.

Imran Ahmed:
Look, I think political institutions have a part to play. I don’t think politicians should be telling us what we can and cannot say. I’m a liberal. I believe that we have the right to be wrong. However, you don’t have a God-given right, or a constitutional right, to profit from lies and hate. And so what I want them to do is actually be enabling. I need enabling legislation for transparency: transparency of the algorithms, transparency of the content enforcement policy, transparency of the advertising, so that we can hold them meaningfully accountable, in alliance with government, with regulatory bodies, and with select committees and congressional committees, based on transparent data. And then have mechanisms so that where they do create harm they can be held economically responsible, whether that is through regulatory fines or, when it’s done en masse to large swathes of society, through class actions or other mechanisms.

Peter Yeung:
I suppose, obviously, you gave the example of this case with Elon Musk, and the lawsuits, and your success in, as you put it, defunding those benefiting from this hate. But are there other ways that you quantify and measure success in your goals and objectives? And if so, what are they?

Imran Ahmed:
Look, we have an immediate crisis, and so I am not willing to wait around for politicians to finally get it. And so there’s a lot of work to be done inoculating people against disinformation, getting public information out there, not just to the general public, but also to civil society, to our partners, our fellow grantees, for example, climate change people, people working on sexual and reproductive rights, on how best to counter disinformation, whether that’s inoculation or offline mechanisms. And so we are students of lies and hate, and of how they spread in the modern world, and of how to combat them in the most effective way possible. We participate fully in civil society’s resistance against the damage that’s being done. But it doesn’t matter if you work on climate or anything else, disinformation is affecting you right now. And so what we’re trying to do is ensure that we act as a nexus for collective understanding and collective action, to deal with this information ecosystem and the malignant ways in which it affects our interests.

Peter Yeung:
I suppose I would just be interested to know, obviously you have made big strides and it’s only been a relatively short amount of time up until now, but what are the difficulties that you’ve faced more broadly, I suppose obviously we spoke about the relationship with those companies, but I suppose what kind of obstacles are there to you sort of reaching that objective or those objectives that you’re seeking to reach?

Imran Ahmed:
Look, I think there are three problems that we have. The first problem is human and spiritual, which is how do people lead when they’ve never led before? How do they find it inside themselves to master both the technical skills, but also the evolution that they need to go through as a human being? Four and a half years ago, I’d never managed anyone, and now I run an organization with 31 staff that’s growing really fast. The second is technical: how do I have the skills to structure the organization, to ensure that we have the best possible operations, to understand how to create a scalable institution which can meet the demand that’s out there?
And the third problem that we have is money. And you’ve got individual challenges, organizational challenges, and then existential challenges, which are, do we have the cash to do it? And so we are constantly being tested in ways that I think very few other types of organizations are, because there isn’t necessarily a direct connection between the quality of work we produce and revenues. And so being able to balance those is a daily challenge. And then to be a human being while doing it, that’s a tough one. It’s a tough one.

Peter Yeung:
And I suppose, just on that last point of the funding, have you explored, or I’d be interested in your thoughts on, different models, such as those tech companies being required to contribute money towards this work. I’m sure you wouldn’t necessarily be opposed, but is that something that you’d argue for?

Imran Ahmed:
Look, we don’t take money from governments and we don’t take money from social media companies. And what’s really important is that we’re an independent voice that’s able to provide unvarnished advice without fear or favor to any government or to any platform, but also that we can criticize them when we need to. However, I do think it’s important that platforms play their role in allowing access to data and being willing to react in an open, honest way, not just following the letter of the law, but the spirit of the law as well. We’ll continue to provide constructive advice to them, but I do think that taking money from those platforms is a poisoned chalice.

Peter Yeung:
And I suppose, to what extent do you think that in these years that you’ve been working, that at least the situation has improved to an extent? Of course there’s always evolving problems that need to be tackled, but to what extent do you think that there have at least been improvements in the landscape and a better acknowledgement of the issue at hand?

Imran Ahmed:
I was the first witness to give evidence to the British Parliament for the Online Safety Bill, in September 2021. And in October 2023, it became law. We’ve been instrumental in providing the data that drove the Digital Services Act. I’m giving evidence in Ottawa shortly for Canada’s new online safety laws. In the U.S., I’ve given evidence to Congress, and I’ve seen the perceptions of what’s possible change radically over the last four years. I think that there is progress being made, but my fear is that it’s not fast enough. In 2024, over 2 billion people around the world are going into elections. And I’m old enough to remember when Francis Fukuyama wrote The End of History and said that liberal democracy had succeeded and that it was over. Well, now in 2024, we find ourselves at a tipping point where we could tip into authoritarianism and a digitally led populism very, very easily. And my fear is that before we’ve had the chance to fix this crisis, it will have overwhelmed us. And in that respect, it does remind me of the climate crisis. There is a window for action, and it’s closing.

Peter Yeung:
Just one last point then, Imran, to do with looking ahead to the future. I’d be interested to know how exactly you would describe an ideal kind of ecosystem in the future, and what exactly it would look like when it comes to minimizing as much of that harm as possible.

Imran Ahmed:
I actually believe in free and open discourse, run to rules that allow every voice to be heard and fully actualized, and that allow individuals to feel welcome to contribute. A safe space for abusers is an unsafe space for the victims, and for normal people who don’t want to be somewhere where people are screaming the N-word every two seconds. And so I think it’s really important that we have open spaces for discourse, but that does mean a rules-based order for social media platforms that’s enforced, that has sanctions attached to it if people violate them, and that doesn’t amplify one side over the other, hate and disinformation over good information and tolerance. And I do believe in the virtues of humanity. I think that we all individually want a better world. And I think, for the main part, those of us who see that happening through tolerance, peace, science, and democracy outnumber those who don’t. So, give us a level playing field and we can win this battle.

Peter Yeung:
Brilliant. Thank you so much for your time here, Imran.

Transcript of Nighat Dad Interview


Introduction (00:01):
Welcome to Role Models for Change, a series of conversations with social entrepreneurs and other innovators working on the front lines of some of the world’s most pressing problems.

Peter Yeung (00:13):
Hello, Nighat. Thank you very much for joining us today at the Skoll World Forum. We are really looking forward to speaking to you today. Just to begin with, nice and simple, could you introduce yourself and tell us a little bit about the work that you’re doing?

Nighat Dad (00:27):
My name is Nighat Dad. I’m from Pakistan. By training, I’m a lawyer, but I work on digital rights, and that’s why I started an organization called Digital Rights Foundation in 2012. I’ve been doing this work for more than a decade now, and my work is basically enabling the voices of marginalized communities in the online space: voices that can be heard by other people, but also an online space that is safer for them to raise their voices.

Peter Yeung (01:02):
Obviously, that’s a very important topic at the moment. I just wonder, could you describe what exactly are those issues that you’re trying to solve? What is the problem exactly?

Nighat Dad (01:13):
I actually started working on digital rights issues from the perspective of gender. I’m somebody who is very passionate about women’s rights, or rights from a gender lens, in my own country, Pakistan, but not restricted to Pakistan. There is something that we share in South Asia, which is these very closed spaces for women, for marginalized groups, for sexual minorities, for religious minorities, and the cultural norms and patriarchy kind of suppress these communities.

(01:52):
My work is basically not only to enable those voices, but to support these communities when they face abuse and harassment and hate and now targeted disinformation when they say something, even if it’s a very normal thing, just talking about themselves or talking about pleasure or accessing information. Even on small little things, they face abuse because of the stereotypes against them, which are entrenched in the society.

(02:33):
I mean, basically, it’s addressing online abuse against women but also protecting the right to privacy, speaking out against the surveillance and monitoring that these communities face, and not only providing them a safer space and enabling a safer online environment for them, but also holding powerful actors accountable. Sometimes those are bad actors. Sometimes those are both powerful and bad.

(03:09):
And how we hold them accountable is basically to collect evidence and data, analyze that data, and then form our advocacy based on evidence. It’s evidence-based advocacy and lobbying. Telling governments where they’re wrong or where the gaps in their work are, and how they can address regulatory gaps or challenges. But also telling platforms, which may be your social media platforms, right? And now we have generative AI as well, which is adding to these existing problems. And telling them which systems, which policies, which of their solutions are not working for our communities. And then holding them accountable: they are not only platforms for the people of the global north. They’re platforms for the people of the world. And keep reminding them that we are not just the global south or the rest of the world. We are the majority world.

Peter Yeung (04:18):
I suppose you sort of divided up those various aspects of working with governments and platforms as well. I wonder, do you see that work as equally divided? Or are you prioritizing a specific aspect in particular?

Nighat Dad (04:35):
Working with the government or working with the platforms in a way that holds both of them accountable doesn’t necessarily mean that you directly work with them; it is sometimes in indirect ways, telling them, “Your systems are not working.”

(04:51):
But I believe … And that was not always me. Maybe a decade ago, I was an activist shouting from the edges and telling these powerful actors, “You are doing something wrong, and it’s affecting my community.” But over the years, I have realized that, if you really want to provide relief to your communities, you have to work with the ecosystem. You have to contribute. And there are frustrations on the way, and there are times when you’re like, “Things are not working. And what am I going to do?”

(05:30):
But then you look at your community, and you’re like, “Okay. A survivor or victim of abuse reached out to me, and I escalated their complaint to a certain platform, and they took action.” That was a little success for me, and it worked for my community. I wanted to see how we can broaden that work, how we can do it at a scale where we are not only working on those escalations, working with victims and survivors and then telling companies to take down content or look into the abuse from an individual perspective, but changing platforms’ policies and their responses at scale, which not only works for Pakistani victims and survivors of abuse but works for anyone in Latin America, for anyone in Bangladesh or India or Sri Lanka or Nepal.

(06:28):
That’s how now I look at the system and ecosystem because I believe that these companies are here to stay. They are powerful actors.

(06:38):
And also, I think another way to look at these platforms is that some communities around the world have nowhere else to go. They have only these online spaces where they can talk, where they can make connections, where they can find their networks, where they can find sisterhood and solidarity. And I’m talking about women from tribal areas in Pakistan who are not allowed to go out of their homes, but who somehow have access to maybe Facebook or TikTok. And no matter how much we dislike these companies, I know that some of these communities thrive on them because they have no other space.

(07:27):
I think that’s why it’s very important for me to work with the ecosystem. Maybe it’s not equal. Maybe sometimes some governments respond better, like Europe, in terms of the Digital Services Act. Maybe some other oppressive regimes don’t respond in a good way when it comes to regulations, and I don’t want them to make bad regulations. Maybe I want to work with the companies so that they do better when they are responding to the communities, because sometimes governments are not doing better.

Peter Yeung (08:01):
Right. Exactly. And it’s interesting to see how you’ve made that shift and learned over time. And I just wonder as well, just to touch on that, when you said that earlier on you took more of an activist approach. Can you just explain … I suppose, what exactly inspired you to go into this line of work? From personal experience, was there a particular moment? Or was it a building up of experiences?

Nighat Dad (08:31):
It was my own experience, and I was not always an activist. I don’t come from a privileged background. I studied law, though; I was the first woman in my family to do so, and I belong to a village in Punjab, Pakistan. And I had to fight to go to the university to do my law degree. And I knew what patriarchy does to women.

(09:00):
And when I became a lawyer, I actually started fighting cases of women who were fighting for maybe custody of their children, or divorces, or their right to inheritance, and all of those things. And I saw how the system was structured against women. The criminal justice system was flawed. I faced it myself … Even as a lawyer, I faced unfair processes within the criminal justice system when I was fighting a custody case for my own son. And that’s where I realized that this is not only my fight. This is a bigger fight.

(09:48):
And from that mainstream women’s rights work, I thought it was important for me to look into the new spaces, the spaces where women were finding their voice, and those spaces were online spaces. Access to mobile technology. Not even smartphones. I’m talking about feature phones back then. Facebook was not a thing. There was Orkut. These new online spaces were emerging, and I found women who were not allowed to step out of their homes finding those spaces. And I thought it was important to explore these spaces and how we can make them safer.

(10:30):
But also, women started reaching out to me talking about how they’re facing harassment, how they’re facing monitoring and surveillance from their own families around their devices, how they are being victim-blamed when somebody else harasses them. And it’s all patriarchy. It’s all misogyny.

(10:48):
That pushed me to look into this new space. It was my personal experience, but then I decided to fight for every woman in my country, and I wanted us to reclaim this space so that no one could snatch it from us: this space where we find solidarity and where we find our sisters and where we find joy and where we find pleasure and where we also exercise our fundamental rights of freedom of expression and access to information.

Peter Yeung (11:24):
Well, again, it’s some amazing achievements that you’ve made so far. And I just wondered, to what extent are you now happy about the setup that you’ve created and the framework that you now have to help respond to and minimize that harm?

Nighat Dad (11:39):
Can I just say I’m very proud? I am very proud of the way me and the women around me and the feminists who have led this work, and are still leading this work, have paved the way for other organizations and feminist movements to mainstream this discourse that our online rights are as important as our offline rights. And this mainstreaming of the discourse actually led to our voices being heard in important global spaces.

(12:19):
I’m now part of the UN Secretary-General’s High-Level Advisory Body on AI, advising the UN on how we can shape global governance of AI. Having voices like mine, and I hope there will be more voices like mine from the global majority world, in these rooms and important spaces where we can shape policy, the global policy, not only national …

(12:45):
I’m very proud of what I have achieved, and of course there were failures, and there were moments when I cried, and there were moments when I thought of giving up. But then one woman from a survivor community comes to me and tells me that my work matters, and the helpline that I lead in Pakistan, the Cyber Harassment Helpline … When one woman tells me that it matters to them, that’s a success for me, and it makes me so proud.

Peter Yeung (13:22):
I suppose we spoke about how widespread some of those issues are for women facing harassment online and, I suppose, various forms of abuse, but can you just explain what kind of impacts they have on these women, and generally on all women users of the internet, and how that affects them? Some might just not feel as confident anymore and just want to stop using networks altogether.

Nighat Dad (13:52):
I have seen different layers of impact on these women. I have seen women departing from online spaces and not coming back. I have seen women facing a mental toll and not being able to say anything about it because of the patriarchy around them. I have seen women committing suicide, and I have seen women being killed in the name of honor.

(14:35):
And I think that’s why it matters to me that this work should keep going on. There are women, very bold women, like Qandeel Baloch, who was a social media celebrity and reclaimed online space with her agency and ownership. And I kind of relate to her because she was from a village. She made her life herself. She was a single mother. She spoke the same language, Punjabi. And eventually she was killed by her brother because of how she was using online space.

(15:16):
And I think that’s why I feel this work is so important, and how it’s impacting women in my country who are just trying to find their voice and who are being killed by their own brothers and fathers.

Peter Yeung (15:36):
Well, as you say, it’s really … Some people sort of underestimate digital spaces, as if because it’s online there’s no harm, but really, as you point out, it is a question of life and death for a lot of women there.

Nighat Dad (15:47):
It’s a question of life and death. There’s so much of a chilling effect from these killings that many women who are not privileged just don’t dare to come into these spaces. And that’s not what my constitution says. It gives us equality. Where is that equality when it comes to the society and when it comes to our own loved ones who do not allow us … Who do not trust us. And I think that’s endemic in a society like Pakistan.

(16:36):
And it’s not just … I don’t think that it’s only limited to Pakistan. There’s so many societies around the world, and I think that’s why it’s so important for me to talk about them and so important for me to work with everyone so that these communities, these people, these women keep finding their voices.

Peter Yeung (16:55):
Exactly. And as you say, whilst the problem is everywhere across the world, at least with the model that you’ve come up with, it can be applied to other places. Like you said, Latin America is another region. To what extent, and how, do you adapt the model that you’ve pioneered in Pakistan to other areas? What things can you simply move across? And what exactly do you need to tailor to the region?

Nighat Dad (17:25):
I think, in my work, I have realized that there is no one solution to this complex problem because there are so many actors involved. These are platforms. These are governments. There are people from your own society, bad actors, who actually abuse and harass and troll and bully.

(17:49):
What kind of solutions can we propose that address all of this? It cannot be a simple solution. It cannot be just making regulation. That is part of the solution, but I think people sometimes do not understand what communities need when these things happen. For instance, blackmailing over intimate images. Non-consensual use of intimate images. When that happens, what does a survivor need? They call us on our Cyber Harassment Helpline, and they’re like, “Can you please do something right now?” That urgency is something that people really don’t realize; there are solutions at different steps.

(18:36):
How, as a first responder, you are responding to them … Of course, this should be something that tells them, “You are not alone, and we will look into this.” And there are things that we can figure out. But of course in a society where there is a fear of, “What if my father comes to know about those viral pictures? Or my brother or my family or my cousins, who think that they have a right to hurt me or kill me or to stop me from going to university?” The survivor’s requirement is, “Please stop this.” You have to address that.

(19:14):
The hotlines and helplines address that, but then how do you make systemic change? The data that you collect, the gaps that you see as a helpline or as people working in these spaces … You collect that and see the pattern across different jurisdictions. Maybe the helpline is working for Pakistan. And actually so many other people, women-led organizations, have replicated this model in Latin America, in Sri Lanka, in India, because it is working for them.

(19:48):
Someone, a victim or survivor, picking up a phone … They want to hear a human voice that can actually tell them that it’ll be fine. Not an online chat or a bot. That model would not work for my country because there is so much illiteracy.

(20:07):
There are community solutions coupled with global solutions. That could be regulations. That could be the ways to hold companies accountable. Making their policies better. All those things. But I think community solutions are so important. And that is, I feel, the way to start, basically. Listen to the communities and focus on their solutions, but then devise policies and devise your global advocacy and lobbying based on what communities are facing on the ground.

Peter Yeung (20:45):
Looking at the steps ahead, obviously, as you pointed out, there are still ongoing challenges and new challenges with AI and developments with that. But for you, what are your next objectives? And what do you hope to achieve in the next steps?

Nighat Dad (21:02):
Letting people know more about the challenges that they are facing. I would say sometimes it’s innocence and ignorance, but also a lack of digital literacy. In countries where people are worried about their food and making ends meet, why would they care what AI is and what generative AI is and what these technologies are doing to their democracies?

(21:31):
And I think it’s so important to mainstream this discourse so that even a layperson and marginalized groups realize what opportunities are coming out of these technologies, but also what harms they could face, not only to themselves but to their democracies as well and to their ability to participate as active citizens.

(22:04):
A very small thing that I do … It’s a passion project very close to my heart. Even though I sit on big boards and do these global talks, I go back to my country, and I go to government schools, and I give sessions along with my team to the young girls and tell them what information and communication technology is and what kind of world they are going to deal with online, preparing them for how they can deal with the challenges of online misogyny and hate and abuse, but at the same time, what the opportunities are that they can utilize for their economic benefit, but also to enable their lives and make them really proactive citizens. And that makes me so happy.

Peter Yeung (23:00):
It sounds like that would be a very nice balance to have and very fulfilling work to add on there. And just one last question that I wanted to ask you. And perhaps it’s connected to that last passion project that you mentioned. What exactly would you describe as the world and, I suppose, the society that you are trying to build and that you’re working towards?

Nighat Dad (23:24):
It’s a loaded question, but in an ideal world, I would like to see a society where a young girl is actually building a technology on her own terms, deciding the rules of the game, and then empowering her community.

(23:50):
The world that I’m trying to build, I would say … I think it’s very idealistic, but I think we can achieve it. It’s a world where everyone has an equal say, has safe, accessible online spaces, and is not scared and fearful of saying things because of the bad actors and the powerful actors out there.

(24:21):
The world where women journalists have safe voices, and they can do their work without any chilling effect and fear of bad actors and powerful governments. The world where women human rights defenders keep defending others without being intimidated and threatened.

Peter Yeung (24:44):
Thank you so much for your time today.

Nighat Dad (24:46):
Thank you.
