Host:
Welcome to Role Models for Change, a series of conversations with social entrepreneurs and other innovators working on the front lines of some of the world’s most pressing problems.
Peter Yeung:
Hello, Imran, and thank you very much for joining us today at the Skoll World Forum.
Imran Ahmed:
It’s my pleasure.
Peter Yeung:
Yeah, I’m really interested to hear more about your work today, but just to begin with, can you introduce yourself and explain basically the work that you’re doing?
Imran Ahmed:
So my name is Imran Ahmed. I’m chief executive and founder of the Center for Countering Digital Hate. We were set up about four years ago to find ways to reduce the prevalence of hate and disinformation on social media platforms, which currently find it profitable to push that content to billions of users. We do that by creating costs for inaction. So how do we create a market and regulatory solution to the market and regulatory failure that has led to the information crisis that we’re in today?
Peter Yeung:
Yeah, there’s an interesting term that you use, information crisis. How widespread is this problem now? And for any ordinary citizen of England or America, how likely is it to reach them, and how much is it impacting them?
Imran Ahmed:
So let me explain the information crisis in words that normal people would understand: lies and hate, both of which are wrong at a simple moral level. Now, there have always been lies and hate in our society, but the problem is that they’re being turbocharged by social media platforms, which actually find that content to be sticky and addictive, whether it’s conspiracy theories that lead people down rabbit holes or hate that makes people feel anxious and so makes them respond, engage, get into arguments online. What the platforms have worked out is that it keeps people on platform. It’s the ultimate addictive sticky content.
But what that does over time, when we’re confronted with it every day, is start to resocialize our perceptions of the world around us. And that’s the problem we’re in. The primary lens that we use to find information, to maintain our relationships, to transact business, to establish our social mores, even to establish the corpus of information that we call facts, is now a distortive lens, and it is distorted for commercial purposes. That’s the problem we’re in: the main means by which we get that information is now a highly distortive environment.
Peter Yeung:
And I suppose then, at what point were you inspired? Was there a particular moment, and why exactly did you begin addressing this problem?
Imran Ahmed:
So I started CCDH seven years ago. I was working for the Labour Party at the time, and three things happened that really shook me to the core. One was the rise of antisemitism in the party that I was serving, which was outwith the traditions of that party, an anti-racist party that believes in tolerance. The second was the Brexit referendum, in which I was advising our leader on the referendum, Alan Johnson. I could see the scale of anti-Muslim, anti-Black, and conspiracist lies being spread in digital spaces, and how they were affecting the way that people were interacting with us on the doorstep. And the third was the murder of my colleague Jo Cox, a forty-one-year-old mother of two, who was murdered by a man who’d been simmered in lies and hate online. What became clear to me in that moment was that conventional institutions had failed to understand that the main way in which we share information, find out about the world around us, negotiate our values, our norms of attitude and behavior, even decide what facts are, had shifted to online spaces.
And those spaces were run to rules that we simply didn’t comprehend, rules that actually advantaged hate and disinformation over tolerance and good information. That problem wasn’t just affecting one particular political party; it was across the spectrum in the UK, and we had to deal with it. And in that summer of 2016, I could look overseas and see what was happening in America with the rise of Trump, with Orban, Bolsonaro, Duterte, things happening simultaneously around the world. It isn’t contingent circumstances that drive that, it’s not one person; it’s a fundamental shift in something. If you saw icebergs breaking off and melting all over the place, you’d initially think, gosh, what’s wrong with that iceberg? But eventually you’d realize it’s because the temperature is increasing. The same is true of our information ecosystem. The temperature has slowly been increasing, and the icebergs melting and breaking off were Brexit, the rise of Trump, the murder of my colleague, and then, as we realized once we launched publicly, so much more.
Peter Yeung:
And yeah, not to mix too many metaphors, but with those rising temperatures, perhaps society is the frog in the pot that hasn’t realized it. At the least, society and institutions have been slow to react to these changes.
Imran Ahmed:
When you increase the amount of hate and lies in the world, it has this effect of making us more suspicious of other people, of the nonsense that they might believe. There’s a sort of, I try to avoid using this term because it sort of identifies me as a former pseudo-intellectual, but it’s Hobbesian. It’s creating a system in which we can’t trust our neighbor. And when we can’t trust our neighbor, we start to think, well, how do we defend ourselves against our neighbor? And that’s increasing the amount of polarization, the negative partisanship, the brittleness of our society. It’s the ease with which it fractures in unpredictable and potentially devastating ways. And I think that what it’s leading to is a politics that is incapable of sustaining the values that are required for democracy. And my fear is that we don’t have much time left, because eventually the frog will be boiled.
Peter Yeung:
And so we’ve obviously established here that this is an enormous problem. How have you gone about trying to solve it and reduce this mass of disinformation and hate?
Imran Ahmed:
We’ve had three phases as an organization, and keep in mind we’re very young, but the three phases were one, to help people to understand that online harms have an offline effect. And when I started this work, I got told, you are mad. This is online stuff, this isn’t real. But I don’t think anyone believes that anymore. The second thing we had to do was persuade people that tech platforms, social media platforms weren’t just cool tech guys from San Francisco who wanted the world to be a better place and believed in liberal values, but they were rapaciously greedy and were willing to sell out our democracy, the safety of our children, their mental health, for money, and my God, how much money. Mark Zuckerberg’s younger than I am. He’s made a hundred billion dollars from what is probably one of the world’s greatest business models ever.
And the third thing we had to do was build the political consensus to create costs for that hate and disinformation. If our contention is right, that hate and disinformation are profitable, how do you make them less profitable? We do that through two primary mechanisms. One is that we give information to advertisers, who are the main source of these platforms’ revenue. Ninety-eight percent of Meta’s revenues, for example, come from advertising. We help advertisers make moral decisions about where they place their advertising, and we’re really successful at doing that.
So successful that Elon Musk even sued us, because we were so effective in identifying the rise in hate on his platform that he lost a hundred million dollars of advertising as a result. The second way we do it is through regulation, regulation that balances freedom of speech, which is vital, with transparency and accountability for those platforms, such that where they negligently create systemic harms that damage people, victims are able to seek restitutive justice through the civil justice system. Those are the two ways in which we’re making hate and lies less profitable.
Peter Yeung:
And yeah, I was obviously intending to touch on the Musk debacle. I’d be interested to know, were you surprised by how things unfolded? And what’s the current situation and the fallout from that?
Imran Ahmed:
Well, look, when Elon Musk took over the platform, he put up the bat signal, did he not, to hate actors and disinformation spreaders, saying, welcome back to the platform, I like what you say. He indicated that through his own behavior: he let tens of thousands of people who’d been banned by the platform back on, including people like Andrew Anglin, America’s leading neo-Nazi. And then they stopped enforcing the rules. They sacked hundreds and hundreds of their content moderators, the people who were keeping the platform safe, and we quantified what that meant in real terms. We found a 202% increase, a tripling, in the daily use of the N-word in the week after he took over X, compared to the year before. And when we did that research, it was obviously sensational data.
It went on the front page of the New York Times, and advertisers took action, unsurprisingly, because a lot of the senior executives there are African American. And they were like, well, no, this isn’t tolerable. So he lost a hundred million dollars in advertising, and he then sued us, not for defamation, because our data was good. He sued us for the act of doing research, and for tortious interference with his advertising business. The problem is that you can’t sue people for doing research. And so that lawsuit, which cost us half a million dollars to defend, was tossed out by a judge at the first opportunity and designated a SLAPP suit. And I’m really proud of that, because I’m the first person able to prove in a court of law that Elon Musk is not a free speech advocate. He’s a thin-skinned censor who’s willing to shut down speech using strategic litigation because he can’t take being criticized. And I think that was worth the half a million it cost us.
Peter Yeung:
Certainly not mincing your words there.
Imran Ahmed:
No, no.
Peter Yeung:
More broadly, when it comes to dealing with these big social media conglomerates, how have you found that? Has it generally been quite a combative situation, or has there been, to some extent, cooperation from their side?
Imran Ahmed:
No, I mean, the companies, like I said, it’s one of the most successful business models in history. Why would they change what they do? And so the question for us is, how do we hold them accountable? Now look, in politics 101, how do you stop mad or bad leaders? Checks and balances, right? And so we are a check and a balance. We provide evidence, and then people can take action based on it, which actually reduces the profitability of hate and disinformation. But the problem is that you are taking on the world’s biggest companies, and you need the scale to do that, the resources to defend yourself when they inevitably fight back. As in the fight against big tobacco or the fight against big oil, we are often fighting big oligopolistic industries.
The tech accountability sector is tiny, and I’m really, really pleased that Skoll is funding this sector, funding CCDH and work like ours, because it’s vital to be able to take them on. But I do think that we are in trouble, because their reaction to our reaction to them is to massively increase their spending on lobbying and on PR, to put up all the defenses that you would expect them to put up. For us to overcome them, to adapt to their defenses, is going to take skill, resource, and people power.
Peter Yeung:
And presumably also the input and support of political institutions as well.
Imran Ahmed:
Look, I think political institutions have a part to play. I don’t think politicians should be telling us what we can and cannot say. I’m a liberal. I believe that we have the right to be wrong. However, you don’t have a God-given right, or a constitutional right, to profit from lies and hate. And so what I want them to do is be enabling. I need enabling legislation for transparency: transparency of the algorithms, transparency of the content enforcement policies, transparency of the advertising, so that, in alliance with government, with regulatory bodies, and with select committees and congressional committees, we can hold the platforms meaningfully accountable based on transparent data. And then we need mechanisms so that where they do create harm, they can be held economically responsible, whether through regulatory fines or, when harm is done en masse to large swathes of society, through class actions or other mechanisms.
Peter Yeung:
You gave the example of the case with Elon Musk, the lawsuits, and your success in, as you put it, defunding those benefiting from this hate. But are there other ways that you quantify and measure success against your goals and objectives? And if so, what are they?
Imran Ahmed:
Look, we have an immediate crisis, and so I am not willing to wait around for politicians to finally get it. There’s a lot of work to be done inoculating people: getting public information out there, not just to the general public but also to civil society, to our partners, our fellow grantees, for example, people working on climate change or on sexual and reproductive rights, on how best to counter disinformation, whether that’s through inoculation or through offline mechanisms. We are students of lies and hate, of how they spread in the modern world, and of how to combat them in the most effective way possible. We participate fully in civil society’s resistance against the damage that’s being done. It doesn’t matter whether you work on climate or anything else; disinformation is affecting you right now. And so what we’re trying to do is act as a nexus for collective understanding and collective action, to deal with this information ecosystem and the malignant ways in which it affects our interests.
Peter Yeung:
Obviously you have made big strides in a relatively short amount of time, but what difficulties have you faced more broadly? We spoke about the relationship with those companies, but what other obstacles stand between you and the objectives you’re seeking to reach?
Imran Ahmed:
Look, I think there are three problems that we have. The first is human and spiritual: how do people lead when they’ve never led before? How do they find it inside themselves to master both the technical skills and the evolution that they need to go through as a human being? Four and a half years ago, I’d never managed anyone, and now I run an organization with 31 staff that’s growing really fast. The second is technical: how do I have the skills to structure the organization, to ensure that we have the best possible operations, to understand how to create a scalable institution that can meet the demand that’s out there?
And the third problem that we have is money. So you’ve got individual challenges, organizational challenges, and then existential challenges, which are: do we have the cash to do it? We are constantly being tested in ways that I think very few other types of organizations are, because there isn’t necessarily a direct connection between the quality of the work we produce and revenues. Being able to balance those is a daily challenge. And then to be a human being while doing it, that’s a tough one. It’s a tough one.
Peter Yeung:
And just on that last point of funding, have you explored, or what are your thoughts on, different models, such as those tech companies being required to contribute money towards this work? I’m sure you wouldn’t necessarily be opposed, but is that something you’d argue for?
Imran Ahmed:
Look, we don’t take money from governments and we don’t take money from social media companies. That’s really important: it means we’re an independent voice that’s able to provide unvarnished advice, without fear or favor, to any government or any platform, and that we can criticize them when we need to. However, I do think it’s important that platforms play their role in allowing access to data and being willing to react in an open, honest way, following not just the letter of the law but the spirit of the law as well. We’ll continue to provide constructive advice to them, but I do think that taking money from those platforms is a poisoned chalice.
Peter Yeung:
And to what extent do you think the situation has improved in the years you’ve been working? Of course there are always evolving problems to tackle, but have there at least been improvements in the landscape and a better acknowledgement of the issue at hand?
Imran Ahmed:
I was the first witness to give evidence to the British Parliament for the Online Safety Bill, in September 2021, and in October 2023 it became law. We’ve been instrumental in providing the data that drove the Digital Services Act. I’m giving evidence in Ottawa shortly for Canada’s new online safety laws. In the U.S., I’ve given evidence to Congress, and I’ve seen perceptions of what’s possible change radically over the last four years. I think there is progress being made, but my fear is that it’s not fast enough. In 2024, over 2 billion people around the world are going into elections. I’m old enough to remember when Francis Fukuyama wrote The End of History and said that liberal democracy had succeeded and that it was over. Well, now, in 2024, we find ourselves at a tipping point where we could tip into authoritarianism and a digitally led populism very, very easily. My fear is that before we’ve had the chance to fix this crisis, it will have overwhelmed us. And in that respect, it does remind me of the climate crisis: there is a window for action, and it’s closing.
Peter Yeung:
Just one last point then, Imran, to do with looking ahead to the future. How would you describe an ideal information ecosystem, and what exactly would it look like in terms of minimizing as much of that harm as possible?
Imran Ahmed:
I actually believe in free and open discourse, run to rules that allow every voice to be heard and fully actualized, and every individual to feel welcome to contribute. A safe space for abusers is an unsafe space for their victims, and for normal people who don’t want to be somewhere where people are screaming the N-word every two seconds. So I think it’s really important that we have open spaces for discourse, but that does mean a rules-based order for social media platforms, one that’s enforced, with sanctions attached if people violate the rules, and one that doesn’t amplify one side over the other, hate and disinformation over good information and tolerance. And I do believe in the virtues of humanity. I think that we all individually want a better world. And I think, for the most part, those of us who see that happening through tolerance, peace, science, and democracy outnumber those who don’t. So give us a level playing field and we can win this battle.
Peter Yeung:
Brilliant. Thank you so much for your time here, Imran.