
Skoll World Forum Interviews: Spotlight On Brazilian Social Innovators

September 3, 2024

Brazil’s democracy was tested in the lead-up to the 2022 presidential election. Since then, Brazilian organizations on the ground have continued working to strengthen democracy by empowering everyday people to run for office and reducing information disorder. These community-centered groups recognize democracies are built on participation and public awareness.

As social innovators prepare for the Brazilian Philanthropy Forum, we’d like to elevate two organizations that are protecting democracy by supporting information integrity and fostering leadership, respectively. Francisco Brito Cruz, Co-founder and Executive Director of InternetLab, and Bruna Barros, Executive Director of RenovaBR, shared the importance of their work along with their hopes for their nation’s government at every level. Watch highlights of their interviews, or listen to the complete podcast conversations, and read the transcripts below to learn more.

Bruna Barros is the current Executive Director of RenovaBR and has more than 15 years of experience in consulting and auditing firms, the public sector, and the third sector. She holds a BA in Accounting from FECAP, an MBA in International Business from BI International, two executive specializations in public policy (SIPA/Columbia and SAIS/Johns Hopkins), and a master’s degree in Public Policy from Insper (2023). She served as Special Advisor on Competitiveness, Productivity and Entrepreneurship and as Undersecretary of Entrepreneurship and Microcredit of the State of São Paulo. She also sits on the fiscal boards of Legisla Brasil and Grupo Mulheres do Brasil, is a fellow of political and leadership organizations such as Movimento Acredito, RAPS, RenovaBR, and ProLíder of Instituto Four, and is a member of the Public Policy Committee of Grupo Mulheres do Brasil. Her purpose is to help strengthen democracy and the public policies that matter to society.

Francisco Brito Cruz is the Co-founder and Executive Director at InternetLab, a Brazilian think tank focused on human rights and technology. Over the past decade, the organization has been actively engaged in research and dialogue to shape transformative initiatives. Francisco holds a PhD, a master’s, and a law degree from the University of São Paulo School of Law (FDUSP). During his time at FDUSP, he founded and coordinated the Hub for Law, Internet, and Society at the University of São Paulo (NDIS-USP) from 2012 to 2014 and 2016 to 2019. Additionally, Francisco served as a visiting student researcher at the Center for the Study of Law and Society at the University of California – Berkeley in 2013. As a prominent legal scholar and policy thinker in Brazil and Latin America, Francisco specializes in platform regulation, election integrity, the internet, and trust and safety.

Bruna Barros

Francisco Brito Cruz

 

Transcript of Bruna Barros Interview


Welcome to Role Models for Change, a series of conversations with social entrepreneurs and other innovators working on the front lines of some of the world’s most pressing problems.

Bruna Barros (00:12):
What I do: I’m a director of a school of leadership for political and public management in Brazil. I’m a Black woman from Brazil. The introduction starts way before what I do now. Ever since I was a child I liked politics and the economy, and I was always watching the news. But I turned out to be an accountant and an auditor and all that. I started in the private sector, and then something came up for me, in 2016 probably. It was a turbulent moment for Brazil, because of the impeachment and all of those things, and I started to participate more actively in a few political movements. One of those brought me to RenovaBR as a fellow, so I’m also a RenovaBR alumna, from 2018, as well as of other leadership programs. And with this I had a shift in my career: I left the private sector and went into government. I wasn’t elected, but now I’m supporting those who are actually trying to build a better society, who have ethics and qualifications and the drive to change things for the better. So I’m grateful to be in this position.

Matthew Beighley (01:52):
Well, can you just start off by telling me what’s the problem that you’re working on? And very broadly, imagine I know nothing about it.

Bruna Barros (02:00):
Wow, it is a big problem. Society as a whole, globally but in Brazil as well, has been pivoting toward less democratic views. I feel people want to go for the easy way. People have been fed misinformation and all of those things that are harming what is supposed to be the society of the future. So people don’t trust politicians, and in Brazil it’s no different; in a way, they never actually did trust politicians. We are a young democracy; we have had military coups and all of those things. So it comes back to: how can I make society see that this is a problem and that we actually have a solution? That is part of my job.

Matthew Beighley (03:09):
What are the consequences of not having a healthy democracy?

Bruna Barros (03:14):
The consequence of not being able to strengthen democracy is, well, the opposite, actually: to have authoritarian governments, to have your voice unheard, when people actually want a plurality of voices, or gender diversity and all of that. This has been said for a long time, but in the last 80 years, probably, a few governments have just been talking about maintaining power. It is not about making society better, just about maintaining power. So this is why we have to fight.

Matthew Beighley (04:09):
So perfect way to jump into your solution. What is the work that you do? What’s the idea that you came up with?

Bruna Barros (04:20):
The idea of RenovaBR is to take ordinary citizens, but people who actually have the potential to be leaders, to be influential in a way that makes society pay attention to their work and to what needs to change. So we teach skills: how to articulate, how to negotiate, how to do these things, but for the good of society. You have people from education, the environment, mobility, and we help them achieve their dream, and their dream is to be a force for positive change, for better cities, a better environment, whatever it may be.

Matthew Beighley (05:16):
How do people come to you and how do you work with them?

Bruna Barros (05:18):
Oh, oh.

Matthew Beighley (05:20):
Just again, if I don’t know anything about it.

Bruna Barros (05:21):
Okay, so at RenovaBR, we select people. It is a huge process, but an easy one, through WhatsApp, so it is very easy to reach us, and most of the process is online. Then there is an evaluation. How do I say it? There’s a board that evaluates you, as if you were in the city council. We ask all of those questions about laws and views and what your position would be on a given issue, and people come up with different hypotheses and different perspectives, so it is very interesting. We evaluate all of those things: how the person takes a position, how she can support her statement. Then we select those people and spread them across different cohorts, and they take very extensive classes on ethics, transparency, the functions of the state, and so on, with a special focus on the campaign aspects.

(06:46):
Campaigning is a very hard thing to do. Not everybody knows how to do it, knows that you have to raise funds, that you have to have a mobilization team, a team that runs smoothly so that you have better communication and everything oriented toward the problem. And you also have public policy aspects. So it is a very extensive program, with many experts from Brazil and from outside Brazil who come and say, “Hey, have you actually thought about that perspective? We came up with data. You have to have evidence to support the statement you’re making.” And there is also what happens after you get elected: we have a continuing program to improve your productivity and your management. So it is very complex, but geared toward a better understanding of a campaign and of how to do the job better.

Matthew Beighley (08:05):
What are the obstacles that you have day in and day out or big ones, philosophically?

Bruna Barros (08:12):
The problems are not on the operations side, but mainly with society, because I have a huge goal. We are doubling our bets: we are graduating almost 3,000 students this year. And the main goal is to have those students perceived by society as quality candidates who can actually improve their cities. So the question is: how can I bring them to the voters?

Matthew Beighley (09:01):
What’s your vision? What’s the hope that you have for Brazil?

Bruna Barros (09:07):
So much.

Matthew Beighley (09:07):
Again, my voice won’t be in it.

Bruna Barros (09:11):
My vision for Brazil is a more representative, more diverse, more plural society that can actually believe it’s possible to do better.

Transcript of Francisco Brito Cruz Interview


Welcome to Role Models for Change, a series of conversations with social entrepreneurs and other innovators working on the front lines of some of the world’s most pressing problems.

Peter Yeung (interviewer):
Well, thank you so much, Francisco, for joining us today. I’m really interested to talk about your work and we’ll get into that. But I suppose to begin with, can you just introduce yourself and tell us a bit about the work that you do?

Francisco Brito Cruz:
I’m Francisco Brito Cruz. You can call me Chico. I’m Brazilian. I live in São Paulo and I’m the founder and executive director of InternetLab. InternetLab is a Brazilian organization. We are focused on human rights and technology. So basically what we do is try to bring accountability to stakeholders in this area of technology. We work, for example, on disinformation, on surveillance, on political violence online. So these topics, we investigate them, we discover what to ask companies, what to ask governments, and we structure policy work. So we go after them to produce an agenda and to produce change in these areas.

Peter Yeung (interviewer):
So when you touch on some of those points to do with surveillance and some of those infringements of our rights, I suppose, can you go into a bit more detail about what those problems currently are that internet users across the world, I suppose, can be suffering from, and how widespread is that problem?

Francisco Brito Cruz:
Well, our life now is totally digital. There is no longer a divide between what is offline and what is online. Everything is online, so everything is intermediated, or mediated, by technology. As we live our lives, we exercise our rights, and today we exercise those rights online. So we can be surveilled in everything we do. We need to express ourselves, and we need platforms to do that. And those platforms can be safe or not, can trigger the worst emotions we have or the best emotions we have, can connect us to false narratives, can connect us with hate, can make us afraid to say things, and at the same time can make us participate and be part of what we now consider democracy in this online world. So it’s everywhere.

And of course in realities like Brazil, the investment in the safety of these tools, for example, is not as good as it could be, as we’ve seen in Europe or in the US. So there’s another layer of inequality, and maybe of omission, regarding those technologies. So what we do is try to understand: how are people using these tools? What problems are arising from this use? What could be triggering these problems? Is there a technology choice behind it? Is there any policy or regulation that is making companies act that way or not?
And we try to raise this, to bring it into the conversation, and to push those stakeholders toward a stance that is more protective of human rights.

Peter Yeung (interviewer):
And when we’re speaking about platforms, I suppose in general you’re referencing social media and things like Facebook or Instagram. Can you speak a bit more about what exactly you mean?

Francisco Brito Cruz:
Yeah. I’m speaking about social media platforms, but in Brazil, one thing you cannot avoid is messaging apps, for example. Something like 99.9% of Brazilians who are on the internet have a messaging app like WhatsApp and are very active on it. So I’m speaking about them too. At InternetLab, we look at the use of technology in schools, for example. We look at technology in welfare programs, where the question is whether people should receive some kind of benefit or not, and maybe the government is using artificial intelligence to decide that.

Peter Yeung (interviewer):
So things like algorithms and-

Francisco Brito Cruz:
Exactly. So talking about technology covers quite a broad range of possibilities. But of course, when we’re talking about disinformation, hate speech, expression online, and not only expression and participation but the violence that can prevent participation, I’m mostly talking about what’s happening on content platforms: social media, messaging apps, etc.

Peter Yeung (interviewer):
And so can you, I suppose, take me through the various components of the work of InternetLab. I suppose, do you have researchers looking at specific topics when there are elections coming up or…

Francisco Brito Cruz:
Yeah. For example, a flagship project we have is our observatory on political violence, especially against women. What we do is that, when an election is coming, we and our partners structure a monitoring observatory to understand which cases we are going to see: who is going to be attacked, and why, and how. We try to create a structure to differentiate what is an attack, what is violence, and what is, for example, a criticism of a woman.

And we’ve done this twice now, and we discovered that, of course, women are attacked because they are women, not because they are politicians. And not only women, but Black people in Brazil, LGBTQ communities. So the intersections are right there, and we produce evidence about them. In 2020, for example, we produced the first set of evidence, and then we wrapped it all up in a big report after raising these issues throughout the election process.

And in the end we were able to present this to Congress, and in 2021 Congress passed a political violence law, an act making this kind of violence an electoral crime, basically trying to prevent it from happening. So it’s a simple example of how we made the problem visible to policymakers. This triggered a lot of conversations within the private and public sectors, and I think people today have the perception that this is a problem, that we need to dialogue about solutions, and that everyone needs to be part of this conversation.

Peter Yeung (interviewer):
Yeah. Well, I mean, congratulations. That’s an amazing sort of impact; I suppose your work has produced a real change in law. And that is one point I wanted to touch on: how exactly do you go about measuring the success of what you’re doing? Obviously, there’s so much information out there the whole time, and obviously you can’t track everything.

Francisco Brito Cruz:
This is the big shiny case. We looked at the political environment and the disposition of the pieces on the board that allowed us to be in the position of bringing the evidence that led to the approval of a law. It does not happen this way all the time. Sometimes success can be preventing something bad from happening, and sometimes success can be adding an extra phrase to something that was going to be super bad. And when we’re talking about platforms, for example, we noticed that platforms don’t like to give you the receipt showing that you got what you asked for.

And I understand that. They’re companies. They’re afraid of showing that they conceded, or that they’re responding to something, and then entering a cycle in which they will always be facing demands and will no longer be able to get a grip on the conversation. So they will never tell you that you got this because you asked for it. And you need to be smart about that. You need to understand, for example, that when we are directly having a conversation with a big tech company, they will take in your information; probably there is a dispute going on inside that huge organization. What you’re doing is pushing them and putting some tools out there, and you’re not going to receive the end result with the clarity that you might want.

Peter Yeung (interviewer):
That’s the point I wanted to touch on, just in terms of how those tech companies tend to react. I personally have done some reporting on tech companies too, and I have one example in mind where Google was very obstructive in trying to prevent me from reporting a story. And I just wonder how difficult, and I suppose how cooperative, have they tended to be with the work that you’re doing?

Francisco Brito Cruz:
We are 10 years old now, so our conversation with those companies is now a decade long, and it’s different for every subject, every topic. Sometimes we are discussing things where we are in a similar position in terms of policy: for example, when a government is pushing companies to deliver data that they don’t want to deliver without a court order, without due process. A number of times we are defending that stance, and at other times we are basically pushing the companies and demanding more information, more transparency, more action from them.

I think that in our field we are not the frontliners; we are considered more like diplomats. And I think that has put us, in recent years, in a position where at least we are heard. Of course, we are only heard because there are frontliners as well. So we are part of the ecosystem and of civil society in that sense. And Brazilian civil society is super vibrant; it’s incredible. There are a number of organizations, we are part of a coalition, and each one plays a different role.

But it depends on the subject. We can be met with a lot of silence sometimes, but we have learned a lot about how to do this. For example, there are disputes inside companies, between teams, between the Brazilian branch and the headquarters in the US, for example. You have to know how to play with this, because they are playing with you: teams are playing with you, companies are playing with you. If you enter that conversation without being naive, and try to push the buttons and pull the levers and move your agenda forward, I think you can learn things like that.

Peter Yeung (interviewer):
Yeah, that’s a very interesting point. I suppose we often just tend to think of these massive companies as places where it’s difficult to see the personalities and real people within them, but actually, as you say, there can be differences of opinion. But I suppose can you give a bit more detail on what that looks like, in terms of how they perceive or believe certain policies should be?

Francisco Brito Cruz:
Of course. Speaking about the Brazilian 2022 elections, a very sensitive topic, because we had the most tense elections in our recent history. We were discussing how different platforms should apply or implement their terms of service regarding content that was being posted there. And I’m talking specifically about content against the electoral system, not against a candidate, but content basically saying that the authorities managing the election were rigged or were fraudsters, which was false.

And in our opinion, this could feed a very problematic and extremist movement, and as we saw during the process, it grew a lot. On January 8th, this movement came into force and invaded lots of government buildings. So it was real. And in our conversations with companies, we saw lots of rules that had been created for the American elections, the US elections, such as rules regarding conspiracy theories like the QAnon thing, basically trying to deliver something to US civil society and the US discussion.

And the way the rule was written was pretty narrowly delimited, designed only to be implemented in the US context. Well, we were dealing with conspiracies in our reality, so why couldn’t this wording be applied to the Brazilian context? We were dealing with an election. We were dealing with conspiracies. These conspiracies can cause harm. YouTube could not use this policy to take action against content that was clearly creating a movement against democracy and could lead us to an attempted coup.
We saw this interplay between the US part of YouTube and the Brazilian part of YouTube. In the end, I think the process was really not perfect; the electoral court really had to act a lot. But we saw that if civil society hadn’t been there, as part of a bigger effort, things would have been even worse than they were during the process. So we were happy to be there, in that position of demanding more and more cohesion.

Peter Yeung (interviewer):
Yeah, exactly. I just want to ask, I suppose, obviously it’s important to not be reductive about this, but you were saying the links between US and Brazil, obviously Bolsonaro the former president, I think he was given the nickname at some point, the Trump of the tropics. And I just wondered to what extent, obviously these are, as you say, shared issues that we can find in countries across the world, and to what extent do we also need to recognize the different contexts of them?

Francisco Brito Cruz:
When Bolsonaro won in Brazil, we got a lot of questions about whether we thought the internet had done that. And I never liked this formulation, because I think that Brazil did that, and every country can do something like that. Technology is helping, and it’s helping a lot. The way digital communications shattered the way we used to communicate, and opened lots of breaches to create cracks in the legitimacy of journalism and of the institutions that were previously there, created lots of opportunities for the use of disinformation and, of course, for authoritarians like Bolsonaro to rise. But I think the emergence has a lot to do with what is happening in the country in terms of anxiety.

The internet does not function without people, and people have feelings, people have anxiety, people want to solve their problems. So in my opinion, this can happen anywhere, because everywhere people are using these tools, and whether or not we have guardrails, whether or not we invest in trust and safety, whether or not we have people looking out for human rights and understanding how those tools are going to be weaponized by these narratives, makes it easier or harder.

In terms of parallels, it’s funny how the presence of a character like that, the presence of Bolsonaro in the Brazilian context, really changed the way civil society worked in Brazil. When I look at colleagues from other countries that didn’t have that experience, it’s as if we were from the same species but evolved differently in different contexts: I evolved on an island with a big predator, and they didn’t.

But I mean, it’s everywhere, right? Because globalization, inequality, a world of poly-crisis and conflict, it’s everywhere. And those things generate anger, generate blaming people, blaming different people, for example, generate violence. So you have to apply some effort and try to produce solutions, democratic solutions.

Peter Yeung (interviewer):
And I suppose, could you just speak a bit about how all of these issues might play out day to day? Because for some listeners, or just standard internet users who read about some of these electoral events, it may feel like there is some kind of distance from what’s going on. I just wonder, from your point of view, how are their day-to-day lives affected by all of this, from disinformation to data policies?

Francisco Brito Cruz:
Of course, you can be totally alienated from what’s happening in politics in your country, but these things matter. If someone in your family is an LGBTQ+ person and wants to see his, her, or their identity represented in politics, that person will have a lot more problems participating in the public sphere, engaging, and expressing themselves, because they suffer a lot of violence while they’re doing that. And not because of policy ideas; it’s because of who they are.

And this can, for example, take away an example, an inspiration, the idea that you can be what you want to be. So it’s a tiny example, but at the same time, when we’re talking about freedom of expression: what is freedom of expression in an environment where some people receive violence and aggression about who they are when they’re expressing themselves, and other people don’t?
So it can be things like that. At the same time, when we enter the world of data usage, companies and governments are making decisions about you based on the data they have about you, and you might not consider that: decisions about insurance, about social benefits, about health, about education, about your safety, about the possibility of you being arrested or not, about something you might or might not have done.

When they do that, you want this data to be treated and processed in a way that is protective of you as the holder of the data, the owner of the data. So it can happen everywhere in your life; our lives are mediated by those technologies. Of course, a person can say, “I just use this technology and I don’t do anything wrong.” Well, until someone with power says you do, and when that happens, you might want due process to be respected.

Peter Yeung (interviewer):
Well, no, as you say, it is very much widespread in almost every aspect of people’s lives.

Francisco Brito Cruz:
Yeah. Everyone wants to be free to say what they think. Everyone wants to feel safe when they participate in democracy. Everyone might want to have the right to do that. When we transition to a more digital society, this means that we need to demand some kind of responsibility from these intermediaries. And that’s our work: the work of constantly demanding that they implement technical solutions that are reasonable and that adhere to human rights, because technology does not come from the sky.

It’s not being created by a divine entity that has no flaws and is not selecting some things to the detriment of others. Technology is created by humans, with flaws and problems and everything. So they might be wrong, and they know they might be wrong.

Peter Yeung (interviewer):
I just wonder, again, looking ahead into the future. I mean, even this morning over breakfast, we were talking about the impact of AI on certain industries, and I suppose things like the development of deepfakes. To what extent is that problem ever-evolving for you, with new challenges coming up?

Francisco Brito Cruz:
Yeah, every day we see something different in this field; it’s a fun part of working on this. The use of AI in elections, for example, is an obvious one, and we are trying to help the Brazilian authorities develop rules for this. For example, should campaigns label content that is made by a generative AI tool? I believe yes. But I also believe that some content should be banned, should be forbidden from being produced, especially when the content is too manipulative, when it’s pushing too hard in terms of the lie or the false narrative that is in there.

And it’s difficult to strike the balance in all cases at the same time. You need to create some caveats; we need to decide what is permitted or not. In elections, campaigns have what we call side A and side B. At least in Brazil, side A is the one that appears to you, the one that is illuminated, in a sense. And this is the one that will probably label the content, if a rule about labeling is established.

But we have side B, which is the more obscure side of the campaign, the operatives, the ones who are trying to avoid enforcement and the control of the authorities. And it’s very difficult to prevent them from using the technology, because the technology is there, and even if killing someone is a crime, you cannot say that people don’t kill other people; homicides happen, for example.
So I think that for this kind of case, you need guardrails that can hold people accountable for the problems they are generating. And you need to work with the design of platforms and of distribution, to understand how those intermediaries can or cannot make it easy to distribute content like that. And we need to work with users too: how do we provide tools for them to understand what is manipulated or not, what is synthetic or not?

So it’s another example, actually, of how you need an agenda for dealing with things like that. Because in the end, one or two days before an election, someone on the side B of a campaign can generate a very realistic audio or video and really create an episode, a tipping point, for this or that campaign. And you need to be prepared beforehand. You’re not going to be… As we say, there’s no point crying over the milk that is spilled on the floor.

Peter Yeung (interviewer):
Right. And I suppose you were touching on, well, as we say in English, opening a can of worms, when you said that in some cases it might mean censoring or preventing the publication of certain content. And obviously it’s very tricky to find where the line is. I suppose I wanted to ask about some of the difficulties and challenges of your work, and presumably in this case there’s also a strain on resources; there’s only so much that can be done by your team.

And, I suppose, as the companies would argue, by theirs as well, when it comes to content moderation. Can you talk to some of those points: what are the most difficult challenges of your work?

Francisco Brito Cruz:
Striking a balance between rights is maybe one of the biggest challenges. For example, when we talk about liability for content: who is going to pay damages if a piece of content produces those damages, the person who posted it or the platform that hosted it? Many people in the ecosystem propose easy solutions for that, saying that it should actually be the platform, because if it’s the platform, platforms would be incentivized to act on content that is harmful.

But for us, this is not always the solution, because if you put platforms in this position, what they consider may not be exactly whether the content is harmful or not. The calculation would be, for example, how the judiciary is going to judge that, how big the risk is of losing that case, or the cost of administering the lawsuit.

That can lead to an incentive to take down legitimate content, for example, and this is not our goal. We don’t want platforms to take down legitimate content. We want platforms to make thoughtful decisions, applying resources to safety systems and content moderation systems that are adapted to context, that can provide reasonable decisions about the rules they have, and that justify those decisions to society.

So depending on where you land on what the solution and the incentive are, you can think you have a good idea but actually have a very bad one, because you’re basically going to incentivize companies to hire a bunch of lawyers to make risk assessments on this or that content, rather than to think about what exactly is the content you want taken down. And that money could go, for example, to training the detection of violative content in Portuguese, and instead you’re paying a lawyer with it.

So striking this balance is key for us, because when you miss, you’re not only possibly harming a human right, which is itself a bad thing, but you’re also pulling resources away from an important solution that is needed too. But yeah, it’s difficult. Sometimes, for example, you have a very serious crime that needs to be investigated, and the investigators come up with a solution, an alternative for the investigation, that sets a super problematic precedent for surveillance.

For example, finding out everyone who searched X or Y on Google and getting the data from everyone who did that. This can help an investigation, of course; maybe it could help. But the precedent you’re establishing is that in any investigation that people think is grave, is important, they will try to use it again. And at some point it gets used in an investigation that you don’t think is so important. And this is the way human rights are violated, one step after another. It can happen in a process; it can slowly evolve.

Peter Yeung (interviewer):
I just wanted to ask about one more point, because it seems to me that what you are trying to defend and protect here are the very basic civil liberties and freedoms, basically what we understand to be our most essential rights. It’s just that this is within the digital realm, and for a number of reasons, perhaps that’s less safeguarded than in the physical world and within our normal institutions. And I just suppose, could you speak to how vital you see this struggle and, I suppose, the objectives of your work?

Francisco Brito Cruz:
I like to say that bringing regulation to social media, or accountability for human rights in these sectors, is a generational task. We need to do that as a generation. I mean, not my generation, but generation in a broader sense, because these are frameworks that are going to impact lots of people in the future, and we are trying to create them as the technology is evolving.

So being able to do things, to experiment, to not be paralyzed by the risk of getting things wrong, but being open to changing the idea, changing your mind, and adapting: it’s part of the conversation. Our lives became this continuum between what is online and offline. You’re constantly here or there, and suddenly you pull your phone from your pocket and now you’re online, or you’re listening to something in your ears, or you’re wearing something.

You cannot put the toothpaste back in the tube. So if we’re going to live with that, we need to understand what we are going to demand as a society. What are our limits? What do we think is legitimate or not? And the thing is, I don’t like easy solutions, but I think we need to be attentive to the problems that people are seeing. Not everyone has the magic bullet; I think no one has. But that does not mean the concerns people have about technology are the wrong ones to deal with.

Peter Yeung (interviewer):
Well, brilliant. Thank you so much for your time then, Francisco.

Francisco Brito Cruz:
Thank you. It was a pleasure.
