Fn 20: How To Save the 2020 Election: Stopping Fake News

"...disinformation is a threat to everything that we believe in. Because, whether we are campaigning to stop climate change, whether we are fighting for LGBTQ rights, whether we're campaigning for peace in the Middle East, disinformation is creating chaos in all of our political systems." -- Fadi Quran

Are social networks downplaying their complicity in the problem that is "fake news"?

Anil talks to Fadi Quran of the people-powered social advocacy group Avaaz about how tech is used to target groups of people and spread disinformation that affects our elections, relationships, and social justice movements. Together, they discuss the insidious nature of disinformation and misinformation, meet its victims, and go over solutions.

Listen closely for the steps that platforms can take right now to stem the tide of fake news and fake accounts.

Other Links

Big thanks to LinkedIn for supporting the second season of Function.

Function | About


Transcript

Anil Dash: Welcome to Function. I'm Anil Dash. Trust, it is one of those elusive things that is easy to lose but really hard to get back. All season long, we've been exploring what trust means when it comes to the internet, technology and social media. For a lot of people, the 2016 US election represents the moment when they lost all their trust in social media. Whether it was attention hacking or foreign interference or the way our data were being used against us, you literally could not trust the information that was being presented to you in your timeline. The aspects of the internet that were supposed to help us connect to each other and share information together were used to undermine the election, and to undermine democracy itself. But if you fast-forward three years, most of us are still on Facebook. We're still on Twitter. We're still on all these networks, and that's despite the fact that all of them have gotten worse. So, we've got to think about why. The bad actors, the people who want to manipulate these networks, they've gotten better at exploiting the weaknesses of social media.

Anil Dash: They know how to exploit the algorithms, and they sure as hell know how to manipulate the social media companies themselves. Because the truth is, these companies are so afraid of accountability that they would rather let themselves be party to rigging elections and to skewing votes than hold their own feet to the fire and say, "We've got to make some changes." So, as we head into the 2020 election, guess what? Disinformation and misinformation, they haven't been solved. They're at an all-time high. But, if you've been listening to this season of Function, you know I'm interested in how we fix it. How do we rebuild trust on the internet? Because it's not like we can log off. It's not like we can sign out. The internet isn't going anywhere. For this last episode of Function this season, I reached out to Avaaz. Avaaz is a really interesting nonprofit, and that's because they focus on solutions.

Anil Dash: They are trying to build that world that we all say we want. Part of that world we all want is an internet that doesn't spy on us, and doesn't lie to us, and sure as hell doesn't undermine our elections. But I wasn't sure how we get there. I talked to Fadi Quran. Fadi's a campaign director at Avaaz. He discussed the recent report that they published, which covers the spread of disinformation all around the world. The core question that we dug into is, how do we make sure that 2020 doesn't look like 2016? Fadi, thank you for joining us.

Fadi Quran: Thank you. It's a pleasure to be on the show.

Anil Dash: Some people might not be familiar with Avaaz. I want to start there, with what the organization is and what work you do, so people have some context.

Fadi Quran: Avaaz is a global civic movement with over 53 million subscribers, and it was established 10 years ago with this idea that the internet is a place where people can connect and create the world the majority of us wish for, because we believe humanity, all of us, shares a basic set of values. And the organization campaigns in an extremely democratic way. We ask our members, what are the issues or campaigns that they care about? If our members have an idea, we test it, we poll it, and the ideas that our members care about the most end up being the campaigns we run. Our members had the wisdom, about two years ago, after Brexit and after the US elections, to say disinformation is becoming a threat to everything that we believe in, because whether we are campaigning to stop climate change, whether we are campaigning for LGBT rights, whether we're campaigning for peace in the Middle East, disinformation is creating chaos in all of our political systems. It's making deliberative democracy almost impossible. So, we decided to focus on this and make it one of our priorities in terms of solving this problem.

Anil Dash: In the report, you outline exactly the ways that disinformation spreads, and then why it's so hard to get out of somebody's head once it's in there. Can you explain a little bit more about that?

Fadi Quran: Imagine your mother or grandmother, and she's sitting on Facebook. She's looking through pictures. She sees you at a party. She's enjoying it. Then she sees a story, a story that's been made viral by a network. Oftentimes, this network is made up of coordinated, inauthentic actors, so they're not even real people. They're just a bunch of pages, run by maybe the same person. The way the Facebook algorithm works is, if something gets shared fast, if 10 people share something at the same time, the algorithm says, "Oh, this is hot," and it brings it up to the top of your newsfeed. So now, maybe one of your mother's friends is following one of the pages that spread this fake news, and they see it being shared, so they share it. Facebook will put it at the top of your mother's newsfeed.

Fadi Quran: It will be one of the first things she sees scrolling through. Now, she sees that story, and then she sees another story, again about Nancy Pelosi. One of the stories we found said that her son also worked in Ukraine. Then there was a third story that said she had asked the Ukrainian government to investigate XYZ. So, suddenly your mother or grandmother, who's just on Facebook to see her family, has seen five horrible, viral stories with thousands of likes saying these things about Nancy Pelosi. Now in politics, what we're taught in any political campaign is that the average person can hold up to three ideas about a certain politician. Maybe they'll keep the idea that Nancy Pelosi is a woman with experience. Nancy Pelosi is from California. And Nancy Pelosi ... the third will be that she's corrupt, because they've seen all these fake news stories about her.

Fadi Quran: If you can just get one bad piece of information about a politician to stick in people's heads, then you've destroyed, or you've decreased, the amount of support that politician gets. One thing I want to emphasize here is that disinformation is not just about making you like or hate a certain politician. It's targeted voter suppression. So particularly if you're from a minority, let's say you're African-American for example, they will target you with information that seeks to make you feel frustrated, feel like your voice doesn't matter, feel like your society's broken, with the purpose of making you say, "I'm not going to go vote. It doesn't matter who I vote for." That's a type of voter suppression that's specifically targeted at minorities in places like Florida and Michigan, states that are, let's say, swing states.

Anil Dash: Looking forward at the 2020 election, does it look like there's going to be a repeat of what we saw in 2016?

Fadi Quran: 2020 will probably be worse, because there are new disinformation tactics being used by the Russians, but there are also now local actors who are using these same tactics. To put our findings in perspective: three to six months ahead of the 2016 elections, the top 20 disinformation stories, and these numbers are approximate, had about 2.5 million interactions, so likes and shares. Now we're still a year ahead of the 2020 elections, and we found about 1.5 times more interactions on the top 20 stories this year. This means, if this trend continues, and in 2016 the trend was an exponential curve, it went up very steeply as we got closer to the elections, well, if the trend we're seeing now goes up the same way we saw in 2016, you'll probably have five times as much disinformation in 2020 as in 2016. Unless the platforms act, unless the platforms implement solutions like Correct the Record, these elections may be decided by lies.
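To make the scale of that comparison concrete, here is a minimal back-of-the-envelope sketch of the projection Fadi describes. The figures are the approximate ones quoted in the conversation (about 2.5 million interactions on the top 20 stories a few months before the 2016 vote, and roughly 1.5 times that a full year before the 2020 vote), and the five-times multiplier is his rough estimate assuming the same steep pre-election climb. None of this is from Avaaz's underlying data; it is only an illustration of the arithmetic.

```python
# A rough sketch of the projection described in the interview.
# All numbers are the approximate figures quoted in the conversation,
# not precise values from the Avaaz report.

interactions_2016 = 2_500_000                 # top 20 stories, ~3-6 months before the 2016 election
interactions_2019 = 1.5 * interactions_2016   # top 20 stories, ~a year before the 2020 election

# Fadi's premise: interactions climbed steeply as the 2016 election approached.
# If the 2020 cycle follows a similar curve from a starting point already
# 1.5x higher, the end result lands several times above the 2016 total.
assumed_multiplier = 5                        # the "five times as much" rough estimate
projected_2020 = assumed_multiplier * interactions_2016

print(f"2016, months out: {interactions_2016:>12,.0f} interactions")
print(f"2019, a year out: {interactions_2019:>12,.0f} interactions")
print(f"2020, projected:  {projected_2020:>12,.0f} interactions")
```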

Anil Dash: Avaaz has decided to take on this challenge of disinformation, and one of the big pieces of work you've done is a recent report really tackling the issue. But here's what was most extraordinary to me. I think our listeners are aware that there's misinformation on social networks. They're probably pretty fluent in that and can say, "Yeah, there's something wrong with the algorithms. There's something wrong with what gets promoted." What jumps out to me is, you get about two thirds of the way through your report and, all of a sudden, you start to talk about solutions. What could the big social platforms do? What could the big technology companies do? I feel like this is a thing we almost never hear. There's a lot of hand-wringing, saying it's all broken and Facebook is terrible or, you know, whatever, the Google algorithm, the YouTube algorithm is bad. But very seldom do I see someone, especially an organization that has so much broad support around the world, say there are things we can do about it.

Fadi Quran: When we began, it was very tough to find strong, like you were saying, strong policy recommendations for these platforms to act on. So, we did a massive tour. I spoke with Macedonian young men and women who were creating fake news during the 2016 elections, and I discussed with them how they were doing it. I spoke with Facebook executives, and executives at other platforms. We met with NATO. We met with victims of disinformation, including members of the Rohingya community. All of this research into solutions then focused on four specific steps that we think, if the platforms implement them, can severely reduce the threat and reach of disinformation. The first solution is Correct the Record, and what this means is, imagine in the US South, you had Russian malicious actors create Facebook pages for Black Lives Matter and pages for Blue Lives Matter.

Fadi Quran: Then you had them create events, and spread hundreds of pieces of fake news that polarized these communities, and had people go to the street, and in some cases get in scuffles. Now Facebook, when they found these groups, when they found this disinformation, they took down these pages. But the truth is, those hundreds of thousands of people who were following those pages just never got informed. They didn't get a notification saying, "Sorry, the page you were following was run by Russia, and these pieces, these stories that you read were all fake stories. Here are some third-party fact-checkers." People never got that, and so what we're calling on all the platforms to do is be transparent. When there's a piece of disinformation spreading, and someone has read it or clicked on it, notify them. Facebook and other platforms are already working with fact-checkers.

Fadi Quran: Send them a correction saying, "PolitiFact or Snopes or AP have debunked the story, and this is the true story." If you do that, what we have found looking at the research, the academic research, is, number one, corrections work, and they work effectively. Number two, we're currently running a study with professors at Ohio State University, at George Washington, and a few other places, where we show people an exact replica of the Facebook newsfeed with disinformation. Then we show them what corrections could look like: another newsfeed, but with corrections. What we've found is that corrections can slash, by over 50%, in some cases by over 80%, the number of people who believe the disinformation. The other two solutions that I think deserve emphasis are what we call detoxing the algorithm, which means redesigning certain recommendation algorithms that the platforms have put forward so that disinformation actors or malicious actors are downgraded significantly, and more authoritative content is pushed upward. The platforms have begun to move in this direction in the last year, but still not effectively.

Fadi Quran: The last bit is, we think there must be transparency. The platforms need to be transparent about the number of active fake accounts, and how those fake accounts are functioning on their platforms, and how they have functioned. The platforms need to be transparent about certain design aspects of their algorithms. They need to be transparent about ... In our report, we say that just the top 100 fake news stories have reached over 158 million views in the US, but that number is the tip of the iceberg, and Facebook should tell us how many people were actually reached by these stories and other stories.

Anil Dash: You talk about reach and impact: 158 million impressions in the US alone of known misinformation being targeted, and as you said, that's the tip of the iceberg. One of the things I want to call out there is, like a lot of institutions defending themselves, Facebook, for example, will start by saying, "Oh, it's a problem, but it's small. It was a couple thousand, maybe it was 3 million." Then a couple months later, "Maybe it was 10 million." Then a couple months later, "Maybe it was 100 million," right? There's this creeping confession, but the first thing is almost a straight denial, and the response, the pushback to your initial identification of the issue, is to minimize it, diminish it, dismiss it. Then usually, in the fullness of time, it comes out that actually the problem was as serious as you said, or sometimes even worse. Have you seen that pattern play out with the activism and the advocacy that you've been doing?

Fadi Quran: Yeah, I mean, we've definitely seen that, and it kind of breaks our hearts, because I think the impression that a lot of us have of these big companies in Silicon Valley, or at least had, is that they're companies that wanted to do good, that were transparent, that wanted to serve the big causes that we care about, whether democracy or human connection or human rights. When they come out and put out a very well-designed public relations statement instead of just speaking authentically and truthfully to the size and scale of this issue, it hurts our trust in the companies. That's why polls show that trust in these big Silicon Valley companies, the trust in their brands, is continuously deteriorating, and it's a pattern. They always try to exaggerate what they're doing, and underplay the problems underneath that they still haven't dealt with.

Anil Dash: Yeah. It's funny, I hear such an earnestness to what you're saying, which is like, "I wanted to think that they were doing good. We were told they wanted to make the world better and that's why they made this tech." It's funny because I just identify with that so much because I spent time in Silicon Valley. I know people at all these companies. I sometimes know the founders, and I think I had that same impulse for a long time. I'm like, "But they want to do good, and they're nice people, and they told me it's not that bad a problem."

Anil Dash: And then, over and over and over they keep diminishing it. They keep sort of saying, "No, no, no, it's not as serious as we said." And yet, you've got the data and you've got the information, and you can show this is a problem that is incredibly serious. It's extraordinary to hear that still you have that desire, that impulse, to assume good intent and to think the best. But, after a while, don't we have to say, "This is a strategy." This isn't just, they keep making the same mistake over and over. This is an intentional, and so far pretty effective, attempt to downplay their complicity in this.

Fadi Quran: I think these companies are full of earnest people. Whether they're product designers or engineers, or data scientists or policy people, they are packed with wonderful people who want to see this issue solved as much as you and I. And I think the elephant in the room here is that I'm at least beginning to get a sense, and a lot of people working in this space are beginning to get a sense, that there's a huge divide between the leadership of these companies, who are risk-averse, who don't want to move fast to solve these problems, who may be afraid of certain political repercussions, and the hundreds or thousands of people who work for them, who did join these companies for the promise that you and I and others felt these companies had for humanity.

Fadi Quran: But, as leaders of these companies, they also need to realize it's not them against civil society, or them against government, or them against the people. By not solving these issues at scale and at the level that they know they should be solving them, they're also changing and poisoning the culture inside of their own organizations. And that's a danger to democracy that we care about, but it's also a danger to the sustainability of the companies themselves.

Anil Dash: That's extraordinary to hear. And I think it does match what we see bubbling up, whether it's the Google walkout, which I think was also catalyzed around a lot of labor issues, around harassment and things inside the company. But certainly a background to that, at all of these big tech companies, is that their employees are the most expensive resource they have and the thing they have the least control over, and it seems like accountability might come from the bottom up there.

Anil Dash: I want to return to one of those other threads that you talked about and what ... You advocate for ways to correct or try to solve some of these issues. One of the things you flag is ... When I think of it from a software standpoint, it's almost a feature in the product, which is go into the newsfeed, go into the alerts and tell people, "Hey, it turns out that thing you saw was false." And you literally are talking about that, right? You're talking about, in the same place in the app where we see whatever headlines people are sharing, we would see a correction, a flag, a notification.

Fadi Quran: We actually built this handy website. If you go to factbook.org, F A C T book dot org, you can see what this design would look like. You'll see a design that looks like Facebook today, and what Facebook would look like with corrections, and it's a simple addition.

Anil Dash: Okay, so we'll include a link to factbook in the show notes here so people can check it out. But one of the things that comes up when we talk about these kinds of solutions is people say, "Well, one, there are people that believe in conspiracy theories, and if a fact-checker says this is false, doesn't that make them double down?"

Fadi Quran: So, what we've found, going through the historical research and the behavioral and sociological research on this topic, is that in every society, in every group, probably at your Thanksgiving dinner, there are one or two people who are very strong believers. They believe a certain conspiracy very powerfully. And in society, that number is between 2% and 7% of people who will believe a conspiracy, like Elvis was abducted and now lives on Mars or something like that. Now, with that group of people, you're right. You do not want ... It will be very, very tough to convince them. If they get a correction, even if the correction came from Jesus Christ, they probably wouldn't believe it. So that group of people exists, but it's a minority. And what disinformation particularly focuses on is not that small group of people, who are often not extremely influential politically, but infecting the rest of society. And it's the same thing with anti-vaccination misinformation, for example.

Fadi Quran: Now, what corrections do is defend the rest of society, the 95% to 97% of society that don't have these strong conspiracy theory beliefs, when they get a correction. So, if someone's saying that vaccines cause autism, and someone's spreading that significantly on Facebook and so forth, they can spread that within their small group, but the rest of the population on Facebook won't believe it if they get corrections. And where we think the focus should be is on that broader population of society that disinformation and misinformation is trying to infect, not the minority in the corner. Of course, we think it's worth doing research about how to change and transform the beliefs of that minority, too. But if we want to talk about impacting the political discourse and polarizing society, it's not that group that you look at. You look at the broader sample of society.

Anil Dash: Right, so it's dissemination. It's what we call reach or amplification.

Fadi Quran: Exactly. And you isolate it. It's kind of like corrections create a quarantine that offers people more information. And the average person who sees this disinformation and then sees a correction will be like, "Oh, what I read is wrong," and they'll stop believing it. And so, you isolate that piece of bad information.

Anil Dash: So the platform companies, and I talk to them a lot, and they all tell me, "Oh we're doing all this. We're doing this work. We're starting to flag things. We're starting to reduce the reach of things that are known misinformation." And they keep insisting they're fixing a lot of this stuff. And then, at the same time, they'll be a little defensive. So they'll say, "Well, yeah, okay. If Macedonian teenagers made pages for Black Lives Matter and for Blue Lives Matter, isn't that okay?" Because aren't there real supporters of Black Lives Matter and aren't there people who are pro-police? Those exist, so why shouldn't teenagers in Eastern Europe be able to make such pages?

Fadi Quran: This, I feel, is a diversion tactic, right? Because nobody is claiming that these Eastern European teenagers don't have a right to create these pages. They have a right. Let them create these pages. The problem comes when these pages are used to spread disinformation with the purpose of malicious harm or for economic gain, because then the creation of these pages stops being about, "I want to share my opinion," or, "I want to bring people together in this group to talk about Blue Lives Matter." It becomes about manipulating and deceiving people for profit or for political goals. And that's what we're trying to stop.

Fadi Quran: And again, the solution, correcting the record, we don't say remove content, we don't say remove pages, because we do believe in freedom of expression and it can become very dangerous if pages are being removed or deleted. What we're saying is just add more information to the information ecosystem. When you see these types of tactics, when you see these types of pages spreading this disinformation and you add corrections, what will happen is, your average American man or woman who's following one of these pages, if they get a correction once, twice, three times that says, "The content on this page is XYZ or is disinformation, and here's a correction," they'll stop trusting that page. If they stop trusting that page, you remove the malicious incentives and you move people towards better content, more healthy content, without taking away the right of the Macedonian teenager to create a Facebook page. That should never happen.

Anil Dash: We'll have more with Fadi after the break. Welcome back to Function. I just want to give you a heads up. In this next part of the interview with Fadi, we're going to talk about some pretty graphic and fairly violent things that have happened to people online as a result of disinformation, so take a moment if you need it, but then please do listen.

Anil Dash: I want to get into your experience of doing this work a little bit. Right, so as we are talking right now, the report's been out about a week. And you've done similar reports in the past, but this is one that is having some impact. People are seeing it. People are sharing it. I see it on social media. I'm curious. From your standpoint, you are advocating what seem like pretty reasonable suggestions, right? This doesn't seem that wild. You're not saying burn it all down or destroy anything. You're like, "Could you add a couple of features and a little bit of information, and make a couple tweaks to the algorithm? That might have big impact without silencing anybody," like I said, "... without shutting anything down." And I'm curious. I'm sure it's got to feel good to see people sharing the report and talking to one another. What's your sense about whether this is going to have the impact you hope? Are you being heard? Do you get back-channel conversations from people at the big platforms saying, "We see you. We hear you"?

Fadi Quran: One of the most effective things that Avaaz has done on this front was not just the reports, but bringing survivors and victims of disinformation, members of, or a representative from, the Sandy Hook families, many of whom are in hiding because of the malicious disinformation spread against them, and people from the Rohingya community, and taking them to Silicon Valley to meet with the teams working on this, and the platform leaders working on this topic. And people had tears in their eyes, because a lot of them had never really interacted directly, physically, with the painful consequences of their platforms. And I think winning people's minds with these types of policy solutions and these reports has been key, but also bringing in the heart, bringing in the real, on-the-ground, tangible consequences, so that people can see and touch them, has really led the platforms, and particularly employees at the platforms, to become more passionate about solutions.

Anil Dash: Should it take that much? Should it have to go that far? Should people have to perform their pain for a product manager before this stuff changes?

Fadi Quran: No. I had tears in my eyes the first time we sat with these disinformation survivors, and they were telling their stories, because I just know how traumatic it is. Losing your six-year-old son and then not being able to live a normal life, or not feeling capable of protecting the rest of your family because you've come under more attack. And then, from hiding, having to come out and speak to a product manager, because everything you've done over the last five, six, seven years of sending messages, of reporting every single video on YouTube attacking you. None of it was impactful and-

Anil Dash: No, none of it amounted to anything until they met them.

Fadi Quran: Until they met them. This shouldn't be the case. The companies should invest in going ... Going down to Assam right now, or to the indigenous communities in the Amazon who are being just massacred and also massacred online, and they don't have Facebook. They can't report the attacks against them and the lies spreading about them.

Fadi Quran: And the same thing in other places like Nigeria or Uganda and so forth. Communities, minority communities, coming under attack. They can't report any of that. The platforms, when they're spreading in these countries, they should have their teams go and see the direct consequences. It shouldn't have to be Avaaz or any organization that brings these survivors to Silicon Valley. But, we also need to commend the bravery of a lot of the disinformation survivors-

Anil Dash: Oh, absolutely.

Fadi Quran: ... Who made this trip and who went into those meetings with their head raised high and said, "This happened to me and I'm here to make sure it doesn't happen to anyone else."

Anil Dash: That's such a powerful point, that associating a name and a face and a real story takes that cost of disinformation and makes it real and impossible to ignore. Are there other stories that you've heard from survivors of this kind of disinformation?

Fadi Quran: One of the stories that I remember is this young man, a high school student. His name is Ethan Lindenberger, and his story is heartbreaking because it's about his relationship with his mother. His mother has been targeted with misinformation about vaccines, anti-vaccination content. She deeply believes this anti-vax content, and she decided not to vaccinate him or his younger brother. Now, going to school, and reading and learning as a middle school and then high school student, he learned that vaccinations were important, that he had to take them. He wanted his siblings to take them to be safe. But he came under attack online from anti-vaccination disinformation actors, and they said that he was working for the government, that he was a spy, that he was paid. And it came to the point where his mother believed the lies that were being spread against him.

Fadi Quran: Him just telling that story. This young kid's super sweet. He volunteers at the church in his town. And it's destroyed, in many ways, the relationships within his family. And the bigger story is that now you have measles outbreaks because ... He sees it as a biological weapon. He sees what happened to his mother. But if you have a thousand mothers who believe the same thing in one city, then that city becomes prone to disease. So, that's one story.

Fadi Quran: Maybe I'll just mention another story, which is the story from the Rohingya community in Myanmar. And here we had a leader from the community, and he spoke about a woman at a refugee camp who had been raped and who had her husband slaughtered and lynched in front of her.

Fadi Quran: Because on Facebook, the government shared the picture of her husband, who was just a normal farmer, saying that this guy is a terrorist and he's planning a coup and an attack [inaudible 00:32:16], and the government used that, manipulated that story, made it go viral, and then all the villages around where that woman lived were just looking for her husband, to kill and lynch him. And so this woman was crying. She didn't have anything to live for anymore because of the disinformation that had been spread against her. And now she's a refugee. Her whole village was destroyed. She doesn't have anywhere to live. She's a refugee in Bangladesh.

Anil Dash: So, these stories are heartbreaking, and I think they leave us feeling gutted, but maybe even a little powerless. Is there something I can do as a user? If I'm on a social platform, how can I hold them accountable?

Fadi Quran: So there are three steps. The first, I would say, is contact your representative, and here I'm speaking particularly to the US audience. Whether it's the Senate or the House, those who are responsible for defending our democracy are not doing their job, particularly when it comes to these platforms. So the first thing I would say is contact your representatives and say, "We want Facebook, we want Twitter, we want Google to implement Correct the Record. And we want you, as our representative, to push them to do that," because the platforms do listen to these decision makers, largely because they fear strong and strict regulation coming their way. So, that's step number one.

Fadi Quran: Step number two is to communicate this to the platforms themselves. We'll share a link to our campaign, and you can join this campaign. You can become part of it. That way, when the right opportunity comes, we can send it to you, for example, a chance for you to reach out to Mark Zuckerberg and say, "We want XYZ to happen," or for you to reach out to Jack Dorsey. You can become part of this campaign, where we can help make sure that your voice goes directly to the leaders of these companies, and it's not a lone voice. It's a voice that's magnified by millions around it.

Fadi Quran: And the third thing is, and this is more broadly speaking, particularly to the audience of this podcast, we need to begin speaking with the big advertisers. So you have companies like Unilever, Procter & Gamble, Mars and others that together control a big chunk of the revenue, the ad revenue, that these platforms get. When these companies come together and say, as advertisers, "We want these platforms to be safe and responsible places, we want them to implement solutions like Correct the Record."

Fadi Quran: The leaders of these platforms have to listen because we're threatening their bottom line, we're threatening a big portion of their income. And for those in the industry, I'd say, when you go to marketing events, when you go to any type of event that includes these types of organizations, go up to them and tell them, as advertisers, we believe you have a responsibility. You should push platforms to move into X, Y, and Z direction.

Fadi Quran: Now, those are the kind of three things I would focus on in this time period. There's a bigger picture, which I think is a challenge to all of us, and that's: what is the social media that we want to create, that humanity deserves, and that will make humanity better? And there's a question, right? You know, we now have these companies that have their pros and a lot of cons. Is there a way for us to rethink social media? Not in a for-profit model, but in a way to say, how can we bring humanity closer together using the technology that exists today? And we may need a revolution on that front. We may need a new type of social media that is built to serve humanity, not built to serve profits. And that's an open question.

Anil Dash: Fadi, I love you leaving us with a big question, because you've done so much of the groundwork of really digging into a problem that seems almost intractably complicated. And, you have the moral authority of tens of millions of people standing behind the work Avaaz does around the world. And also, of course, the moral authority of being able to take victims and let them in some ways confront people who've been complicit in their harm. And I think that's an extraordinary moment. In reading the report and then seeing the things you all have shared over the years, there is almost an optimism to it. Like it really feels like despite you talking about some of the worst things that happen, well Avaaz broadly, right? Some of the worst things that happened in the world, right? Some of the biggest challenges we face as societies.

Anil Dash: But then in this particular case about disinformation and technology, you're talking about one of the biggest problems that the technology world faces. And yet what I see kind of threaded through everything in the way you do this activism, the way you share these reports and the way you galvanize action is a bit of an optimism that you think change can happen and that you can have a positive impact. And I'm curious, you know, as maybe a final note, do you really see impact happening? Do you think we can fix problems this big, this severe?

Fadi Quran: Oh definitely. I mean, the problems we face are huge, but they're not intractable. And what gives us hope and optimism is people like you and the team that's developing this show, for example, you try to bring more information, more knowledge, open debate to society. That's a beautiful thing. And there are many, many, many people around the world who want to see a better world for their children, for themselves, for their loved ones and for their communities. And that's the source of our optimism, this broad and interconnected network of humanity who, no matter what they face, no matter the difficulties thrown at them, decide to persevere to create the world that we all seek and we all want. And so these problems will be solved. The question that we have is, can we solve them fast enough to limit the amount of pain caused by them? But we will solve them and we will overcome as a people, not just as an organization, not just as Avaaz, not just as a global community, but as a broader interconnection of humanity.

Anil Dash: Fadi, thank you so much for joining us on Function.

Fadi Quran: Thank you.

Anil Dash: This season of Function has been really personal for me. I mean, we started talking about trust on the internet, and yeah, everybody cares about that, but what I've realized as we dove into it is that I really give a damn about this, because we've seen the consequences of what goes wrong when you can't trust technology. It's not abstract. It's not just about data and privacy. It's about people's lives being ruined. It's about really changing the world for the worse in big ways. We're talking about elections, we're talking about journalism, I'm talking about things that matter. And these consequences are felt by some of the most vulnerable people in the world.

Anil Dash: Honestly, it makes me furious. It makes me mad as hell because, one, I got into technology because I thought it was going to be a force for good, but two, I feel that burden. I feel that weight. I feel that responsibility. When I make technology, when I lead a technology company, that is my job: to think about those things. So why the hell aren't these other people doing that? Especially when they've got a lot more resources and a lot more reach. I see some of the most vulnerable people thinking deeply about these questions. I see activists doing the work, trying to hold technology companies accountable.

Anil Dash: But I don't see the leaders at the big companies really feeling the pain yet. And it comes back to one of the topics Fadi talked about in this episode, where we discussed the idea of whether people should feel uncomfortable at a cocktail party if they mention that they work at one of these social media companies. And that's a good place to start. I mean, yes, we need policy and regulation. Yes, we need to see Mark Zuckerberg testifying in front of Congress. Yes, we need all kinds of methods of accountability, and maybe some of it is deleting some apps off our phones.

Anil Dash: But I think a lot of it comes down to the personal level. We really, really have to think about how individual people can put pressure on those who are creating technology to say, "This matters and you have a choice and you need to be the ones to make a change."

Anil Dash: It's especially personal to me because I started talking about this stuff like 10 years ago, maybe more, and at the time I felt like I was honestly burning down my entire career in Silicon Valley. I left San Francisco with people literally saying to me, "You will never work in this town again." All for offering criticisms that now, well, everybody thinks are common sense. Yeah, of course it's a little creepy what these companies are doing. Yeah, of course they should be a little more accountable. And I'm not saying that like, "Oh, I was ahead of my time, I got this thing right." I think there were a lot of people that got it right. I think there were a lot of people that were waving red flags.

Anil Dash: But it took time for society to change. It took time for society to catch up and to see some of those risks and those harms. And lots and lots of people got hurt along the way. Lots of people. And they didn't need to. They didn't need to be hurt. It didn't have to be this way. So the urgency, the fight now, is because we can't let more people get hurt.

Anil Dash: And the funny thing is, even with all these stories we've heard about the harms that people face when confronting technology that they can't trust, even though those things get me down, every single one of these stories also has an element of hope for me. There's an optimism. Because I listened to the activists, I listened to the researchers, I listened to the experts, and they all talk about solutions. They all talk about things we can actually do. Especially that idea of holding technology accountable.

Anil Dash: I see increasing movements of people saying, "I want to be personally accountable for what I make and what I put out in the world." And those are the seeds being planted that are going to grow into what I think are going to be movements, and I hope those are what flourish. I hope that's something we can all take action on. Because the truth is, even when I lose faith, I still do think technology is a force for good. I think that's what it was meant to be, and I think if we can all join together in holding people's feet to the fire, especially the people that make these big technology platforms, then hopefully tech can be as good for us as we were promised it would be in the beginning.

Anil Dash: Function is produced by Bridget Armstrong. Our Glitch producer is Kesha TK Dutes. Nishat Kurwa is the Executive Producer of Audio for the Vox Media Podcast Network, and our theme music was composed by Brandon McFarland. Thanks to [inaudible 00:44:05] and the engineering team at Vox. And huge thanks to our team at Glitch. You can follow me on Twitter @anildash, and you can follow the show @podcastfunction, and we do read all of our mentions. We listen to everything you say, so share your ideas and your feedback.

Anil Dash: You can also visit glitch.com/function, where we've got full transcripts of every episode. There are even apps for certain episodes. They're really cool to check out. Thank you so much for listening to Function this season. If you haven't already, please subscribe and rate and review the show. It makes a big, big difference. And stay subscribed to Function, because we've got bonus episodes and interviews coming up, too.