"When Twitter decides they are going to do Moments now and they hire journalists to curate them...you just created a newsroom." — Jay Rosen
What happens when social networks become social media, and tech starts making editorial decisions?
Anil talks with NYU journalism professor Jay Rosen about the priorities of tech companies and how their faux neutrality and feigned ignorance have taken a toll on our news cycles and elections. Together, they identify the problems the press is facing, like attention hacking and misinformation, in order to focus on how the press, and we as citizens, can reclaim "the news" and what they care about.
Big thanks to LinkedIn for supporting the second season of Function.
Anil Dash: Welcome to Function. I'm Anil Dash. All season long, we've been talking about trust on the internet. Whether we've got it, and how it gets broken. I'm super interested in that topic, because the internet and social media are so embedded in our lives. All of us rely on them for our work, to stay connected to our family and friends, and especially just to understand the world around us. These days, two thirds of Americans say that they get at least some of their news from social media.
Anil Dash: Those social networks weren't originally created to enable news to be shared. Sure, you could always post a link on Facebook or Twitter since they've been around, and I know because I was on them pretty much from the beginning. But the people making the technology really saw that as no different than sharing a photo or anything else. They didn't have a special treatment for news. Then things evolved over time.
Anil Dash: You know, really out of good intent. They thought they wanted to make it easier to discover things, or they wanted to make sure to emphasize the information that's most important to you. But they made algorithms that started to make decisions about what information we see, and when. Which sources are considered credible or not. These are choices, and a choice about what information matters and what information is accurate is an editorial choice.
Anil Dash: You know, I used to work at a newspaper. Those were the decisions that editors were making. So what you've got is the major tech companies, the social media platforms, making journalistic decisions about the world. Even though, well, pretty much none of the people making those decisions were journalists. They were computer programmers, or product managers, or designers, all well-intentioned. I really do think they wanted to do good, but the reality is you can't do good in a very complicated field that you know nothing about. We saw the results of those choices in 2016.
Donald Trump: I've just received a call from secretary Clinton. She congratulated us, it's about us, on our victory.
Speaker 2: I'm sickened, I'm without words. I thought for sure that Hillary would win this election, that Hillary would win Florida.
Speaker 3: Social media played a huge part in the 2016 election. It was used to get the message out, but then candidate Trump-
Speaker 4: What we're talking about is a major foreign power with the sophistication and ability to involve themselves in a presidential election, and sow conflict and discontent, all over this country.
Speaker 5: Starting in 2014, Cambridge Analytica funded a personality test on Facebook, and paid people to take it. The alleged goal: influence the views of the American electorate.
Anil Dash: The election surprised everybody, but also prompted this deeper examination into the impact that technology broadly, and social media specifically, were having on our lives. Tech is suddenly realizing it has to confront the value of journalism in society, but also the challenges of making it happen.
Anil Dash: There's also the question of how journalism has adapted. As soon as that question came up, I thought the one person I'd most want to talk to about this is a professor of journalism at NYU, Jay Rosen. He's written a couple of different books about journalism in this new era.
Jay Rosen: I studied the press as an institution, and the pressures that come to bear on it.
Anil Dash: Social media has fundamentally changed the way that news is disseminated these days. So I talked to Jay to find out how we got here and also where we go from here. I'm very glad to welcome Jay Rosen.
Jay Rosen: Thank you very much Anil.
Anil Dash: There are people that have this deep and abiding reverence for journalism as an institution, almost like a priesthood, and that you have to protect it. You've never had that. You've always been a little bit challenging of that. I think one of the seminal phrases, that to me typified your challenge of, okay, what is journalism's role in the ecosystem and how do people within this institution see themselves, was "the view from nowhere." Can you, for people who may not be familiar, succinctly describe the view from nowhere?
Jay Rosen: Well, that's my term for a very well-known pattern in the American press, but also in other countries' press, where journalists try to persuade us to believe in them by making a claim like this: "I don't have a stake, I don't have an interest. I don't have an assumption. I don't have a party."
Anil Dash: No dog in this fight.
Jay Rosen: I have no dog in this fight. I'm just telling you the way it is, so you should believe it because I have no view.
Anil Dash: My impartiality is my credential.
Jay Rosen: Mm-hmm (affirmative). I sometimes called this "viewlessness as a presumed good." So it happened within the evolution of American journalism that this became the dominant way to persuade people to accept your account: demonstrate that it has no stake, interest, or bias in it. So not only was a profession built on that claim, but a market force was built on that claim, which is the metropolitan newspaper, which eventually became more or less a monopoly product by appealing to everyone. So there was a commercial logic to it.
Jay Rosen: There was a status logic to it, which is, in American culture, if you want higher status, you can make more money, but you also have to start acting like a profession. You have to say that you deliver a public good, that you have some sort of elevated role, an elevated commitment, you see? What is that based on? How do you persuade Americans that you offer a public good?
Anil Dash: So there's a rhetoric or a narrative that you have to perform, you have to say, if you want America to believe that your institution, your business, your industry, is good and important, and must be defended. It seems like the pillars of that are: one, there is a public; two, you're serving that public; and three, you do so from above the fray, being neutral, having no point of view. And it's interesting, because it's been more than a dozen years since you formed this narrative of-
Jay Rosen: Correct.
Anil Dash: ... view from nowhere. Even if it's not phrased that way, that framing of the political discourse, of the cultural discourse, has become all-pervasive. Everybody sort of intuitively knows it. These days I think it's usually phrased as bothsidesism, right?
Jay Rosen: Yeah.
Anil Dash: So if we have a white supremacist gathering, then the news has to say, "Well, we've got to hear from both sides." It ignores the fact that attention and amplification in an ecosystem have value in themselves, separate from the substance of the messages.
Jay Rosen: Right. That is one example of many where the view from nowhere, as I have called it, breaks down or doesn't address what's going on. So that's what I've tried to do: name the part of what academics would call newsroom objectivity or professional objectivity that we don't really need, so that we can save the parts of it that are good. So when people talk about objectivity in journalism, it means a lot of different things. Some of them I think are like, "Yeah, these are virtues." Like being able to step back from your upbringing and see how other people live, that's a kind of objectivity. I definitely want journalists who can do that, right?
Anil Dash: Yeah.
Jay Rosen: So what we have to do with this beast objectivity, which is like this, it's like a blob, is you have to start naming the parts of it that work and the parts that don't. So that's why I developed-
Anil Dash: Yeah, it didn't come out of a bad impulse. They didn't- [crosstalk 00:08:40].
Jay Rosen: No.
Anil Dash: They didn't aspire to this objectivity.
Jay Rosen: No. Many things that are called objectivity are the impulse to tell the truth, to get evidence to show you don't have to rely just on my word, look for yourself. All those things increase objectivity. The kind of objectivity we don't need is when the journalist pretends to be above it all. When they treat everyone else as if everyone else has an argument, but they just have facts. These kinds of patterns don't do the press any good. They enrage users, and then, with the coming of the internet as the baseline for discourse, the people who are really dissatisfied by press practice have a way to give voice to it. You know, that's different than when they're an atomized group that can't contact each other.
Anil Dash: Right. Now all of a sudden they can say, "Do you feel this way too? Do you feel disconnected too?" [crosstalk 00:09:32]
Jay Rosen: There's evidence that many other people feel this way, and they can "talk back", and writers emerge that represent their dissatisfactions. All that's a very different environment than when I started studying the press. But the view from nowhere is increasingly under attack by events themselves, as well as lots of critics, as well as a general evolution in the way people think about this. That yeah, you know, there probably isn't a place above all the action from which to work, from which to view it.
Anil Dash: Right, not anymore.
Jay Rosen: Yeah. Probably not a good thing to go about like claiming that.
Anil Dash: So thank you for that framework, because that gives us a sort of common ground, where we have a vocabulary now, we have a perspective. There's an interesting thing here too, because one of the reasons it's so important to me to understand journalism and media institutions as a starting point is that we talk about a lot of technology as social media. It used to be called social networking, and even before that, social software, in the tradition of Microsoft Word on Windows in the '90s.
Anil Dash: You fast forward 20 years from there, and what we have with media is a whole different set of assumptions. But also, the people building this technology frame what they do in the language of media and journalism. So Mark Zuckerberg goes to Congress and talks about serving the public.
Mark Zuckerberg: Our policy is that we do not fact check politician's speech, and the reason for that is, that we believe that in a democracy it is important that people can see for themselves what politicians are saying.
Jay Rosen: Freedom of speech.
Anil Dash: Exactly right. We didn't have conversations about freedom of speech when Microsoft was making Microsoft Word in the '90s and you were copying it off of a floppy disk onto your computer. So something changed to make technology media. Again, Bill Gates, when he was sitting in front of Congress and being examined by the Department of Justice, these issues never came up. He never had to talk about freedom of speech, freedom of expression, our First Amendment rights. That stuff didn't come up.
Anil Dash: Zuckerberg, it seems like every six months, is awkwardly drinking water while sweating it out in front of the cameras in Congress, and keeps going back for these trips. So that seems like a radical shift. Yet all these folks are still... well, in their case they're both dropouts, Gates and Zuckerberg. But the rest of them are folks who have computer science degrees and have probably never once sat in a room of a journalism school or taken a journalism class, or have any fluency in media. That's weird, right?
Jay Rosen: Well, it is, because what started as a technology company, with an engineering culture and an origin on a college campus, gradually came to be a media company, or a company that disrupts, pardon the expression, media hugely. Just as it became, slightly different term, I think an editorial company, when, for example, it began explicitly hiring journalists to do journalistic things. Then you are an editorial company, right?
Anil Dash: Right. When YouTube chooses what's on their homepage, they're an editorial company.
Jay Rosen: Yes, in some way they are. When Twitter says, "We're going to do Moments now, and hire journalists to curate them," you just created a newsroom.
Anil Dash: And send you notifications about them.
Jay Rosen: Totally.
Anil Dash: It's the same as the push notification for the New York Times in a way.
Jay Rosen: I think this will address your question. When Twitter first announced that this team was being hired, and they were going to do this thing, which became Moments, which they now curate, I happened to have been booked to interview Twitter's Head of News, just by chance. It was the day this was announced, and I said-
Anil Dash: That's lucky.
Jay Rosen: "I'm really fascinated by this team of journalists you've hired. I think it's a really interesting direction for Twitter. In my view, this is actually the moment when Twitter has kind of crossed over into an editorial company. So you're the Head of News, I want to know when Twitter does this, when it creates an editorial culture inside a tech company, what vision of journalism is it operating from? What tradition does it see itself as standing within?"
Anil Dash: What's your conceptual model?
Jay Rosen: "What's your conceptual model, and also what are your priorities? Where are you coming from?" So I just asked a very general question like that. Like, "What are you thinking when you..." and what I observed was that there was literally no way to get an answer to that. They had literally not thought about that. They instead had done something very, very predictable, very easy to do, which is to simply say, well what do you mean, we are hiring professionals? They're going to exercise their professional judgment.
Anil Dash: It's sort of a credentialism.
Jay Rosen: Exactly.
Anil Dash: Huh. That's really interesting, because in technology, if you say, "We are Apple, and we're going to make a better camera in the iPhone," and then you ask, "What is it going to be able to do?" "Oh, it's going to work in low light, and low light means this much, this much candlepower and lux, and whatever, and it's going to be able to pick up the colors," and that's it, right? So you're given this answer, which is very knowable. But if they said the iPhone 14 is going to have a better camera, and you asked, "Well, in what way?" and they said, "Well, we're going to hire professionals, and they're going to make a camera, and how dare you interrogate our magic," it would be a little weird, wouldn't it?
Jay Rosen: Yeah. But nonetheless, this was a perfectly fine answer from the Head of News's point of view, from Twitter's point of view.
Anil Dash: Because that was their mandate. Their mandate was like, give us-
Jay Rosen: Give us some professional credit here. [crosstalk 00:15:21] Now, I was very surprised, for example, that when they gave rise to Moments and became a kind of editorial company, they didn't say, for example, that universal human rights were part of their grounding. You know, because Twitter is very associated with that in other ways. That would seem like a natural to me, right?
Jay Rosen: Anyway, same thing with Facebook: they have been forced to admit that they are kind of a media company, and now they're forced to concede that they have an editorial part too. I'm told by people who may know that this is something personal to Mark, that being accused of wounding news was something he felt personally, and he wanted to try and address it.
Anil Dash: More with Jay Rosen after the break.
Anil Dash: The catalyst of this conversation is honestly the 2016 election here in the U.S. and the 2020 election here in the U.S. Right? And there's many more examples all over the world. I mean, there are actually far more egregious examples of misinformation. If I look at what's happened in Myanmar, it is even more serious and damaging. But I think about in the U.S., here, where these companies are based, the catalyst was, what the heck just happened back in 2016 and what's going to happen that's even worse going forward in 2020?
Jay Rosen: Yeah.
Anil Dash: And there's an interesting thing here I see, which is, as a tech guy myself, we look at a lot of security and privacy and other sort of considerations like that, and there are systemic weaknesses. You look at exploits, you look at hacks.
Jay Rosen: Yes.
Anil Dash: How are they going to hack the system?
Jay Rosen: Right.
Anil Dash: And it seems to me like the lack of informed perspective and bringing in journalistic functions, editorial functions, into the major platforms and the lack of fluency in things like the strengths and weaknesses of the view from nowhere-
Jay Rosen: Mm-hmm (affirmative). Oh yeah.
Anil Dash: ... were exploitable hacks. They were things that hackers would find, just as they will find the weaknesses in your security policies, or notice that you're running an old version of the software that's exploitable. These were editorial organizations running an old version of the journalism software that was very hackable, though not in the technical sense. You didn't have to be Russian bots or spammers, all the things that usually get blamed here. All you had to be was somebody who understood the exploitable aspects of these legacy media systems that were adopted without fluency, and then they would be very hackable.
Jay Rosen: Oh, I think that's dead on. I think the news system evolving at a slower rate than the network around it is exploitable. The whole view from nowhere contains a flaw in the code, in the sense that when the system that makes public service [inaudible 00:19:34] possible is under attack, because there's a political movement that's generating capital from doing that, the code of impartiality and objectivity contains no instructions for what to do under that circumstance. And so that's how you get, for example, Marty Baron's famous phrase, which is a very important, fascinating phrase: we're not at war, we're at work. This is his way of saying-
Anil Dash: [crosstalk 00:20:00].
Jay Rosen: ... that we have to stick to what we're doing, right? And we cannot be seen as anyone's adversary. So that kind of idea, that's another example of your point that there are these flaws in the software where bad actors can overwhelm the system and not take control of the news so much as waste everyone's time.
Anil Dash: It's denial-of-service attack.
Jay Rosen: Yeah. Yeah. It's a slow-mo Sunday morning denial-of-service attack.
Anil Dash: Yeah, that's really wild, because I think of this amazing shift that has happened where now the tech companies can't opt out. They can't say, we're not going to have this conversation. When Congress calls for you to testify, or just average consumers ask, what are you doing about this, they have to engage. And I think it's very telling that they have, in the sort of stereotype of what technologists do, chosen binary solutions. So when we talk about something like political ads, as recently happened in the last few weeks-
Jay Rosen: I had tried to pay attention to that, yup.
Anil Dash: ... Facebook said, okay, we're not going to filter any political ads. And Twitter's like, we're going to ban them all and not take any money for them. It's this very feast-or-famine kind of solution. And it seems like both are this attempt to say, I don't want to understand this problem. I want to not have this problem.
Jay Rosen: Okay. I think you really hit it there. I want to not have this problem because that's what I have seen in Facebook's dealings with this media political space where it has power within the political communications world. So it gets attacked as a player, but it doesn't want to be a player. So it could only accept...
Anil Dash: You can't opt out.
Jay Rosen: Yeah. It could only accept descriptions of itself that kind of vouch for its neutrality. Right?
Anil Dash: Ah, that's so interesting.
Jay Rosen: And that doesn't actually describe what's going on. So I'll give you an example. When Facebook was coming up with its original fact-checkers alliance, it collaborated with a group of nonpartisan, quote, unquote, fact-checkers, like ABC News and factcheck.org. And before they announced the system, they emailed me and said, can we get your thoughts on this, because we want to see how people are going to react? And I'm not a reporter, so I didn't care about breaking the story, so I said, sure, the kind of thing I do all the time, right? And I said, this sounds reasonable, but let me tell you: within an hour of announcing this, the right wing is going to say that these fact-checkers you've selected, these careful nonprofits, are all plants of the Democratic Party. They're just tools.
Anil Dash: Yeah. It's been gamed.
Jay Rosen: Been gamed. Within an hour, they're not going to accept that the ones you carefully selected are carefully selected at all. And they go, well, we're not trying to play any political role here. I said, well, it doesn't matter if you're trying to or not.
Anil Dash: Right.
Jay Rosen: It's like, this is what they're going to do.
Anil Dash: Right.
Jay Rosen: Right? We don't...
Anil Dash: If one team only wins when they work the refs, they are never going to not work the refs.
Jay Rosen: Exactly. And I'm having a work-the-refs conversation with somebody who's saying things like, well, we don't have an ideological axe to grind.
Anil Dash: So what?
Jay Rosen: Yeah. So that kind of passivity, vacancy, there's got to be a function to that. I don't know what it is, but...
Anil Dash: Do you think it's intentional or do you think they don't know they don't know?
Jay Rosen: I think they did for a while have a culture in which denying that you are a media company and [inaudible 00:23:26] company was the party line, was what the company [crosstalk 00:23:31].
Anil Dash: It was an effective strategy for a decade, plus.
Jay Rosen: Yeah, that's what I mean.
Anil Dash: Yeah. You can build a trillion dollar company on denying that you are what you are.
Jay Rosen: And there's the additional factor that, from the point of view of an ad-targeting company, false information is just as good as true information, even better in some ways. That's the fundamental problem.
Anil Dash: Right? The economics here. Right? So there's this incentive...
Jay Rosen: The way the system works. The way it's designed.
Anil Dash: Yeah. The incentives are misaligned here, because these truth teams, the truth squad within these companies, are only ever going to be a cost. They will never make the company more profitable.
Jay Rosen: That's right. And they prevent scaling and-
Anil Dash: They are inefficient.
Jay Rosen: ... they're a drag on scale.
Anil Dash: They're inefficient in companies that prize efficiency above almost everything else. So we have this set of incentives, and we have a gameable system, and now people are saying, how do I take advantage of this? And attention has monetary value, because you have to pay for attention on these platforms unless you can hack it.
Jay Rosen: Right. And here we come to the best hack, the most potent hack of the media system, which is Trump's own. We can't call it a strategy, but it's a presence.
Anil Dash: Yeah. Intuitively, he's very savvy.
Jay Rosen: I don't buy that part, but we can argue about that.
Anil Dash: Okay. Sure.
Jay Rosen: But there's no question that his, let's say, his style, his political style, floods the system with news, and almost all of it is newsworthy by the criteria that existed prior to Donald Trump: focused on conflict, something new and original, or the best definition of news of all, holy shit, can you believe what happened?
Anil Dash: Right. Right.
Jay Rosen: So just on these incredibly traditional, simple grounds of newsworthiness...
Anil Dash: Checks every box.
Jay Rosen: Checks every box, pushes every button, 24 hours a day.
Anil Dash: And because they've got this sort of factory model of how they monetize...
Jay Rosen: Some do. Yeah, anyone with a factory model overdoses on it. Yeah.
Anil Dash: Right. This is like, this guy is a car factory that just spits out cars 24/7 for free.
Jay Rosen: Right. And so...
Anil Dash: And if you're a car dealer, you're like, great.
Jay Rosen: Right. That's how we get to the famous statement by the head of CBS, which is: it may not be good for America, but it's damn good for CBS, right? Which is a famous line about Trump as a media phenomenon. Everybody knows that now. But the reason I bring this up is because everything he does is newsworthy by traditional criteria, and the only way to not be controlled by him is to reconsider the criteria.
Anil Dash: Ooh, yeah. Change the algorithm.
Jay Rosen: Yes. Because he broke your definition of news, just as he broke the incentive system around fact-checking. Glenn Kessler, The Washington Post's fact-checker, has said on many occasions that the difference between Trump and other presidents is that when Republican and Democratic candidates and presidents were fact-checked in the past, they would not hold up their hands and say, you got me, but they would do something to change the situation.
Anil Dash: They would attenuate the worst excesses.
Jay Rosen: Yes. Or they'd soften the claim, or they'd just stop saying it, right? And Trump not only doesn't do that, he doubles and triples down, and then he does something further, which is he takes the friction raised by these false statements and taps it as a source of energy for his base and his campaign. So that...
Anil Dash: In fact, for them, the evidence of his correctness is the fact that they're pushing back.
Jay Rosen: Totally. And he's kind of delivering on a campaign promise, which is: I will put down, annoy, and drive these people crazy for you. And each time one of these fact-checking flares jumps out of the fire, he's making good on his promise. So anyway, that's another regular institution of journalism that he broke. What we need is for the press to be able to somehow mend and repair these, and replace them with something stronger, built for this kind of information warfare. So far, we don't have it.
Anil Dash: Right. So this is really interesting, because what I'm hearing in all this is exactly the kind of arms race we have in security in tech, and in privacy in tech, where there are all these hackers constantly coming at the systems and you have to evolve your model.
Jay Rosen: It is a kind of arms race, roughly speaking.
Anil Dash: Yeah. And it's interesting too because you talk about the fact-checkers and the mechanism by which fact-checking works is shame.
Jay Rosen: Yes.
Anil Dash: Right?
Jay Rosen: Totally.
Anil Dash: So if you say, you said something that's not true, what we're assuming is a sort of emotionally developed maturity, where being caught in a lie makes you feel shame and stigma to the point where you modulate your behavior, at least to some degree. So there's this feedback loop. And what I see with this is an algorithm, right? Look at search engine optimization, which is how you make your content show up in Google. For 15 years now, people have been iterating on this. First it was, make your page look a certain way. Then it was, make sure you've got good headlines. Then it was, make sure your links work a certain way. So Google gradually added layer upon layer of essentially hoops you have to jump through. If you wanted to publish online, or you wanted to share content online, or you wanted the things you're selling in your store to show up in Google, you would just do these things.
Anil Dash: And it's funny, because they would never make... they didn't do a press release saying, now all your links have to be blue and in bold letters, right? It would just sort of be folk knowledge that would get shared, and a little bit of cargo cultism, right? So there would be these back channels, a forum: I heard if you mention goats on Thursdays, your site will rank higher in Google. And people would be like, well, it works for him, I'm going to try it. And so that practice evolved. It's a billion-dollar industry and it's evolved over 15 years, and Google tried to rationalize it a little bit. They started to say, oh, we want to give you a set of guidelines or rules. They would go to the conferences and meet people halfway about the algorithm. But the point is, the people who were playing this game were playing against a software algorithm that was created at Google and that had enormous economic value.
Anil Dash: Absolutely. I mean, perhaps trillions of dollars of economic value is tied up in this. And so there were enormous incentives to figure out how to game it. And once it became clear that this was an arms race between Google and these folks, who are called black hat and white hat SEO hackers, just as if they were security hackers, it all sort of settled out. In attention hacking, we don't have any of this narrative or any of this fluency. And so Facebook has an algorithm for their newsfeed. It gets hacked by attention hackers regularly. They make grand pronouncements every once in a while, when there's been a catastrophic failure on the scale of an ethnic cleansing or an election being undermined.
Jay Rosen: Right.
Anil Dash: And even then, it's this sort of weird, mush-mouthed back and forth, where Facebook will say, well, we're doing listening dinners behind the scenes with a couple of people on the right and a couple of people on the left, and we'll tell you afterwards who it is, but nowhere near the systematic approach of what would happen if this were a security hack. Is that intrinsic, or do you think that's strategic? Because the question I keep asking about everything these days is: are you evil, or are you dumb? One way, you're dumb and you don't know they're hacking you, so you're just having these dinners because that's all Zuckerberg knows how to do. The other, you're pretty nefarious, and you're like, well, this'll put them off the trail and feel like we're doing something long enough for us to keep profiting off of this.
Jay Rosen: Yeah. I really don't know. But my bias is towards one part of this, which is: they built a machine that they couldn't control, and they couldn't know exactly what it was doing. So we've built this thing, and in a sense, no one is running it. And there are things it's doing to the culture and the environment that we cannot track.
Anil Dash: Right. It's like the weather, it's too complicated to fully predict.
Jay Rosen: Something like that. We've built this thing, and in that specific sense, it is out of control. Nobody wants to say that. Nobody wants to hear that. Nobody among the regulators really wants to know that. It kind of goes beyond the measures of [inaudible 00:15:24] our institutions are built for. So I think part of what happens at Facebook especially is that. But also, this part might be strategic: they keep themselves politically stupid and dumb about media so that they can just blunder into whatever is the easiest thing for the business. They play like six layers dumber than their engineering degrees would attest. I think this is starting to change now, because Facebook is interacting with more critics than it used to, and it has teams working on stuff where they need to consult with other people, and they're learning what Facebook is. But the part that really interests me, and you would know more about this than I do, is I wonder when these things begin to become huge factors in the recruitment of talent, technical talent.
Jay Rosen: Because I think technical talent has a... if anybody has power here, it's them.
Anil Dash: Yeah. So that moment is right now.
Jay Rosen: I thought so.
Anil Dash: What we see in tech, and I've talked to people who organized the Google walkout a year ago and lots of people who've been in my network for 10 or 20 years in this industry, is sort of the people who love tech but are dissenters within it. What I think we all came to understand five or 10 years ago was that a consumer boycott is not going to happen. Nobody's going to stop looking at Instagram and disconnect themselves from their family on WhatsApp, almost no matter what happens, because you'd be complicit in anything in order to keep talking to the people you love. So given that there isn't going to be a consumer boycott, and we can't have one anyway with the amount of data they have, then you have a challenge about how do you have any accountability.
Anil Dash: One vector is regulation, but that's a very slow one, somewhat by design and also because of a lack of fluency from policymakers. And so then the big lever is their biggest expense, which is people. They're the resource highest in demand, and with that resource, I already see it. People are way, way more reluctant to go out there. They're not wearing their Facebook hoodie.
Anil Dash: They are looking at their options around where they want to work when they're an engineer coming out of school, and there is also a lot more competition for talent. The difference between an extraordinarily wealthy salary and a merely very, very high salary, when you're 23 and you just got your degree, is like it's all Monopoly money. It all seems like a fiction. I think that's starting to have some leverage. It is fascinating to watch, but that's the workers. That's their leverage. That'll start to have some cost.
Anil Dash: I look at society, though. Right? We're talking about these algorithms being gamed. We're talking about, whether it's playing dumb or being dumb, this real social cost. Going all the way back to the beginning, you talked about journalism as an institution that serves the public and has a role to play in a civil society, and now the narratives are talking about the fifth estate versus the fourth estate.
Anil Dash: This is very, very fundamental, highfalutin language about the roles that things play in a functioning democracy, in a functioning society. Attention hacking sounds kind of trivial. It sounds like this is kids playing online, this is people sharing memes. But at a fundamental level, you're talking about undermining elections. Right? You're talking about destabilizing institutions that matter for civil society. How does this play out? How bad is this going to get?
Jay Rosen: I am pessimistic, and in a darker state of mind about it every month or so. I think we're at a point now where between 25 and 30% of the electorate in the U.S. has been effectively siphoned off into a different media system, where Trump is the leading source of information about Trump, and the rest of the news system effectively cannot get through or penetrate.
Anil Dash: Right, and this is also conspiracy world.
Jay Rosen: That's where this conspiracy world is strongest, yes, and of course the original and very strong bond between Trump and his core supporters is involved in that, but it's very important to realize that rejection of the information in the press is a condition of that bond. It's not just a nasty word he flings in his Tweets. It is a form of political mobilization.
Jay Rosen: It is also promise-keeping on Trump's part. As I said, one of the things he ran on that he definitely delivered on was, "These fucking media people, I'm going to put them down for you, baby. You just watch."
Anil Dash: And you're going to humiliate them.
Jay Rosen: He did it, and he completely delivered on that, and continuing to deliver on that promise is the presidency. It's not like a weird feature or a weird bug.
Anil Dash: Right.
Jay Rosen: Right.
Anil Dash: It's a platform.
Jay Rosen: It is the heart of his political method. The heart of his political method. So then you take an institution like the American press, which has difficulty changing under pressure. Right? It's not an on-the-fly institution because-
Anil Dash: It's a lot like policymakers. It's slow.
Jay Rosen: It's slow to change, but partly because it has to produce every day. It's not like you can stop the news-
Anil Dash: Right, and rethink it. Yeah.
Jay Rosen: ... and just say yeah, like-
Anil Dash: Let's pause everything.
Jay Rosen: Yeah, let's redraw that or let's put in a new operating system. There's no such thing. Just like everybody knew that 2016 was a massive failure by the institutional press in the election, but the next day they had to start: oh my god, there's going to be a Trump presidency.
Anil Dash: Now what are we going to do?
Jay Rosen: What are we going to do? Like it's already a huge story before it even started.
Anil Dash: Right. We have to do Ivanka puff pieces because that's-
Jay Rosen: And so the idea of, well, let's take some time and pause and think about what happened in this election, that never happened, because in journalism you're under this constant pressure to produce, and one of the consequences of that is that you always make mistakes, and things get printed and published that are the result of half-done work. Right? And so journalists are always vulnerable to criticism, because their work is vastly imperfect. It has to be sent out every day. It would be like if you had to ship software every day no matter what, however good it was. Maybe you get lucky and it's good.
Anil Dash: Right. Well, we have that right now, where Apple rushes out the iOS update because they got new phones coming out, and everybody's like, "Man, this version of iOS is all buggy. Even the flashlight doesn't work." All this stuff's going wrong, and you're like, "Oh, well, that's because they had a deadline and they had to get this out because the new phones had to come out, and hopefully they'll fix it later." But fixing it later is okay if it's like my iMessage is a little slow and laggy sometimes. It's not okay if you're like, "We undermined an election."
Jay Rosen: That situation has always been there. It's been weaponized by Trump. It's not just that people mistrust the New York Times. It's way beyond that. They now have an information sphere that can elude the facts in the New York Times and carry them all the way through the campaign to the election.
Anil Dash: We've gone to the darkest depths now. Everything's broken. Everything's doomed.
Jay Rosen: Yes, and getting worse.
Anil Dash: All the incentives are misaligned.
Jay Rosen: Yup.
Anil Dash: We have all this pessimism. The optimist in me still always believes there's something we can do. I'm curious, for me as an engaged social media consumer, somebody who pays for journalism as much as I can, what can I do?
Jay Rosen: Well, of course, you can support the news organizations whose work you use, and that's obvious, but I've been working on a better answer for that in my directorship of the Membership Puzzle Project, which is a research project studying membership models to support journalism.
Jay Rosen: Our founding distinction is between subscription and membership. Subscription is a product relationship. You pay your money, you get your product. If you don't pay, you don't get it. If you're dissatisfied with the product, you just stop paying, and everybody understands.
Jay Rosen: Membership is different. You join the cause because you believe in the importance of the work. If you join the cause because you believe in the importance of the work, then you should be able to support it, yes, with your dollars, which go to an open service that anyone can access. That's part of what you're doing with your money. That's part of what you want to support.
Anil Dash: Right. You're subsidizing it for everyone.
Jay Rosen: Right, but also you can contribute if we create participation paths where you can actually improve our journalism directly. For example, a database of expertise you have so that when we need it, we can call on you if you're a member. Crowdsourcing projects that try to bring a lot of distributed information together in one place through the members. Right? Lots of facts you cannot otherwise find unless you have lots of hands picking up stones and collecting them.
Jay Rosen: So, a better answer to your question would be when we have a fully developed notion of membership and people are involved in sustaining the news organizations they support, and they do so because they believe in the values and the way that those institutions are run and funded, and they get the whole thing and they get why they're part of the business model, not just a consumer of news. Right?
Anil Dash: Right.
Jay Rosen: And that they have to call out errors, and they have to participate when necessary, that's the future that I am optimistic about, but we're not there yet.
Anil Dash: To that point about the calling out and participating, what about me as just an ordinary social media participant? I see manipulation and misinformation all around me. Maybe I participate in it without even knowing. How can I be a better citizen? How can I be a better consumer of media and a better participant in social media? What are the ways that I get exploited, that I become complicit in these systems that I might not know about?
Jay Rosen: Yeah. Well, anytime you're talking about something that lots of other people are talking about and that's why you're talking about it, you're in a vulnerable position, because it's possible that you were brought there.
Anil Dash: Wow. And these platforms encourage that.
Jay Rosen: They do. Yeah, that's one of the normal ways of using them.
Anil Dash: Yeah.
Jay Rosen: I also try to observe this rule for myself, though I don't always succeed: chill before serving. I think there's great value in cooling down before you post. It's not censorship. It's self-control. So, chill before serving is good.
Anil Dash: The flip of the vernacular is great.
Jay Rosen: I think it's really important, and I think you're quite aware of this, it's really important not to sort of like join gangs. You know what I mean? Like attention gangs?
Anil Dash: Yeah.
Jay Rosen: And I'm probably guilty of that sometimes myself. You really have to watch it.
Anil Dash: Yeah.
Jay Rosen: Attention gangs can be dangerous.
Anil Dash: I think about networks like Twitter. Some of the most fun is when everybody's jumping in on something.
Jay Rosen: That's true.
Anil Dash: If we're joking about something and we're all having a good time, but the line between that and the gang can be thin.
Jay Rosen: Yeah. I'll tell you one more rule I use on Twitter, and Facebook too. I don't do much on Facebook anymore, but on Twitter I try to be 100% personal, 0% private. What I mean is everything that I do is an expression of Jay. It's what I think, what I think about, what I care about. It's authentic me. It's like I wouldn't talk about it if I didn't care about it, so it's personal.
Jay Rosen: But 0% private means you can't tell where I am. I don't talk about my family. I don't talk about my lunch. I don't position myself in social space. It's JayRosen_NYU who is focused on the things that I know a lot about.
Anil Dash: So, it's you-
Jay Rosen: That's it.
Anil Dash: ... as a sort of public intellectual and as a voice of what you are, but not as a ... you know.
Jay Rosen: No, and so that limits what I can do.
Anil Dash: Do you think that's compatible with influencer culture today, where people are literally famous because of how much they open up about this is what I'm wearing, this is what I'm eating, this is where I'm at, this is where I'm [crosstalk 00:44:47]?
Jay Rosen: It is intended to limit influence. I want to limit my influence to the thing that I'm expert in. That's the only kind of influence I can gain. Any other kind is actually lowering my profile in ways that matter.
Anil Dash: Well, it's interesting. Your framing there is stick to what you know, and that's such a powerful foundation, it feels to me, of both, one, how we can be responsible individually in what we do in social media-
Jay Rosen: It is, yeah.
Anil Dash: ... and two, in what we should demand of these platforms, whether that platform is a journalistic institution or that platform is a social media, social networking technology institution. Maybe asking them to stick to what they know. Maybe something that can reduce some of our social vulnerabilities. Jay, thank you for joining us on Function.
Jay Rosen: Thanks. Thanks for having me.
Anil Dash: We've been taking this deep look at journalism in the current era of social media and the internet and at trust broadly. This entire season of Function is about trust on the internet, and it reminds me of a story of a personal experience I had.
Anil Dash: Back in 2015, when marriage equality passed in the Supreme Court here in the United States, like a lot of people, I was celebrating. It was really exciting. And one of the things that happened that day was that a lot of public spaces and public buildings got lit up with rainbow lights, which was just incredibly moving.
Anil Dash: I had a friend send me a picture from the White House that'd been lit up in rainbow lights and celebrating the decision. I took that picture and a couple others that I'd found online that people were Tweeting out of rainbows lighting up everywhere in public, and I put them all together into a Tweet, just sort of saying this is a wonderful day.
Anil Dash: And it wasn't until I got tens of thousands of retweets and people amplifying that message that I found out that some of those photos weren't from that day. I had inadvertently shared a message that was inaccurate. That Tweet got so many retweets, it may be the most visible thing I've ever written in my life. Now, this wasn't a lie. I think those photos were from a different day, so it wasn't something where I was trying to mislead people, but I had had a role in amplifying inaccurate information.
Anil Dash: All of us have a responsibility to think about the information that we share, and the platforms don't always do a good job of reminding us of that. There aren't often enough nudges to say: is that really a credible news source? Is that really the information you want to share? And maybe, at an even more fundamental level, is that something you're amplifying because it's true, or just because it feels good to you?
Anil Dash: Those are the kinds of deeper, tougher questions that really drive the spread of misinformation and disinformation online. They appeal to our more basic emotional instincts. Sometimes they're good. They're celebrations. Sometimes they're not so good.
Anil Dash: That question about how disinformation spreads online and what's causing it to happen on social media is so important and so fundamental, and it's what we're going to go deep on in the next episode of Function. I hope you'll join us as we go even further into getting ready for 2020 by taking a look at the role that social media plays in the political and journalistic process that we all rely on.
Anil Dash: Function is produced by Bridget Armstrong. Our Glitch producer is Keisha "TK" Dutes. Nishat Kurwa is the executive producer of audio for the Vox Media Podcast Network, and our theme music was composed by Brandon McFarland. Thanks to the whole engineering team at Vox, and a huge thanks to our team at Glitch.
Anil Dash: You can follow me on Twitter @anildash, but you should also follow the show @podcastfunction, all one word. Please remember to subscribe to the show wherever you're listening to us right now, and also check out glitch.com/function. We've got transcripts for every episode up there, apps, all kinds of stuff to check out about the show. We'll be back next week, and we hope you'll join us then.