Fn 7: Behind the Rising Labor Movement in Tech

"Anything you can think of is taken care of. [...] Why would you be unhappy if you have all these things there?"
— Mark S. Luckie

On November 1, 2018, thousands of Google employees around the world staged a mass walkout in protest of how the company handled claims of sexual misconduct. While this is not the first time we have seen protests at this scale, it does signal to the larger community that workers at huge tech companies like these are at an inflection point. When is enough, enough?

This week on Function, we take a look at the rising labor movement in tech by hearing from those whose advocacy was instrumental in laying the foundation for the dissent we see from tech workers today.

Anil talks to Leigh Honeywell, CEO and founder of Tall Poppy and creator of the Never Again pledge, about how her early work, along with that of others, helped galvanize tech workers to connect the dots between different issues in tech.

Next, Anil speaks to former Facebook manager Mark S. Luckie about his recent memo that's swept the Internet, and Mark details steps that tech companies can take to make conditions better for employees of color.

Lastly, Anil sits down with Matt Rivitz, one of the key people behind the grassroots campaign Sleeping Giants, which has led thousands of advertisers to remove their ads from Breitbart News. According to Matt, there needs to be an awakening in the tech industry, and he illustrates how all of us can take small actions that, together, can make a massive change.


Big thanks to Microsoft Azure for supporting the first season of Function.



Transcript

Anil Dash: Welcome to Function. I'm Anil Dash. Now each week here on Function, we talk about the way that tech and the apps around us shape culture. This time we are going to another level, which is another important part of the tech story: the power of tech workers themselves. A few weeks ago, Google employees around the world walked out of work. By some counts, as many as 20,000 employees participated in this walkout. The reason behind the walkout was to protest sexual harassment and discrimination at Google, and the fact that Google wasn't holding these harassers accountable. In some cases, it was even paying them out millions or tens of millions of dollars.

[audio breaks to a crowd of Google employees]

Speaker 1: I've had to hear about and watch so many people leave the company after experiencing mistreatment and harassment, and I've seen women be unable to get promoted until they wither and leave.

Speaker 2: She seemed eager; trying to jump in to offer her say but she was invisible.

Speaker 3: This is a really huge moment, and we should all be proud and excited to be right in the middle of it. But if this is going to matter, this has to be the beginning, not the end.

[audio goes back to Anil]

Now the Google walk out is an important story on its own, especially in a time of Me Too and Time's Up, but it ties to a much larger movement in tech that's been going on for years. You see, all of us, even if we don't work in tech, we use these apps. We use these websites. They shape our access to information and to news. They shape our politics. And so, that means the choices that tech workers make shape our whole lives and our whole culture. If tech workers say it's time for them to stand up and speak out, that must mean something really remarkable is happening.

We're going to hear from people who, each in their own way, have been part of this story of tech workers organizing over the past years and decades. A little later in the show, we're going to hear from Mark Luckie. He's the former Facebook employee who stepped down a few weeks ago, but before leaving the company, he sent a memo to the entire staff at Facebook calling out what he saw as Facebook's "Black people problem." Now, Mark shared that memo in a public post on Facebook that he entitled "Facebook is failing its black employees and its black users."

After we hear from Mark, we'll talk to Matt Rivitz, who leads the community called Sleeping Giants, and the Twitter account of the same name. That's the community that's responsible for getting over 4000 companies to pull their ads from the Breitbart [News] website.

Leigh Honeywell

But first, my conversation with Leigh Honeywell. Today, she's the founder and CEO of Tall Poppy, where she helps companies protect their employees from online harassment. But, she's somebody who's been putting herself out there for years, leading communities in efforts to organize people, sharing stories of people that have been victimized in the tech industry. A lot of people don't know that there's a years-long history to organizing in tech, and Leigh is the perfect one to tell that story.


"I think there was that aspect of solidarity that was really powerful in just saying like, 'we're in this together.'" — Leigh Honeywell


Joining us today is Leigh Honeywell, who's the CEO and founder of Tall Poppy. Thanks for joining us, Leigh.

Leigh Honeywell: Thanks, Anil. It's great to be here. I'm really happy to have the chance to talk about the history of tech worker activism, both before the Google walkout and in the early days as well.

AD: Awesome. And so for some context on your work, you are somebody who has done security at the highest levels in big companies like Microsoft and Slack and Symantec. And then, these days, you are running a company called Tall Poppy that is about trying to keep people safe from some of the risks like harassment and abuse that happen online.

LH: Yeah. Over the course of my career, I've worked in the standard corporate security trenches, everything from shipping software to a billion people through Windows security updates, to helping protect the infrastructure behind Slack. Over the years, I've had twin tracks of the standard corporate security work, but also diversity and inclusion in tech activism. I was one of the original authors on the Geek Feminism blog and wiki. I was a founding advisor of the Ada Initiative, which was a nonprofit that worked to support and encourage women in open technology and culture. So, I've had these twin tracks throughout my career of the occasional activist stunts, but also having a day job that was very tech heavy, engineering heavy, high-impact technical work.

AD: On the one hand, you were the person responsible for the Windows alert saying it's time to reboot your machine.

LH: I was one of the people in a very large machine.

AD: Right, okay, so all of us are kind of familiar with that. The thing people outside the tech industry might not know, one of the resources you mentioned, the Geek Feminism Wiki. Wikis are like Wikipedia. So people know, this is a site you put resources that the community can collect and edit together. Geek Feminism was both early and really visible in the tech industry years ago as the first place to collect that sense, that certainly all non-men in the industry had felt around, "There's something really wrong here. Something's broken in the culture here." And really, was making visible some of that context. Can you talk a little bit about how that came to be and what the resources evolved into?

LH: Yeah, I think one of the early blog posts on the Geek Feminism blog was a post by Mary Gardiner that was titled, "Why we document", and it really cut to the core of...we've all had these experiences in tech of being sidelined, of being talked over, of experiencing pretty overt discrimination at times. Actually writing that down, actually recording incidents, whether they were sort of mundane, like speakers having scantily clad women on their slides at a professional conference, which seems so ludicrous to even talk about in 2018, but in 2009 it was happening. That was nearly a decade ago. Starting to write that down and make it visible was one of the really revolutionary things about the Geek Feminism blog and wiki. We've seen the echoes of that through the years. We were talking earlier about Susan Fowler's blog post detailing her experience at Uber, in this incredibly dispassionate and factual voice. When I saw that, I was like, "Ah, this is that documentation. This is that power of just writing down what happened in a factual way."

AD: You reference Susan Fowler and her really striking memo about the really egregious problems at Uber when she was working there, around the culture and particularly how women were treated, but a whole host of cultural issues at the company that we've seen fall out since. There's some aspect of...it was meticulous. It was documented. It's just very dispassionate, almost a clinical assessment of, "Here are the issues and here are the receipts." Right? Going back for years.

LH: Mm-hmm. Yep.

AD: But, some aspect that is also having to be, for lack of a better phrase, the perfect victim. You have to be this very unimpeachably dispassionate person lest...certainly women get characterized as emotional about any time that they are victimized in these ways, or non-binary folks have the same sort of challenges, where you have to be in a certain mode or model in order to be seen as valid. Is that something that you think resources like the Geek Feminism Wiki helped with?

LH: I think they do. I think one of the things that was powerful about the Geek Feminism Wiki was that it was sort of messy and organic. There were varying levels of receipts. The incidents that are documented ranged really broadly from the extremely minor and mundane right up to violence and assault. There's a page on Hans Reiser murdering his wife, which was a thing that happened, that was perpetrated by a prominent Linux developer. Recognizing that all of these experiences are on a spectrum, are connected, even if they are obviously of wildly different degrees, I think that was a nuance that was important in how we were documenting, to say, "It's okay to write this stuff down, even if it's kind of petty." The cumulative effect of all these paper cuts does add up.

AD: Yeah, so there's a sense of evidence to the exhaustive nature of it, when all these people have all these stories. You can quibble with any one of them for whatever reason you have, maybe even legitimately, but when it's this preponderance of evidence, then it's almost undeniable that this problem is bigger than any one person or moment or company.

LH: The qualitative documentation that the Geek Feminism Wiki represents, I think, stands as a really helpful mirror to the kinds of stuff you see in the sociological research on women and computing. And, when you do those longitudinal studies of what makes women choose to leave careers in tech, and we know figures like 50% of women leave tech careers within 10 years of starting and that the drop off rate as women hit their 30s is pretty tragic, honestly. As I think about the work that I've done over the course of my career, there's always been lots of energy of, "Let's get little girls coding. Let's increase the pipeline. Get university admission to focus on these early pipeline efforts."

But, I think the thing that has often been under-resourced, under-focused, because it is more complicated, is "what does it mean to actually retain under-represented people once they are in the field?" And that's where stuff like Susan Fowler's post, like the work that has been done even beyond tech as the #MeToo movement has regained force, comes in to say, "Hey, there are structural forces at play that lead to women being forced out of the field." That's what we actually have to address, because otherwise we can send as many bright-eyed and bushy-tailed young girls into the coding pipeline as we want, but if we're just going to kick them out by making them have to deal with this bullshit, it's all sort of for naught.

AD: Right. The pipeline can't lead to a cesspool.

LH: Exactly, yeah.

AD: So, one, there's this first-order effect, which is that documenting the experiences of women in the industry gives a lens to women themselves, but also to those of us who aren't, to be able to see the patterns that emerge. You said there's this evidentiary aspect to it. But the other thing I look at, at a meta level, is that this also established a pattern for how action can happen. This is an issue that primarily impacts women, as documented in the Geek Feminism Wiki, but there are other issues that affect other communities, not just women, where the same playbook can work, right? You can extrapolate what are the ways we can be effective in helping each other.

LH: Yeah, absolutely. I think the networks and connections that people built through things like the Geek Feminism Wiki, through the AdaCamp conferences that the Ada Initiative used to put on. Those were absolutely the scaffolding that led to us being able to spring into action after the election. And obviously, that was a bunch of different groups. It wasn't just folks that I knew from those communities. As we were figuring out — I say we, as people in my social circles, in the tech activist circles — were figuring out what to do in the days after the election, there were those relationships. There was that infrastructure and there was that set of experiences of how to organize effectively online quickly.

AD: That gets us to this milestone moment. There was a galvanizing moment with the U.S. Presidential election in 2016. There is a sense of urgency around issues that affect so many communities: that affect immigrants, that affect the Muslim community and other religious communities. And, I think every one of the communities affected felt like, "What can we do?" But, in tech in particular, there was a sense of urgency around our potential culpability, as people who make software, make technology, about enabling violations of people's civil rights or their human rights. Can you talk to me about the context that people were feeling in the beginning of 2017, about what was the potential risk that tech workers had of being part of something they didn't believe in?

LH: I think in the wake of the election, a lot of tech workers felt blindsided by, all of a sudden, their Facebook newsfeed algorithm or Twitter's complicated policy decisions around banning or not banning Nazis, except in Germany and France, where they're obligated to. These technical decisions had world-historical impacts in a way that people were just shocked by, even though they were the ones making those decisions. We've had sociologists and theorists, like Jonathan Zittrain and danah boyd, who've been warning us about this stuff for years, but the election was really a wake-up call to people that hadn't been heeding those calls from the sociologists, to say, "Oh yeah, this work actually has an impact and we need to be, maybe, responsible for that impact that we're having."

AD: There's the rising movement that's been happening for a couple of years about tech responsibility, about the unintended consequences of a lot of these technologies being created. And then, probably a little more historical awareness of...I think of the IBM example that I heard a lot about. Maybe you could explain what people were thinking about, particularly in the early 2017 context of where IBM had found themselves 70 years earlier.

LH: One of the things that we cite in the Never Again pledge, the pledge that we ended up writing, was that we don't want to be building the machines of deportation and genocide. Citing the example of the punch card machines that the Nazis used in the Second World War to catalogue the people that they were killing. As we think about today what the equivalents are of those, it's Salesforce deploying software that ICE is using in various parts of their organization to assist in the engine of deportation. There's a direct line from the work that a random coder in San Francisco is doing to those kids that are in the baby jails that have been reported on over the summer. People can't just say, "Oh, well, software can be used for whatever." No, you're actually running the infrastructure that this work is being done on. There's a level of culpability and responsibility that people need to be willing to take there.

AD: So, there's a lot of context about taking culpability for the technology we create that is already starting to bubble up. In December of 2016, after the election, there's this galvanizing moment where a lot of independent workers all across these different companies feel like, "We don't want to be responsible for creating a Muslim registry. We don't want to be participating in creating this tech." There is the genesis of the Never Again tech pledge. Can you tell us what the pledge was and how it happened, how it came together?

LH: There was all this energy after the election among tech workers who were horrified. They maybe hadn't been as involved as they had intended to be; they hadn't done the phone calling or door knocking or schlepped over to Nevada. And all of a sudden, we were faced with this incoming president who did not share the values of the vast majority of people in Silicon Valley. There's often, like, this temptation among techies toward solutionism, of like, "Well, we should build an app and that will solve things."

Tech Solidarity was an impromptu organizing group that Maciej Ceglowski and Heather Gold ran in the early days, right after the election. Maciej and Heather called on us to, number one, put our money where our values were and donate to these groups that were doing important work with immigrant defense, with defending Muslim Americans, with working on solutions around homelessness, all of these different social issues that were about to get a lot more dire with the new administration. And also, to engage locally in ways that didn't consist of offering to build an app for whatever nonprofit. That nonprofit doesn't need an app, they need somebody to come in and fix the printer. They need somebody to come in and set up two-factor authentication so they don't get hacked by neo-Nazis, right? The material support that local community groups needed.

AD: So, it's almost like civics classes for coders.

LH: Yes, I think that's a really good way to put it. Yeah. You're all fired up. You're all energized about this work. Here's how to direct that energy in a way that's not going to get in the way of these very competent folks who are already doing their job and just need more resources to do their job more. But, there's also this aspect of...we're seeing this level of racism, Islamophobia, anti-immigrant rhetoric during the campaign, and folks were just so on edge about what they could expect in the coming years. Really, with the Muslim ban landing right after the inauguration, I feel like we were really validated in freaking out a bit. That freaking out led me to think about how can we as techies, the folks making the inner tubes work, making the lights stay on...how can we make it clear that we're not going to be complicit in harming vulnerable people?

Well, that's where a pledge comes in. The difference between a pledge and a petition, I think, is a really important one to think about in this case. A pledge is someone saying, "These are my values. I'm going to act in accordance with my values." A petition is saying, "Dear Sir, please do this thing." We all know that's not going to work with this current administration. We need to just say, "This is how I'm going to be the squeaky wheel. This is how I'm going to say this is not going to happen. I refuse to do this work that I find ethically compromised. I am going to quit. I'm going to agitate as much as I can within the company." And I think, Liz Fong-Jones is a great example of just how much one individual can agitate within a company to engender change and to hold a company accountable to stated values.

[audio breaks to an interview with Liz Fong-Jones]

Liz Fong-Jones: I'm a trans woman of color. I am working at a large tech company, and I am also the daughter of a first-generation immigrant mother and a fifth-generation Chinese American father.

[audio breaks back to Leigh]

LH: Liz has been an employee organizer at Google for many years. I've learned so much from Liz about how to get companies to change their policy from within. She gave a wonderful talk about it as a live stream a couple of years ago.

Liz Fong-Jones: Ever since the election, but especially right now, I am really terrified. I am terrified for everyone who's Muslim. I'm terrified for everyone who is an immigrant of any kind, LGBT+ people. I am terrified for women. I am terrified for people with disabilities. The tech community is at a crossroads. We can choose to do things that are seemingly apolitical but are choosing to side with people who are oppressing others, or we can take a stand and decide how we want to collectively fight back and make sure that we are on the right side of history.

[audio breaks back to Anil and Leigh's interview]

LH: Making change from within versus making change from outside.

AD: People like Liz and yourself have been active as individuals and galvanizing others to act, but with the Never Again tech pledge, there's this breakthrough moment where it's a really broad spectrum of people, and not just an individual person who is willing to be the squeaky wheel, but I think an unprecedented show of people at all levels of organizations really speaking up. Can you talk a little bit about who are some of the people that signed this pledge? What were some of the companies or organizations they were with?

LH: Gosh, yeah. There's so many. There were about 60 people who signed it straight out of the gate, and one of the things we were worried about was "are we going to experience harassment? Are we going to have to deal with bullshit because of signing onto this pledge?" It turned out that 60 people was about the right level to have that safety in numbers. We had folks from...there were many, many Googlers, a couple dozen, I think. There were folks from all different parts of the industry, all of the big companies. Really, all levels: engineers, designers, marketing people, all the way up to executives. There's varying degrees of impact, right? There's no chance that like ye small design firm that does websites for co-ops is going to get asked to build a Muslim registry, but I think there was that aspect of solidarity that was really powerful in just saying like, "We're in this together."

That sort of idea of like, "Oh, if I don't build it, someone else will." "Hey, there's actually like a lot of us who are not going to build it," and that was really powerful.

AD: It feels like something that is akin to what labor movements have had in other mature industries. If you go back, auto workers organizing or you look at the union busting of the air traffic controllers in the '80s, there were these sort of galvanizing moments, whether positive or negative, or whether successful or not, in labor movements in other industries, but they were very sort of manufacturing-based, they were classic industries. In many cases, they'd been organized for decades.

This was not a unionized effort, this was not the sort of classic, labor organizing model as we recognize it. This was something people participated in by making a change to a document on GitHub, the coding site, right?

LH: Yeah, absolutely. I mean, I think if we think about the classic ways that labor organizing has been done, I think there's such a great example, recently, with the Marriott strike, where this is a bunch of unionized workers who are demonstrating solidarity, who are making their presence felt.

This is like the nerd version of that, which obviously, was physically a lot easier than sitting out in the rain for many, many weeks, but to be able to say like, "This is our picket line. This is our bright line in the sand." Saying, "We are not going to do this."

Getting to know the organizations in the space that are doing more of that sort of labor-flavored organizing, like Coworker.org and the Tech Workers Coalition (https://techworkerscoalition.org/), both of which have done really wonderful work changing how companies interact with their employees.

Using the tools of what's called protected concerted activity, which is the idea that you can advocate for changes in labor conditions without being part of a union, and you're actually protected against retaliation, as long as you can convince one other person that this is a thing worth advocating for.

As long as two or more employees at a tech company say, "Hey, we should like change our benefits," or "Make sure that we have trans-inclusive healthcare," or whatever the working conditions that you want to advocate for, as long as it's two or more people, you actually can't be fired for that.

I'm not a labor lawyer. People shouldn't take legal advice from me, but that is my understanding of how protected concerted activity works, and Coworker actually has a really good FAQ about it.

AD: This moment happens, and yes, there's decades of history leading up to it and all these other influences that helped shape it, but you did galvanize it, you personally. What was your feeling for you, as somebody that's worked in this industry for many years?

As you said, you've still got your day job. You're still working on, "Let me make sure the security works on these platforms I'm responsible for," like you're still doing that work, but this principle, this idea that you're advancing, catches fire and has this sort of milestone moment happen. How did you feel, personally?

LH: I mean, personally, I was freaked out. I'm just going to be totally honest. I was on an H-1B. The H-1B is the skilled worker visa that a lot of techies come to the U.S. on. If you have a job on an H-1B and you get fired from that job, you basically like just have to leave the country.

There's like a small grace period where you might be able to like find a new job and transfer your visa over, but it's like super, super dicey. Your presence in the country is contingent on your continued employment.

There was always this sort of back of my mind like, "Is this the time I've gone too far? Is this the time I've agitated too much, and now I'm going to get sent back to Canada?" Which, in the grand scheme of things, as far as like having the privilege of being an immigrant from Canada, like that's not actually the worst place in the world to be sent back to, but I definitely...I'm very much like the stay and fight type.

This was my way of saying like, "I'm going to stand by Muslim Americans and the other people who I know and care about who are threatened by this new administration." It felt very worth it, even if it was like stressful as heck.

The thing that really made it feel worth it was at one of the Tech Solidarity meetings after the pledge went public. I introduced myself to a representative of one of the local Muslim groups, and she just gave me a big hug.

AD: That's got to be a moment. I mean, you sort of, you realize this is having this sort of real world impact on people's lives. They're feeling it so viscerally as to want to hug you, but then there are all these other issues, right?

As you were saying earlier, just a couple of months later, Susan Fowler puts out her memo at Uber and then, of course, we get to much more recently, Google, reckoning with a lot of different choices they're making internally.

Most notably, the galvanizing event behind their recent walkout was the understanding that settlements, in some cases in the tens of millions of dollars, were being paid to abusers and harassers in their company, which has helped make this environment extremely hostile towards women in many parts of the company.

There's, of course, pushback on this. But in the past, when similar stories had bubbled up at Google, it was sort of internal grumbling about...what...on a message board or an email list internally, right?

LH: Yeah, absolutely. Google has traditionally sort of worked a lot of this stuff out internally, and I think the walk-out was symptomatic of fundamentally, like a breakdown in moral and ethical leadership within the organization.

I think it's important to sort of connect the dots between the ethics of work like Project Dragonfly, Google's effort to go back into China, or, I think it was JEDI, the military contract that they decided not to take after much employee organizing, and this kind of sexual misconduct and abuse of underrepresented employees, of women employees.

These are all failures of organizational leadership, and there's just been this increase in the space, in the willingness, to talk about the impact of these decisions, which are fundamentally about values: you're making a decision about your values as an organization when you say, "I'm going to pay out this abuser," versus, "I'm going to just fire them, and actually make redress in some way to the people that they've harmed."

We have this tech worker organizing that comes around, this political change of scene. Then we also have the Me Too Movement over the past year that has really just changed the balance of power, and changed how people think about sexual harassment in the workplace.

It's wild to me to think that this is more than 20 years after Anita Hill, but that it's taken that long for people to be like, "Oh, yeah. This is a systemic problem. It's not just individual bad actors. It's 'how do we create these systems of power and accountability in organizations?'"

AD: Well, Leigh Honeywell, you've had a very inspiring influence, I think, on the entire industry, talking both at the large scale, about some of the most important political issues going on in the world, and at the individual scale, of individual people who code, who create software, who create technology, reflecting, as I certainly have and many people have, and saying, "I could be doing better. I've seen these stories now, and now I can no longer pretend to not know or try to ignore it."

I think you've inspired a lot of people to take some huge steps and really help to drive a lot of change in technology, so thank you for joining us on Function today.

LH: Thank you very much for inviting me to be on.

AD: We'll have more with Mark Luckie after the break.


"Black employees are suffering in silence, or they're trying to make inroads in ways that they're being blocked." — Mark Luckie


Anil Dash: Welcome back to Function. I'm Anil Dash. Now, why are we talking to Mark Luckie? A few weeks ago, Mark stepped down from his position at Facebook as, and listen to this title, "Strategic Partner Manager for Global Influencers Focused on Underrepresented Voices for Facebook."

Now, that's a long title, but as it indicates, his job was to give voice to underrepresented people at Facebook. Yet, when he left Facebook, he had a memo that was titled "Facebook is failing its black employees and its black users."

In an internal memo that he later made public, Mark discussed what he called, "Facebook's black people problem." What Mark shows us is that tech activism isn't just giant walk-outs with 20,000 people. A lot of times, it's individual voices taking a brave stand and telling the truth about their experience.

And the thing is, this isn't the first time Mark's spoken up. He had similar criticisms and complaints when he left Twitter a few years earlier.

Mark, thanks for joining us at Function.

Mark Luckie: Thanks for having me.

AD: You've been at three of the companies that have some of the largest communities in the world, that probably have an enormous, outsized cultural effect. I mean, between Twitter, Reddit, and Facebook, that is shaping, certainly for the English-speaking world, but for a lot of the world, what we see in media, what we see in culture.

ML: Absolutely. Especially when you consider that Facebook also owns Instagram, WhatsApp and Oculus, which are even bigger than I think most people imagine when they think about Facebook.

AD: This reach is extraordinary and you've gotten the chance to spend time at each of these places. Then there are moments when your experience of being on these teams isn't what you expected. Perhaps, most prominently with your departure from Facebook recently.

ML: Yes. I come into these companies because they want change. They want to improve their relationships with these users from diverse backgrounds or to think more broadly about how they can connect with media organizations, as was the case with Twitter and Reddit, but what I found is that they haven't thought through, "How do you change your existing hierarchies to make room for these kind of changes?"

That's ultimately why I have departed the companies, is because it becomes clear that it's going to take a little bit more work in order to be able to execute some of the things that they want to do. Often, they are resistant to change because they don't see it as a benefit to the company, although there are many, many benefits to thinking this way.

AD: You recently decided to leave Facebook, and then you wrote a memo on the way out. Initially internally, but now it's been shared widely, publicly. I want to hear how you characterize this memo and what it's about.

ML: Yeah, so I first posted the memo internally at Facebook, really calling out some of the things that black employees had been talking about privately and were too...I don't want to say scared. They were too reticent to share, because they didn't want to harm their professional relationships. They didn't want to speak up because there is fear of retribution.

Of course, people have private conversations with their managers and there were pockets of change, but certainly it wasn't on a wide scale. I knew in exiting and not going directly to another position that I had this point of privilege where I could share what was happening, especially being at the center of a lot of these efforts at the company.

I decided to post it more widely, recognizing that, A, Facebook needs to hold itself accountable, and it often won't do that until this discussion becomes public. Then B, that this is bigger than Facebook. That there are companies, in tech and even beyond tech, who are going through some of the same things. Black employees are suffering in silence, or they're trying to make inroads in ways that they're being blocked.

After the post was up, I've gotten messages from people across many, many sectors, and it's just been both disheartening and encouraging to know that people are galvanizing and that they're going through some of these same tough things.

AD: There's another criticism that will come up, and not just for you. I look at other people who have left some of the major tech companies and written memos. One of the most prominent that comes to mind is somebody like Susan Fowler, who was at Uber. In your case, this isn't the first time you've written a note like this.

ML: It isn't. I had hoped that previous notes would have been the last, and then I get into another situation, I'm like, "Oh, actually this is worse than the previous memo."

I'd say, with the memo about Twitter, I can't say it was the sole catalyst for change, but the back-channel conversation, again, was like, "No, this is Twitter now holding itself accountable. We're now going to explore more about Black Twitter and about our users there. We're going to give more resources to the Blackbirds," which is the black employee resource group, and you've seen a tremendous change.

Now since I've left, I'm like looking back like, "Dang, y'all couldn't have done this while I was here?" I would hope that this would happen at Facebook as well. Facebook is much more resistant to change and is very invested in its external reputation, and so to change would be to admit that there is a problem at the company. It is not in the company's best interest to admit that.

AD: You think these companies have very different cultures. I mean, that's what I hear too. Where, in contrast to something like Google, where there's a lot more internal dissent, Facebook is much more...everybody is sort of facing the same way and has a shared mindset.

ML: Yeah. I think that was one of the scary things about working at Facebook, is that people thought so similarly and there was little room for dissent. I mean, the patriotism of the company and the loyalty, the unwavering loyalty to Mark Zuckerberg and Sheryl Sandberg, is just...

I often felt like I was in a propaganda state, where I'm looking around like, "You guys aren't actually going to question any of this? It's..." One of the main examples of that is when Mark was putting together his circle, where it was all these VPs and he was naming them, and, of course, there were no people of color and only one woman there.

People called him out and said, "Hey, what's up with diversity?" At the very end of his post announcing all these changes it says, "We're thinking about diversity." In the comments, people are like, "Thank you Mark. We appreciate you thinking about diversity." A lot of us are looking around like, "But is he actually going to do anything?" But we didn't say anything because that would have been sacrilegious.

AD: It's interesting because I think it dispels a little bit of the myth that there's this sort of monoculture, right? It seems like these companies are very different and they all have their challenges, but they're very different kinds of challenges.

ML: Most definitely. With Reddit, I found, and a lot of people who were over 30 found, that the company was geared towards people under 30, and a lot of its staff was in technical roles, so there was a lot of boisterous, boys' club kind of behavior going on.

At Twitter, certainly you see a different age bracket, but like many tech companies, mostly white and Asian. For people who come from minority communities, it's all about finding your tribe, so you can exist within these spaces, and certainly within Facebook, that was the case.

Facebook's employee resource group for its black employees is called Black@. Black with the @ symbol. It was a vibrant community. People who were building partnerships with external communities, people who are building programs internally. Like really, really great things...but it was within this bubble. It was the black community building things for the black community with some outliers contributing.

AD: In the specific case of a Facebook, were they...obviously, they're Facebook, but they're also Instagram, they're also WhatsApp. You talk about, from the perspective of a black user, what are the ways that Facebook could have their back more, where you see there's a gap in what the products themselves do?

ML: There's two big gaps in terms of what Facebook and Instagram are providing for black users. One is having their content taken down because they are creating safe spaces, they are talking amongst themselves and saying, "Hey, this is just for black people only."

Then that gets reported as racist content, it gets taken down without any notification. Often, no ability to say, "No, what you're doing is wrong," because there is a lack of cultural competency inside of Facebook that Facebook recognizes, but they sort of brush it off and say, "Okay, we know that's a problem, but we're relying on AI, we're relying on algorithms, so that's not something we can handle right now."

The other part of that is when you see, I think the biggest example for me is the Instagram Explore page, which if you go there, what pops up on my tab is the Art page, the Art Explore page, the Health and Beauty page. I follow a lot of black artists, I follow a lot of health influencers who are mostly black.

According to Facebook, what appears on that Explore page is based on the people you follow, but none of the people that I see on that page are reflective of the people that I follow, and so it becomes, "Is it reflective of the people you follow and you just think it's okay? Or is there a deeper problem here?"

In my world, in particular, what I found is that the black influencers, the people of color from underrepresented backgrounds, wanted to engage with Facebook. They wanted to launch these products and to be on new platforms like IGTV or Watch, but they found themselves being excluded from this, just because of a lack of recognition. There are some really big names that I could say who are just getting short shrift because people don't know who they are. If you mentioned them to a black person or a Latino person, they would know exactly who you were talking about, but Facebook's mostly white staff would not.

AD: What do you think the range of response is within like the Black@ Facebook community to something like your memo? I'm sure there's a wide range of feelings and responses out there.

ML: Oh, sure.

AD: What do you think it is?

ML: When I posted the memo, before I posted the memo, I was thinking to myself, "I'm just going to be out here by myself. I'm going to be out here on a limb. I know that I'm saying something that is incredibly controversial, and I don't expect to be backed up on this."

The opposite was true: when I posted this internally, there were lots of black employees sharing their stories, saying, "Yes, this is happening to me. This is happening on my team. I've experienced this kind of encounter," and so that was incredibly encouraging.

Now, when the post goes public, it becomes a different thing, and so you're more likely to stick to your internal tribe or you're going to be more protective of your role, and so that same sort of fervor is not there. I certainly don't blame them. If I saw someone writing a post, I'd be kind of mum, I'd say, amongst my colleagues for sure. But it does open up this greater conversation, where now it's not just Facebook employees who are speaking up, it's employees across multiple sectors, and you'll see that in the conversation that's happening on Twitter, in the media, on Facebook, of people saying yes. It sort of mirrors the #MeToo movement, where people are speaking up because now they have a catalyst to do so.

AD: What do you think is the wish list? What do you think are the list of demands? If you had to point out for yourself or you look at the broader context of so many tech companies are having these kinds of moments right now, what do you think should be prioritized as the changes that need to be made?

ML: There are people at tech companies who aren't white, Asian, and male, and their perspective is different. You can't say that discrimination doesn't exist at the company if you've never been discriminated against. That's the great part about allies, who say, "This may not be something that I'm encountering personally, but it is actually happening." Then all the things that come off of that, sexual harassment, lack of diversity, problems in hiring, all these things spawn because there is a homogenous aspect to the tech companies.

If you think everything's great, then you're not likely to change, and so it takes people stepping up and saying, "Hey, wake up, this is happening around us, and this isn't a figment of our imagination. This is causing real world problems. You're affecting employee morale. You're affecting our user growth. You're affecting our public perception because you don't look outside of yourself or outside of your group to see that there are possible changes that could be made."

AD: Does it affect the products themselves?

ML: Oh, it absolutely does. One of the things that I describe in my post is that black users are the most engaged by far across multiple metrics, but they feel that Facebook doesn't have their back or, in many cases, is stifling their ability for conversation. If your most engaged user group is having issues on the platform, that can only lead to decline and lower morale and less engagement with the platform. When you have less engagement with the platform, that means lower advertising dollars. That means that you are ultimately hurting your business. It just isn't seen as a problem.

I used to joke. I said, "Nobody says, 'I don't want to make more money.'" If you're a business, you want to have as much revenue coming in as possible. It's really smart to think of, okay, this is our most engaged user base. What are we doing here? If women aren't feeling comfortable in the Uber culture, then they're not feeling comfortable in the actual Ubers. Think about how this is hurting your business and not just, we're cool, we're good, everything's going all right. It could be better, and maybe you get a better paycheck out of it, I don't know.

AD: It's a really unique perspective. It's one that I appreciate hearing from. I think it's thoughtful and provocative for all of us to think about what our role is in speaking up and also how we go from one person, one voice, to galvanizing action either by saying the things that others don't say or by looking for others to organize and collaborate with.

Mark Luckie, thank you for joining us on Function and telling your story.

ML: Thank you for having me.


"I think that moving forward, we need a more clear constitution; something that we can all be held to and something we all understand when we sign up. Right now they've not been enforcing anything for years, and so everyone is wondering where they stand." — Matt Rivitz


AD: Now let's turn to Matt Rivitz. Matt caught my eye a couple years ago with Sleeping Giants. It's a community that he spearheaded with some help from collaborators and a rapidly growing group of people who believed in what he was doing. Basically, they were taking direct action, calling on advertisers and saying, "You shouldn't support the kind of hateful messages that we feel we see on Breitbart." Breitbart is a notorious media site that Steve Bannon once said was the home of the alt-right. As you might expect with a site like that, a lot of people object to some of the messages they publish.

What's unique is Sleeping Giants decided that targeting the advertisers on the site would be an effective way to protest the messages being shared. This is different. We talked before to activists within companies organizing within the giant tech firms, but this is happening from consumers. This is people saying we need to hold media companies and tech companies accountable, and maybe we can organize on the outside in order to have that kind of impact.

Matt was able to use the Sleeping Giants community to galvanize over four thousand advertisers to drop their advertising and sponsorship of the Breitbart website. Matt and I talked about Sleeping Giants' work, both in encouraging advertisers to drop their sponsorships of Breitbart and in pushing for accountability for some of the hate speech and other activity on social networks like Twitter and Facebook.

AD: Matt, welcome.

Matt Rivitz: Thanks for having me, man.

AD: Sleeping Giants has hundreds of thousands of followers across all the different social platforms at this point, and a lot of times what you're asking those folks to do is take some small action each day. What are the things and the messages that you send them?

MR: It depends on the day. It started with a very simple set of instructions: asking people to go to Breitbart News, take a screenshot of an ad next to a particularly offensive article, there are lots of them, so that wasn't a problem, and then tweet it to the company at their corporate handle on Twitter. Again, a very simple set of instructions in our pinned tweet at the top of our page. That has been ongoing. That's been a two-year process now, and it's been immensely successful.

Breitbart is obviously still going, and they have every right to do that, but what was happening was advertisers really had no idea that they were ending up on Breitbart. Just by letting them know, we were able to show them where they were showing up. The vast majority of them have decided not to continue to advertise on there. That's how it started. Any day there will be something that shows up because of a lack of enforcement of terms of service on a social platform, or someone will do something horrendously wrong repeatedly, and we ask advertisers to consider how they're spending their money.

Again, it's a compounded effect, because we have a lot of people, the more people we get, the bigger voice we have. We're able to let them know. I don't think that a lot of us, me included, I don't think that any of us would have a voice loud enough for a lot of these companies in tech, or companies in general, to consider what we're saying, but if we have hundreds of thousands of people, they definitely take it under consideration a lot faster than they would if it were just a single person.

AD: This has been very effective, right? Thousands of advertisers have responded, saying, "Okay, we don't want to have our ads appearing next to content like this"?

MR: Officially, we've gotten over four thousand advertisers, four thousand sixty, I think, is the number at this point, to remove themselves, but the number is much greater than that. We know of a lot of large companies that didn't make a public pronouncement about it; they just decided to no longer advertise there. We know about them, but they've asked us not to announce it publicly.

AD: Is that what you expected? Do you think that was how it was going to play out two years ago when you started working?

MR: I did not know what to expect. I had no idea what I was getting into, and I'm not a hundred percent sure that I would have done it had I known, because it's completely changed my life in a lot of ways for better and worse. I'm on Twitter a lot, and I really wish that I didn't have to be on Twitter as much as I am, because it's a hellhole of a platform. Unfortunately, it's the most effective way to do what we're doing.

Yeah, I had no idea. It is gratifying to see in a number of ways that a lot of companies are making a smart decision, in my mind, and that they're making a moral decision. Again, I think when this started, companies really did not get involved with any of this stuff. Their MO was always to just stay as far away from anything political as possible, and I totally understand it.

I'm in advertising, so the goal is never to offend anyone, but right now, because of a number of reasons, they're in the middle of things. We don't bother them every day, but we ask them to make a choice not to support things that are bigoted or sexist or xenophobic. The tide has turned a bit, and that is incredibly gratifying, to know that they're making those choices.

AD: Was it your background in advertising that led you to build this campaign around targeting advertisers as the sort of leverage?

MR: I think I knew just enough. I didn't know what programmatic advertising was, which is a system of internet advertising that ends up placing advertisers on these sites without them knowing it. I didn't know anything about that. I am from a storytelling background. I write TV commercials, and I always have. That's where I started.

I knew just enough that the news sites, any site, is supported by advertising, and so it felt really fishy to me that the first advertiser that I saw when I went to Breitbart's...I didn't know about it pre-election. I was just starting to hear about it. I had no idea what a massive, what a juggernaut they were, but I was pretty curious as to who they were, what they were printing. I happened to go on the site, and the first ad I saw was for sort of a progressive loan company from San Francisco.

Just knowing that when I was coming up, you would buy an ad. You would know exactly where it was going to run, what show it was going to run on, at what time. When I saw that ad on Breitbart, I just felt like there's no possible way they could know they were next to an article that says there's no hiring bias against women in tech, they just suck at interviews. That seemed really off base. That's just what got me going and what got me curious about how they ended up there and why they were advertising on there.

In the beginning, I thought maybe I'd contact four advertisers, and they would say, "Oh, yeah, we don't support that," and they would go away, and that would be the end of that, but I had no idea that there could be thousands of advertisers that show up there, and it's still happening. It's pretty wild, and it's pretty irresponsible. I think we're just calling it out.

AD: Do you find common cause between what you're doing and the work you're trying to do in speaking up and those who are organizing within tech companies like Google or across the industry around things like the Muslim registry, trying to fight creating that, do you think those are people who share a common set of values or goals with you?

MR: Absolutely. I think anyone that wants to make some kind of difference, and they're in a unique position to do it, but they're also in a much riskier position to do it. I'm a freelancer. This stuff hasn't really affected my work. I continue to get work, and it's all good. They're risking their jobs to do this, and it's incredibly valuable.

I think if anyone sees something that they feel like is wrong within a company or if they're coming in as a consumer or as someone that uses a platform, I think it's their responsibility to speak up. I don't think that it's easy to do, and hats off to anyone that does it, but it's necessary right now. These companies control a large part of our life and how we buy things and how we communicate and how we share information. They own all of that now, and that's dangerous. I think they have a lot of power.

You see it with Facebook. They have not been super responsible with how they've used all of our information, and people are starting to fight back. It doesn't feel fair. I think, yes, the fact that there's a Google walkout, I think that they're speaking their mind, and they feel like they're not being heard and listened to, and these companies are really big, but ultimately they're built of people, and the people need to make themselves heard.

AD: Any of these big platforms, whether we're talking about Facebook, Twitter, YouTube, any of these sites, they have a terms of service that says this is what's acceptable and this is what's not. It's sort of the constitution of what's allowed there. Yet, it seems to me like that's one of your biggest areas of focus with Sleeping Giants, is that maybe those terms of service either aren't properly enforced or aren't up to date covering the kinds of threats or misbehavior that they should be. Is that right?

MR: Yeah. I mean, it's changed a lot in the last two years. When I started this, there were still Nazis all over Twitter denigrating people left and right, and there was harassment and there was doxing. It has gotten better, but ultimately the terms of service are supposed to mean something, and all the arguments that are being had right now, about people feeling silenced or feeling like the algorithm is changing to silence them, I think it would all get taken care of if these companies would evenly enforce their terms of service.

That goes for Breitbart right now. If you look at the terms of service for Google and Facebook, which are serving ads to Breitbart, some of that content explicitly violates their terms of service, and yet they've done nothing to enforce it at all on this website. Just some simple enforcement of clear policies about what happens and how many strikes we all get, I think that's really important moving forward. I think it would take a lot of the guesswork out of all this.

Also, all the trust issues that all these platforms are having, if they just simply enforced their terms of service, those wouldn't be there. They could tell everyone exactly what they did wrong. They could tell everyone why, and they could tell everyone how many strikes they had, and if they hit three strikes or ten strikes, whatever they get, then they would get banned from the platform, and they would know exactly why.

Right now it's a little willy-nilly, and so you have us saying, "You've got to enforce it. There are clear rules right here. They say it. It's in print on your website, and you're not doing it." Then you have other people saying, "Wait, how come I got chucked from this platform? I don't know what I did wrong." I think that moving forward, we need a more clear, again, constitution, something that we can all be held to and something we all understand when we sign up. Right now they've not been enforcing anything for years, and so everyone is wondering where they stand. That's hopefully where this is all going.

AD: Well, Matt, the work that you and the Sleeping Giants community have been doing has undoubtedly had an extraordinary impact in the last two years. Thank you for joining us here on Function.

MR: Really good to be here. Thank you so much for having me.


AD: Of course, something as momentous as the Google walkout is going to get press attention around the world. Twenty thousand employees walking out of one of the most well-respected employers in the world is dramatic. A lot of people might say, "Well, don't tech employees have it pretty good? I mean, I know I do. I work at a tech company, and we get free snacks and free drinks and stuff." But all the free snacks in the world won't make up for it if your working conditions are terrible, if you are being harassed or mistreated at your workplace because of your race or your gender or any other part of your identity. They especially won't make up for it if you feel like the product you're putting out in the world is having a negative impact, if it's making the world worse.

What we see is that this Google walkout is not a moment but part of a larger movement. It's an effort that's been going on for years, maybe decades, little things that start small, like what Leigh Honeywell was doing with her collaborators on the Geek Feminism Wiki ten years ago; individuals speaking up, as Mark Luckie did, not just at Facebook recently but at Twitter before that; or even users who band together, as Sleeping Giants has brought a community together, with Matt Rivitz leading, to have people say, "We want to hold these platforms accountable for the impact that they have on the world." All these things are tied together, and what they seem to represent is tech growing up. We realize that the apps we use and the websites we visit have a real impact on people's actual lives.

That's it for this episode of Function. Next week we are talking about one of the most important technologies in democracy, the voting machines.

Function is produced by Bridget Armstrong. Our associate producer is Maurice Cherry. Nishat Kurwa is the executive producer of audio for the Vox Media Podcast Network. Our theme music was composed by Brandon McFarland, and big thanks to the team at Glitch.

You can follow me on Twitter at @anildash, and you can find the show and all the show notes at glitch.com/function. Please remember to subscribe to the show wherever you listen, and we'll be back next week with a new episode of Function.