Transcript
TERRY GROSS, HOST:
This is FRESH AIR. I'm Terry Gross. Internal Facebook documents were leaked by a whistleblower and acquired by my guest Jeff Horwitz, a technology reporter for The Wall Street Journal. He's the lead reporter for The Journal's new series of articles called "The Facebook Files." This series details how Facebook executives are aware of the ways the platform causes harm, but executives often lack the will or the ability to address them. The series reveals how a separate set of rules has been applied to VIP users like celebrities and politicians, allowing them to at least briefly escape restrictions and penalties that are applied to other users.
Facebook's own researchers are aware that Instagram, which is owned by Facebook, has negative effects on the self-image and mental health of many teenage girls. Internal documents also reveal that Facebook researchers have warned the company's executives that the platform is used in developing countries for human trafficking, drug-dealing and to promote ethnic violence.
The company's CEO, Mark Zuckerberg, has made it a goal to promote the COVID-19 vaccine, but his researchers have pointed out that that effort is being undermined by commenters spreading misinformation. At least some of the leaked internal documents have been turned over by the whistleblower to the Securities and Exchange Commission and to Congress.
Jeff Horwitz, welcome to FRESH AIR. Congratulations on the series, which isn't over yet (laughter). You're still - there's more to come. So what are these internal documents that were leaked?
JEFF HORWITZ: So this is a collection of internal research notes, executive presentations, in some cases company audits of its own practices that provide a pretty clear sense of how Facebook sees itself and the company's awareness of its own problems. And I think that's something that sort of separates it from a lot of other really good reporting on the company, which is that instead of this being outside voices asking questions about whether or not Facebook is being detrimental to the world, this is Facebook asking those questions and answering them and sometimes finding that the answer is very much yes.
GROSS: And what you're talking about is researchers from Facebook who report to executives and tell them what's going on. And often what they've told them is that this platform is backfiring. It's causing harm for these and these reasons.
HORWITZ: Yeah, exactly. I think it's important to draw a distinction between sort of irate watercooler chat and people letting off steam about things that don't really involve them at the company versus this stuff, which is these are the people that Facebook has hired to inform it of reality and to help it address problems. And in many cases, they are finding some really unpleasant things and then running into obstacles in trying to fix them.
GROSS: Now, are the obstacles a lack of will? Or are the obstacles that Facebook is so big and there are so many users that it is hard to control, even if you want to?
HORWITZ: I think that the premise that the company is just too big to regulate itself isn't correct. Yes, having nearly 3 billion users is quite a lot of users to have to be in charge of. But what our reporting seems to indicate is that the company's complexity has become a big problem, as well as just kind of a lack of will and lack of interest in some instances. So it's not that a platform couldn't be made to work for this many users in a sort of simpler and safer way. It's that you can't have all the bells and whistles, and you can't maximize engagement in the way that Facebook would like to, and not have that come at a cost.
GROSS: Let's look at the first program you reported on, which is a VIP program called XCheck. This is a program that basically created separate rules for VIPs and for everybody else who uses Facebook. What VIPs have been exempt from certain rules? What kinds of people?
HORWITZ: Oh, a lot of them. So Facebook has talked in the past about providing a little bit of extra leeway for politicians when it comes to fact-checking and misinformation - right? - the idea being that, you know, in an election, candidates should have the right to say whatever they want to say even if those things aren't strictly true. And the thing we found is that the protections Facebook offers to powerful users go far, far beyond that.
So they include celebrities. They include journalists. I have no doubt that you should qualify. I most certainly should qualify. They include athletes and just sort of people who are famous for being famous, influencers. They include animal influencers. So you know, just, like, literally, the account Doug the Pug is actually covered by XCheck, which was the program.
So basically, the commonality among all these people and entities and animals is that they are big enough and prominent enough that they could cause problems for the platform. The way that this program was designed, very explicitly internally, was to avoid, quote-unquote, "PR fires." And I think that's something that kind of sticks out in general in this reporting: the thing that makes Facebook scared, more so than harm that it might be causing, is the risk of public embarrassment.
GROSS: What kind of public embarrassment? What kind of PR fire?
HORWITZ: So this can be everything from making a mistake and tangling with, you know, the singer Rihanna's account because she posted a risque French magazine cover to, you know, making an error on something Donald Trump said to, you know, anything that basically would result in the company receiving widespread public criticism. And I think this is something that is kind of - exists throughout the series, is that Facebook really likes to stay in the background. They really would like to be kind of viewed as this neutral platform in which just kind of life plays out online. And as you know, what our reporting tends to show is that that is not the case. The company is actively making a lot of choices, is determining which interests benefit and at what expense. And I think XCheck is kind of a perfect example of that, which is that the whole idea is to never publicly tangle with anyone who is influential enough to do you harm.
GROSS: Can you give us an example of a post that caused harm or could potentially cause harm that was allowed to stay up for a long time or a brief time because this person was a VIP?
HORWITZ: Sure. And there are a lot of them. Facebook's own analysis of XCheck found that 16.4 billion views of violating content occurred solely because of the lag time in taking down stuff from VIPs that shouldn't have been up in the first place. But I think the example I would give for how this program can cause harm, and does run against Facebook's basic ethos of fairness, is the Brazilian soccer player Neymar, who, in 2019, was accused by a Brazilian woman of rape. And to defend himself, he took to Instagram and to Facebook in a live video. And he showed pictures of his WhatsApp chats with this woman, his messages with this woman. And those messages included not just her name, but also nude photos of her that she had shared with him.
And this is just a complete no-go on Facebook. You are not allowed to broadcast people's naked pictures without their consent. It is called nonconsensual nude imagery at Facebook. It's called revenge porn everywhere else. And the appropriate response, per Facebook's own rules, is to immediately take down the post and delete the account that posted it. So that was kind of what would have happened. A Facebook employee did catch this, you know, pretty early on and tried to delete it. But the problem was Neymar's account was cross-checked. So it didn't come down. In fact, it stayed up for 36 hours, during which it racked up 56 million views. And this resulted in extensive harassment of the woman who had accused him of sexual assault. There were thousands and thousands of impersonators of her. And the video was reposted just all over the Internet. And basically, Facebook acknowledged internally that it had just completely failed to protect this woman. And this happened because of XCheck.
Now, I think another part of the program that is important is that it really does allow - and is intentionally designed to allow - executives, communications and sort of public affairs people to weigh in on punishments that would otherwise be doled out. And that's what happened in this instance. Neymar is one of the top 20 accounts on Instagram - like, this is a guy who is probably more famous for social media than he is for soccer - and Facebook just simply wasn't willing to lose him. And so this got bumped all the way up to senior leadership of the company. And they determined that rather than removing him from the platform, even though that was the absolute standard rule for this situation, they were going to kind of let it slide. So they took down the post in the end. But they didn't punish his account in the way they normally would. And I think it's kind of representative of the dual-class - or even more than dual-class - system that Facebook created, in some ways reinforcing power structures that, you know, the company has said it was supposed to kind of overthrow.
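To make the mechanism Horwitz describes concrete, here is a minimal sketch of how a cross-check-style whitelist can turn immediate enforcement into deferred review. It is an illustration only; the account names, data structures and policy logic are invented, not Facebook's actual code.

```python
# Illustrative sketch only: a hypothetical moderation pipeline showing how a
# "cross-check" style whitelist can defer enforcement for shielded accounts.
# All names, fields, and policies are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Post:
    account: str
    violates_policy: bool

@dataclass
class ModerationQueue:
    pending_review: list = field(default_factory=list)
    removed: list = field(default_factory=list)

XCHECK_WHITELIST = {"vip_athlete", "vip_politician"}  # hypothetical shielded accounts

def enforce(post: Post, queue: ModerationQueue) -> str:
    """Apply the normal rule immediately, unless the account is shielded."""
    if not post.violates_policy:
        return "no action"
    if post.account in XCHECK_WHITELIST:
        # Violating content stays up while it waits for escalated manual review,
        # which is where the lag time (and the extra views) comes from.
        queue.pending_review.append(post)
        return "deferred to manual review"
    queue.removed.append(post)
    return "removed immediately"

queue = ModerationQueue()
print(enforce(Post("ordinary_user", True), queue))  # removed immediately
print(enforce(Post("vip_athlete", True), queue))    # deferred to manual review
```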
GROSS: There was a 2019 internal review of the XCheck program. What did that review say?
HORWITZ: I think people inside Facebook did have, on a long-term basis, a sense that exempting users from enforcement and from punishment on the platform was just, like, clearly not the right thing to do. This is not what Facebook was set up to do. This isn't democratic. It isn't fair. And in 2019, an internal review of the XCheck program found a few things. The first one is that it was completely widespread, that there were dozens and dozens of teams that were enrolling users in various protections and that, in fact, pretty much any employee had been allowed to enter people into the XCheck program in the first place.
The second thing is that it was just deeply mismanaged and unorganized. No one really even knew how these lists were getting pulled together. They weren't being reviewed by lawyers. There was just kind of this ad hoc process where people would put in names. And the final thing is that they found that this was just completely indefensible. This was a breach of trust with users. It was putting users at risk of harm. And it was clearly unfair. And as they noted, this was publicly indefensible and simply something that, you know, was completely at odds with the company's own sense of its legitimacy as an overseer of its own platform.
GROSS: What was Facebook executives' reaction after getting this report?
HORWITZ: Facebook - I mean, no one disputed that XCheck was a mess and that the program was unseemly and was in, you know, direct conflict with what the company had said publicly its rules are. That said, they really weren't willing to take on the mess of just simply doing away with it, particularly with the 2020 election coming up. I think this is something that - you know, over the period of time that the documents we reviewed cover, this company was paranoid about the possibility that it might be blamed for something in relation to the 2020 election. And so they desperately wanted to keep a low profile. And there was no way that they were going to rein the program in because this was kind of one of their main methods of trying to avoid criticism from high-profile people.
GROSS: Let's talk about anti-vax posts on Facebook. Mark Zuckerberg has made it a priority to promote vaccines and facts about vaccines. But at the same time, Facebook has been used widely to convey anti-vax falsehoods. And you found that internal documents reveal that the anti-vax misinformation was mostly coming not from the original posts, but from commenters. Would you describe what happened with that?
HORWITZ: Sure. And I think an important place to start here is what you said about Mark Zuckerberg and his goals. Fighting COVID was something that Facebook was, perhaps, uniquely inclined and positioned to do. They recognized the threat of the public health crisis early on, back when a lot of other people were pooh-poohing the possibility of a global pandemic. They sent all their content moderators home with pay. You know, they really reframed and sort of sprinted to provide new tools, to provide information, to, you know, help out with public health efforts. They really were focused on this. And this was something that came from Mark Zuckerberg personally. I mean, this was kind of going to be Facebook's moment.
And I think the interesting thing about this is that there were, you know, sort of all these resources and good intentions put into it, and yet also this kind of failure by the company to recognize the risks that its own platform could pose. And it's not as if Facebook hadn't had plenty of warnings that the anti-vaccine movement was very active on its platform. If you remember the, you know, measles outbreaks back in 2019 at Disneyland and things like that, there was a very, very aggressive community of anti-vaccine activists that had been active on the platform and had gotten really sophisticated in terms of their methods and their approach. And so the company sort of focused on the positive and all the things it could do that would be helpful and really didn't pay much attention to the, I think, fairly obvious threat that a small band of people who were extremely dedicated could pose if they correctly harnessed Facebook's tools, which they did.
GROSS: Well, let's take a short break here. And then we'll talk some more. If you're just joining us, my guest is Jeff Horwitz, who is the lead reporter for The Wall Street Journal's new and ongoing series of articles called "The Facebook Files." We'll be right back after a short break. This is FRESH AIR.
(SOUNDBITE OF OF MONTREAL SONG, "GRONLANDIC EDIT")
GROSS: This is FRESH AIR. Let's get back to my interview with Jeff Horwitz, who is the lead reporter on a new and ongoing Wall Street Journal series called "The Facebook Files," based on a series of leaked documents from Facebook. These documents detail how Facebook executives are aware of the ways the platform causes harm, but executives often lack the will or the ability to address them. Is it harder to oversee or to apply rules to commenters than it is with people doing the original posts on Facebook?
HORWITZ: This was a bit of a blind spot for the company. They hadn't really ever put that many resources into trying to understand comments, which is kind of funny because Facebook really did engineer its platform to produce a ton of comments. And what they realized early in 2021, as the vaccine was rolling out, was that all of the authoritative sources of information about it - right? - the World Health Organization, UNICEF and so on - all of their posts were just getting swamped by anti-vaccine advocates who were, you know, producing, at extremely high volume, content in the form of comments that was kind of just hitchhiking around.
And I think the company understood this, to its credit, at that point as being a real threat because, you know, it's one thing to see something authoritative from UNICEF, and it's another thing to see that same thing and then a whole bunch of people saying don't believe it, right? And that's kind of the style of comment that was rising to the top of Facebook's own systems. So they realized that basically all of the things they were doing to try to promote authoritative information were in some ways being harnessed by the people who were trying to promote the exact opposite.
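Here is a minimal sketch of the dynamic Horwitz describes: if comments under an authoritative post are ranked purely by engagement, a small but highly active group can dominate the top slots. The comment text, counts, and scoring weights below are invented for illustration, not Facebook's actual ranking.

```python
# Illustrative sketch: ranking comments by raw engagement. Comment text and
# engagement counts are invented; this is not Facebook's actual ranking code.

comments = [
    {"text": "Here's the official guidance from the health agency.", "likes": 40,  "replies": 3},
    {"text": "Don't believe this, do your own research!",            "likes": 300, "replies": 120},
    {"text": "I got my shot last week, easy process.",               "likes": 25,  "replies": 2},
    {"text": "They're hiding the real numbers!!",                    "likes": 280, "replies": 95},
]

def engagement_score(comment):
    # A purely engagement-based score: every interaction counts as a vote
    # for visibility, regardless of whether the comment is accurate.
    return comment["likes"] + 2 * comment["replies"]

top_comments = sorted(comments, key=engagement_score, reverse=True)
for comment in top_comments[:2]:
    print(comment["text"])
# The loudest, most-argued-with comments float to the top of the
# authoritative post, which is the dynamic described above.
```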
GROSS: Internal documents also show that Facebook knew - that it was really a small group responsible for most of the COVID misinformation on Facebook. So what was Facebook's response to this research that was delivered to executives?
HORWITZ: Yeah. So the initial response was just basically horror because they realized that, you know, there was just a very high proportion, not only of comments but also of posts in general, that seemed to be vaccine-hesitant - that was the company's phrase - so not necessarily straight misinformation, you know, false things like saying vaccines cause autism or make you sterile, but people who simply were exercising their right to speak on the platform as often as possible and in just extremely coordinated, almost cut-and-paste-style ways. And they were creating, basically, a false sense that there was a large public debate about the safety of vaccines, when there really isn't.
So the initial response was just, uh-oh, this is a huge problem. We've got to fix it. And then the second response was, OK, how do we do that? Because they didn't really have the tools in place. They hadn't planned for this. And so they had to kind of make do with a whole bunch of ad hoc interventions and try to get public discourse to be at least somewhat representative - right? - so that anyone who was, you know, encouraging about vaccinations wouldn't just get dogpiled by a, you know, very, very dedicated group of anti-vaccine advocates.
GROSS: Were these changes effective in stopping misinformation about the vaccine?
HORWITZ: I think it's kind of too soon to tell how well they did. Certainly in terms of preventing this stuff from getting traction in the first place, they failed - right? - because the whole problem, the thing that kicked Facebook's response into gear, was that public debate on the platform about this was skewed. It was getting sort of manipulated by anti-vaccine advocates. And, I mean, the fact that this was happening in 2021, as the vaccine was getting rolled out, you know, from the initial first responders and medical officials to the broader population, certainly seems like it could have had an impact.
And I think, you know, the company would note that it's not the only source of vaccine misinformation in the world by any means, right? There's plenty of stuff on cable TV that would have you believe bad things about the efficacy, safety and utility of the vaccine. But certainly, it's a remarkable thing for a company that really saw itself as being, you know, in the vanguard of solving a public health crisis that, you know, they're basically having to go back and fight with this highly active, somewhat ridiculous community that is just spamming their platform with bad information.
GROSS: Let's take another break here, and then we'll talk some more. If you're just joining us, my guest is Jeff Horwitz, a technology reporter for The Wall Street Journal who's the lead reporter for The Journal's new series of articles called "The Facebook Files," based on internal Facebook documents that were leaked to The Journal. We'll be back after we take a short break.
I'm Terry Gross, and this is FRESH AIR.
(SOUNDBITE OF CHARLIE HUNTER AND LEON PARKER'S "THE LAST TIME")
GROSS: This is FRESH AIR. I'm Terry Gross. Let's get back to my interview with Jeff Horwitz, a technology reporter for The Wall Street Journal who's the lead reporter for the Journal's new series of articles called "The Facebook Files," which detail how Facebook executives are aware of the ways the platform causes harm but executives often lack the will or the ability to address them. The series is based on internal Facebook documents that were leaked by a whistleblower to Jeff Horwitz.
Let's talk about Instagram, which is owned by Facebook. Internal research from Facebook shows that Instagram could have a very damaging impact on teenage girls' self-image, their anxiety, depression. Why does Instagram sometimes have that effect on teenage girls? - 'cause you write that the algorithms on Instagram create a perfect storm for many teenage girls.
HORWITZ: Yeah. So body image issues and social comparison obviously didn't originate with the internet. That said, Facebook's own research found that Instagram had some uniquely harmful features in terms of encouraging young women in particular to compare themselves with others and to think about the flaws of their bodies in relation to others.
And, you know, this wasn't intentional. The company certainly hadn't meant to design something that did this. But, you know, there was no question in their own findings that, you know, compared to even other social media products, Instagram was worse in this respect - that it was very focused on the body as opposed to the face or performance and that, for users who arrived at the platform in not the best mental place, it could really have a big impact on them.
GROSS: What is the way in which algorithms create a perfect storm for teenagers? - 'cause you say that in the article.
HORWITZ: Right, right. So I think there are some core product mechanics here, which is that Instagram will always show you the most popular and successful posts from your friends and the people you follow, and you're comparing that to your regular posts and your regular life. So there's kind of this highlight-reel ethos to it that tends to lead users to think that everyone else is living their best life while, you know, they're not.
And so that's part of it. Another part of it is just simply that people tend to be attracted to content that really resonates with them. And if you have body image issues already and you're engaged with looking at people on the platform who are prettier than you are, Instagram is going to keep on doing that. If you have concerns about diet and fitness and you think you might be overweight, Instagram is likely going to pick up on that and feed you a ton of dieting and fitness content.
And so there's this feedback loop that the platform can create. And it turns out, for people who are in a vulnerable place in the first place, it can be really damaging and, in some ways, lead to almost addictive-type behavior, per Instagram's own analysis.
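A minimal sketch of the feedback loop Horwitz describes, assuming a simple engagement-driven recommender. The topics, weights and update rule are invented; the point is only the reinforcement dynamic, not Instagram's actual system.

```python
# Illustrative sketch of an engagement feedback loop. The topics, weights,
# and update rule are invented; only the reinforcement dynamic matters here.

import random

interest = {"dieting": 0.2, "fitness": 0.2, "pets": 0.2, "travel": 0.2, "music": 0.2}

def recommend(interest):
    # Recommend a topic in proportion to the user's estimated interest.
    topics, weights = zip(*interest.items())
    return random.choices(topics, weights=weights, k=1)[0]

def engagement(topic):
    # Suppose a vulnerable user dwells longest on appearance-related content.
    return 1.0 if topic in {"dieting", "fitness"} else 0.1

random.seed(0)
for _ in range(200):
    topic = recommend(interest)
    # Observed engagement feeds back into future recommendations.
    interest[topic] += 0.05 * engagement(topic)

total = sum(interest.values())
shares = {t: round(v / total, 2) for t, v in interest.items()}
print(shares)  # dieting and fitness end up taking over most of the feed
```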
GROSS: So what you've just described is reported in documents that were written by Facebook researchers and then delivered to Facebook executives. So executives knew what you just told us, right?
HORWITZ: Absolutely. And Adam Mosseri, who's the head of Instagram, in fact commissioned a lot of this research in the first place. So, you know, I think there's some credit that should go to the company for determining - given the extensive external criticism of the company on these fronts - that perhaps it should at least get to the bottom of them. And it did. I mean, I think there's no question that what it found, you know, was convincing. As one of the company's own presentations to executives notes, we make body image issues worse in 1 in 3 teen girls.
GROSS: But you write that this represents one of the clearest gaps revealed in these internal documents, gaps between Facebook's understanding of itself and its public position.
HORWITZ: Yeah. Look; I can understand why someone in corporate communications isn't eager to make the sentence, we make body image issues worse in 1 in 3 teen girls, public, much less some of the other things in these findings, which included that among young women who had thought about self-harm or suicide in the last month, a not-tiny fraction traced those feelings directly back to Instagram's platform. So think potentially life-threatening effects.
And I can understand why the company wouldn't want to acknowledge that publicly, you know, or wouldn't want to talk about it much. I think what's interesting is the company did talk about these issues. They just didn't say that. What they said is that there were perhaps small effects, that the research was inconclusive, that, you know, if there was an issue, it was bidirectional - so it was good for some users and bad for some users - basically really downplaying the clarity that they had internally about what was going on and the effect of their product.
GROSS: What was Facebook's reaction to your article about teenagers and Instagram?
HORWITZ: They defended the research, and keeping the research private, as necessary for, you know, honest internal discussion. And they tried to argue a bit with the conclusions about causality, conclusions that seem to be very present in how their own researchers discussed this stuff, even with management. They sort of tried to undermine, you know, the certainty that really feels like it pervades the presentations that the company's researchers gave to executives.
But, you know, I don't think they disagree with the issues. They sort of defended the things that they have said previously about there being relatively small effects. And, you know, they've noted that for many users - users who are in sort of a healthy emotional place - Instagram is a lot more beneficial than it is harmful, all of which is true. None of that is wrong. It's just that the question is, at what cost to vulnerable users?
GROSS: Well, let's take another short break here. If you're just joining us, my guest is Jeff Horwitz, who is the lead reporter for The Wall Street Journal's new series of articles called "The Facebook Files." We'll be right back after a break. This is FRESH AIR.
(SOUNDBITE OF SOLANGE SONG, "WEARY")
GROSS: This is FRESH AIR. Let's get back to my interview with Jeff Horwitz, a technology reporter for The Wall Street Journal. He's the lead reporter for The Journal's new series of articles called "The Facebook Files," which detail how Facebook executives are aware of the ways the platform causes harm, but executives often lack the will or the ability to address them. The series is based on internal Facebook documents that were leaked by a whistleblower to Jeff Horwitz.
One of the articles in the series is headlined "Facebook Tried To Make Its Platform A Healthier Place. It Got Angrier Instead." And this article is about a change that was made in 2018 that rewarded outrage. What was the change?
HORWITZ: Facebook rolled out something in 2018 called meaningful social interaction. And the idea was that passively scrolling through content wasn't good for people - you know, it just turned them into zombies - and that what Facebook should be doing is encouraging people to sort of connect and engage with each other and with Facebook content more often. And there were two parts to this. One part was promoting content from people's friends and families, which was kind of a throwback to an earlier era of Facebook, where it was much more about that stuff than about a constant stream of information and content.
The second part, though, was rewarding content that did really well on engagement, meaning things that got a lot of likes, but even more important than likes, things that got a lot of emoji responses, comments, re-shares, direct message shares and things like that - so basically things that made users kind of pound the keyboard a bit and, you know, share and engage as much as possible. And you know, nothing about that seems, you know, atrocious in sort of a general high-level view. But it turns out, as Facebook realized down the road, that the effect that had was privileging angry, incendiary conflict because there is nothing more engaging than a fight.
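A minimal sketch of the kind of engagement-weighted scoring Horwitz describes. The weights and example numbers are invented, not the values from Facebook's documents; they only show how content that provokes comments and re-shares can outrank content that merely gets likes.

```python
# Illustrative sketch of "meaningful social interaction"-style scoring.
# The weights below are hypothetical, not the leaked values; the point is
# that comments and re-shares count for far more than a passive like.

MSI_WEIGHTS = {"like": 1, "reaction": 5, "comment": 15, "reshare": 30}

def msi_score(post):
    # Sum every interaction type, weighted by how "meaningful" it is deemed.
    return sum(MSI_WEIGHTS[kind] * count for kind, count in post["signals"].items())

calm_post  = {"name": "calm news item",
              "signals": {"like": 200, "reaction": 10, "comment": 5,  "reshare": 2}}
angry_post = {"name": "incendiary claim",
              "signals": {"like": 50,  "reaction": 80, "comment": 60, "reshare": 40}}

for post in sorted([calm_post, angry_post], key=msi_score, reverse=True):
    print(post["name"], msi_score(post))
# The post that provokes arguments and re-shares wins despite far fewer likes,
# which is how this kind of ranking can end up privileging content that starts fights.
```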
GROSS: And news publications, as a result, found that a lot of their traffic was decreasing dramatically. What was the connection?
HORWITZ: So there was some element of this where they were just kind of reducing news overall in the feed - in other words, to boost the stuff from friends and family. But I think the type of content that succeeded also changed. And one thing we found was that the head of BuzzFeed, Jonah Peretti - you know, no one could accuse this guy of being unsophisticated when it comes to social media - actually figured out that something had changed materially when Facebook rolled this stuff out, and that, essentially, the type of content that was succeeding on the platform was sensationalistic, incendiary. Gross medical stuff was doing well - you know, things that sort of got a response. And his point to Facebook when he got in touch was, look, you guys are forcing us to produce worse content.
And the same thing was true of political parties. They also picked up on what had changed, and they started adjusting accordingly. And so parties told Facebook that because of, literally, this algorithm change - like, some reweighting, some math - they were shifting not just their communication strategy for the internet but, in some instances, their actual platforms.
GROSS: Once this was reported to Facebook executives, what actions did the executives take?
HORWITZ: Facebook's attraction to meaningful social interaction as a metric wasn't just that they thought it would be good for people. It's also that they thought it would be good for Facebook. They really needed people to be engaging with content more because they'd seen a decline in commenting and interaction in a way that was threatening to the future of a social network dependent on user-generated content. And so this had been really successful in terms of getting engagement back up and getting people to comment more. And the problem was that doing the things that researchers said would be necessary to correct the amplified-anger issue was going to come at the expense of some of the growth metrics that Facebook was pursuing. And that's always a hard sell inside that company.
GROSS: What was Facebook's response to this article?
HORWITZ: So Facebook noted that they had made some changes, which is true. I think the thing that we were very focused on is that people up to and including Mark Zuckerberg kind of resisted anything that was going to cause sacrifices in user growth numbers and in user engagement numbers for the purpose of improving the quality of discourse on the platform. So they told us on this one that basically any engagement-based ranking system or any ranking system is going to have problems - right? - that yes, they acknowledged that incendiary content did benefit from what they'd done, but, you know, that's not to say that there aren't disadvantages to other systems as well.
GROSS: So one of your articles in The Journal reports that in developing countries, Facebook was often used by drug cartels and human traffickers, and used to promote violence against ethnic groups. And developing countries are actually very important to Facebook now. Why is that?
HORWITZ: People in poorer countries don't provide Facebook much money, but they do provide it with a lot of growth. Facebook has basically stalled out in developed economies. I mean, there isn't much in the way of new user growth to be achieved in the U.S., Canada, Europe and wealthier nations. So this is kind of where pretty much all of the company's growth has been coming from in recent years. And you know, that means places like India are sort of the company's future.
And at the same time, though, Facebook has never really invested much in safety in those environments. And you know, they had, for example, a team of just a few people trying to focus on human trafficking across the globe. That includes sex trafficking, labor trafficking, organ trafficking. And they were clearly overwhelmed. And there were some, I think, serious issues of the company just simply not really caring all that much.
I think one instance we found was that the company had identified wide-scale human trafficking occurring, in which people from the Philippines and Africa were kind of indenturing themselves into domestic labor in the Gulf states. And once there, they kind of lost all autonomy. They could literally be resold without their permission. And Facebook, first of all, had allowed this for a long time. Like, up until 2019, it was actually OK for people to be sold on Facebook so long as the selling was happening through brick-and-mortar establishments and it was in a country where this was allowed. And then, I think more broadly, Facebook had just kind of turned a blind eye to this whole practice. One thing that really stood out to me, just in terms of demonstrating the company's lack of will on some of these things, is that Facebook, while it had identified widespread human trafficking, hadn't done anything about it - in some instances for years.
The thing that moved Facebook in 2019 to take aggressive action on this was Apple. You know, the maker of my iPhone told Facebook that it was going to remove Instagram and Facebook from its App Store - basically make it so that people couldn't download the apps - unless Facebook got its human trafficking problem under control. And boom, that was it, right? Understanding that human trafficking was happening on its platform wasn't enough to get Facebook's attention; what did was the threat that Apple might take an action that would severely damage its business. So Facebook, literally within days, was just pulling down content all over the place. And the crisis passed. And then, as we found, things went back to normal. And normal means that human trafficking is happening on a pretty widespread scale on the platform.
GROSS: Another obstacle that you report is that Facebook doesn't have enough people monitoring posts who speak the dialects needed to identify dangerous or criminal uses of Facebook.
HORWITZ: Yeah. And look; I think we're all familiar with Facebook's apologies right now, right? Like, every couple of months or weeks or days, depending on how closely you're monitoring it, the company ends up saying that it's sorry that something happened. And particularly overseas, it seems like there's just this kind of succession of inadvertent oversights that come with large human consequences. And the thing we found is that these aren't accidents. These aren't due to the company, you know, just simply having too much to possibly do. These are issues of direct neglect. So for example, Arabic - the world's third most commonly spoken language - has many dialects that are mutually incomprehensible. Facebook literally doesn't have anyone who can speak or understand most of them in terms of the vernacular. And it also doesn't have a system to route content in those dialects to the right people.
So when something happens like the Israeli-Palestinian violence earlier this year, the company is just sort of woefully unprepared to deal with it. They can't process content. They don't have people on staff. And, I mean, one of the things that's kind of tragic that we could see inside the documents was that you had all of these people who work for Facebook with Middle Eastern backgrounds who were just desperately trying to, like, kick in ad hoc to try to, like, help steer the company in a better direction because it was just screwing up so much at a time that was, like, so crucial on its platform.
GROSS: Nick Clegg, who's the Facebook vice president of global affairs, recently published a blog post saying that The Wall Street Journal articles have contained deliberate mischaracterizations of what Facebook is trying to do and conferred egregiously false motives to Facebook's leadership and employees. What's your reaction to that?
HORWITZ: My reaction is that Facebook has the right to say whatever they would like to say in response to our reporting. I think the more useful reaction to that isn't mine. It's that there actually have been in recent days a large number of former Facebook employees who have directly taken issue with what Mr. Clegg and what the company has said on these subjects. And I mean, these are people who actually were doing the work. Like, there are names that are popping up on Twitter that are the names that were sort of protagonists, I suppose, in some of the stories I could see playing out inside of the company.
And what they've said very clearly is that - you know, one, that the things that we're raising are pretty much correct and, two, that there is, in fact, this history of kind of disregarding the work of the people Facebook's asked to do integrity work - integrity just being platform safety and content quality stuff. And so, you know, I think there's something really encouraging about some of these voices coming to the fore because these are people who sort of pioneered not just the ways to measure problems on the platform, but also ways to address them. And so the idea that they might be able to come out and talk more about the work they did is, I think, really interesting to me and, in some ways, would be very healthy for the company.
GROSS: My guest is Jeff Horwitz, who is the lead reporter for The Wall Street Journal's new and ongoing series called "The Facebook Files." This is FRESH AIR.
(SOUNDBITE OF YO LA TENGO'S "WEATHER SHY")
GROSS: This is FRESH AIR. Let's get back to my interview with Jeff Horwitz, a technology reporter for The Wall Street Journal, who's the lead reporter for the Journal's new series of articles called "The Facebook Files." The series details how Facebook executives are aware of the ways the platform causes harm. But the series also says executives have often lacked the will or the ability to address those problems. The series is based on internal Facebook documents that were leaked by a whistleblower to Jeff Horwitz. What are some of the suggestions current or former Facebook employees have made, that you're aware of, of how to improve some of the problems that you've reported on?
HORWITZ: Yeah, I think Facebook tends to treat social media as if Facebook is the only way in which it could possibly exist - right? - kind of a love-it-or-leave-it approach. And that, per their own employees, is absolutely not true. There are a number of things that could be changed, right? So in addition to just simply the question of resources, which would address a lot of problems, there are also ways in which the platform has perhaps grown too complex to be safe. So, for example, in developing countries, is it really a good idea for things to be able to go viral in a matter of minutes? Maybe that's not good if you're worried about information quality. So virality restrictions are one thing.
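A minimal sketch of one possible virality restriction of the kind mentioned above: capping how deep a re-share chain can go before further one-click sharing is blocked or slowed. The depth limit and field names are hypothetical, not a description of any change Facebook has actually shipped.

```python
# Illustrative sketch of a "virality restriction": limiting how deep a
# re-share chain can go before the share action is disabled or deboosted.
# The depth limit and field names are hypothetical.

MAX_RESHARE_DEPTH = 2  # hypothetical cap on shares-of-shares

def can_reshare(post):
    """Allow instant re-sharing only for content close to its original poster."""
    return post["reshare_depth"] < MAX_RESHARE_DEPTH

original       = {"id": 1, "reshare_depth": 0}  # a friend's own post
share_of_share = {"id": 2, "reshare_depth": 2}  # already several hops from the source

print(can_reshare(original))        # True: one click to share
print(can_reshare(share_of_share))  # False: the chain stops, adding friction before things go viral
```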
There's other work that seems like it would be really promising, such as trying to give more prominence to voices that seem to have respectful conversations. The concept is called earned voice. Rather than just rewarding the biggest loudmouth, this would reward people who tend to be able to have conversations with people who aren't necessarily exactly like them - conversations that are nonetheless respectful and, you know, mutually satisfying. Now, that's not, of course, the way you get the most engagement, but it is something that could potentially produce a different style of conversation that would be recognized by most people outside the company, I think, as healthier.
GROSS: Recently, Facebook created what's been described as a Supreme Court for Facebook, an outside entity of experts who would help Facebook make complicated decisions about content. How has that been actually functioning?
HORWITZ: So this came up in the XCheck story that we did about the special protections for VIPs. Facebook spent $130 million creating the Oversight Board, with the stated purpose of providing transparency and accountability into its operations. And one of the powers it gave the Oversight Board was the ability to ask Facebook questions that Facebook would then have to answer, assuming they were relevant. And in the case of XCheck, the board asked the right questions. In relation to Donald Trump's suspension from the platform, the board asked, very specifically, for data about the XCheck program and about protections for VIP users. And Facebook said it didn't exist. And this is obviously awkward, given the stuff we've seen, because, you know, we can actually see there were internal dashboards of metrics as well as voluminous documentation of the program's problems, of the number of accounts, of how many bad views of content occurred as a result of the lag in review times. You know, this is a pretty well-documented program internally, and Facebook told its supposed overseers that it just simply didn't have the information and couldn't possibly gather it.
And the Oversight Board has, at this point, issued some pretty strong statements of discontent with that situation. But I think it does seem like a bit of a crisis in the sense that, you know, oversight does imply the ability to actually see what's going on inside the company. And I think the Oversight Board has, to its credit, recognized that that isn't something Facebook is readily willing to provide. So what its role is going forward is going to be an interesting question, because, you know, it's kind of being asked to play a self-regulatory role for Facebook. At the same time, it is fully independent, and it also seems not to have much trust in Facebook and whether Facebook is going to give it the truth about what Facebook itself is doing.
GROSS: Well, Jeff Horwitz, thank you for your reporting, and thank you for coming on our show.
HORWITZ: Thank you so much, Terry.
GROSS: Jeff Horwitz is the lead reporter on The Wall Street Journal series "The Facebook Files." If you'd like to catch up on FRESH AIR interviews you missed, like this week's interviews with B.J. Novak, who played Ryan in "The Office" and has a new TV series, or Max Chafkin, author of a new book about the controversial co-founder of PayPal, Peter Thiel, check out our podcast. You'll find lots of FRESH AIR interviews.
(SOUNDBITE OF JOHN COLTRANE'S "GIANT STEPS")
GROSS: FRESH AIR'S executive producer is Danny Miller. Our technical director and engineer is Audrey Bentham. Our interviews and reviews are produced and edited by Amy Salit, Phyllis Myers, Roberta Shorrock, Sam Briger, Lauren Krenzel, Heidi Saman, Ann Marie Baldonado, Thea Chaloner, Seth Kelley and Kayla Lattimore. Our digital media producer is Molly Seavy-Nesper. Therese Madden directed today's show. I'm Terry Gross.
(SOUNDBITE OF JOHN COLTRANE'S "GIANT STEPS")
Transcripts are created on a rush deadline, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of Fresh Air interviews and reviews are the audio recordings of each segment.