Facebook & Google Play the Censor: Are our Civil Liberties Endangered?

Rolling Stone’s Matt Taibbi joins us to examine the dangerous juncture our freedom finds itself in when Facebook, Twitter, and Google work with the government and its intelligence services to control what we see and hear



Story Transcript

MARC STEINER: Welcome to The Real News Network, everybody. This is Marc Steiner. Great to have you all with us.

As you know, we spoke recently with Nadine Strossen from the ACLU about the dangerous slippery slope we find ourselves on. But not only cheering on the fact that Facebook shut down the vile Alex Jones, but more importantly, not understanding the danger that that poses for all of us. And more to the point, to the civil liberties that we all seem to cherish. The platforms like Google and Facebook are where many of us increasingly get all of our information. We talk to our friends and family. We use it as a public space. But they and all the digital media that we use don’t belong to us. They’re private entities that are now making a regular habit of banning images and words for a variety of both nebulous and extremely dangerous reasons, or potentially dangerous reasons. Facebook and others are also teaming up with some pretty shady characters and businesses as they develop a new model of censorship. That should give us all pause.

One of the people writing about this a great deal is journalist and writer Matt Taibbi, who has been writing about all this in the Rolling Stone and many other places. We’re going to focus on the articles he’s been doing for Rolling Stone. And Matt, welcome, good to have you here on The Real News.

MATT TAIBBI: Thanks for having me, Marc.

MARC STEINER: I said before we started, this is a really dangerous piece we’re looking at. I mean, the pieces we’ve done so far on this can almost make it seem arbitrary, being done by technocrats or technology that are unseen by all of us. But what you’re implying and writing about, it goes a great deal deeper than that.

MATT TAIBBI: Yeah, I think the real wakeup call for me was the news that Facebook announced a partnership with a group called the Atlantic Council, which is essentially a sort of NATO-backed think tank. If you look at its board, it’s packed with ex-security state officials; people like Henry Kissinger, the former acting CIA Director Michael Morell, and the former CIA Director Michael Hayden. It’s a quasi-governmental agency that sees itself essentially as a sort of government in exile. It imagines itself as the sort of watchdog that would have existed had Hillary Clinton won the presidency.

And Facebook has been working in conjunction with the Atlantic Council for some time now. They’ve also been consulting with Congress, and apparently also some White House officials to police what they call ‘inauthentic content,’ which theoretically means that they’re deleting fake news accounts. But it could mean more than that. And we’re not exactly sure what the methodology is. And that’s very scary, and has very serious First Amendment implications.

MARC STEINER: Let’s talk about this ‘inauthentic content.’ One of the things that bothered me a great deal as I was reading your pieces was that some of the more liberal Democratic members of the United States Congress, Senator Mazie Hirono of Hawaii and Senator Chris Murphy of Connecticut, were pushing this because they wanted to shut down what they called Russian disinformation taking place on Facebook and other parts of the digital world. And the danger is using that as an excuse to go even deeper. I mean, that to me is a real disconnect. The question is, is there a disconnect between folks like that in Congress who don’t really understand the depth of this, what censorship means? Or do they? There’s a real conundrum here.

MATT TAIBBI: It’s really, it’s kind of an amazing moment in the history of media in this country, and it seems to have kind of slipped past the attention of elected officials and the media. We’ve never really had an aggressive media regulator in the United States. Of course we’ve had the FCC, but they’ve never really been intrusive to an enormous degree. Certainly not on the level of political content.

And what we have here is a sort of two-stage story. The first stage of the story happened over the course of the past few decades, when Google and Facebook essentially became a duopoly in terms of media distribution in this country. They control somewhere between 70 and 75 percent of the media distribution. That means most people get their news from one of those two sources. So if you control Google and Facebook, and they’re allowed to decide who gets in the feed and who gets deemphasized, then you can really control the media. And so the idea that they’re working with governmental bodies to push some content down and push others up is unbelievable. It’s a situation that we’ve never had a parallel to in this country before.

MARC STEINER: And as you point out in your article, Facebook has 2.2 billion users, with a million people on at any given time in the course of a day. So can you describe- I mean, how do they even begin to do this? How do they begin to censor things? People know the name of Alex Jones, and while he may be despicable, once you throw him off, you know, anybody could be next. But I mean, how do they do it if they have so many users? How can they hone in at all?

MATT TAIBBI: You know, I talked to former Facebook employees, including some people who were actually involved in this process of what they call ‘content integrity.’ And you know, again, the sheer quantity of content is too big for human beings to watch. More than a billion pieces of content a day. So most of it is flagged by machines. What they’re looking for are, you know, skin tones for obscenity or pornography, certain keywords for abusive language. And then what happens is the automatic sensors will flag material, and then it will go to a human being for review.

The problem with this is that they’re now starting to review not just for obscenity or pornography, but also for political content. And you know, we saw the beginnings of what could be a problem a couple of years ago when they deleted the iconic image of the napalm girl in Vietnam. And now we’re seeing wholesale deletions of political accounts in countries all over the world. You know, 10,000 accounts got wiped out ahead of Mexican elections on July 1. We had hundreds of accounts in Brazil wiped out. And we don’t know exactly what criteria they’re using to delete some of these accounts, and I think that’s the problem: it’s not transparent. So yeah, we’ve never had a media regulator, and that’s one problem. But we’ve also always had transparency in the courts when we punish people for speech. We don’t have that either, now.

MARC STEINER: And you were referring earlier to that photograph, the iconic image of Kim Phuc, the young girl who was hit by napalm, running down the street naked. And they didn’t want naked images of children, so they took that off of Facebook.

MATT TAIBBI: Right, yeah, exactly. And as the person I talked to pointed out, the algorithm doesn’t know that that’s iconic journalism. So they wipe it out. The concern is that they’re going to start looking at content like that and painting with a broader brush what their idea of abusive language is, what hateful content is, what inauthentic content is. All these things can be used to remove smaller media from these platforms without too much backlash, and that’s what we’re really worried about.

MARC STEINER: But is it all technology? I mean, when you go after Alex Jones, when you go after Venezuelanalysis, when, from what I’ve been reading in the media, all kinds of pages by Black Lives Matter activists and other people who have been fighting against racism have been either temporarily or permanently deleted. I mean, is that technology? Is that an algorithm? Or are those people?

MATT TAIBBI: No, and again, this is the problem. The company, of course, is wholly unsuited for the task of monitoring and sifting through content to decide what is, what is real news and what isn’t. They’re not a news organization. They’ve never really claimed to be. In fact, they’ve aggressively- you know, Facebook, up until a year and a half ago, was openly saying that’s not what we do. We are not editors. We don’t sift through content. So inevitably what’s going to happen is they’re going to appeal to an outside body for help in sifting through all this material, and that’s where you have this issue of their cooperating with something like the Atlantic Council in the United States, or you have a situation like in China, where Google is cooperating with the Chinese government to maybe sift out content there. The Israeli government is also working with both of these companies. That’s where you have the problem. These are not news organizations. They don’t have the traditional urge to be separate from governments the way that news organizations do. They want to work with the governments, and that’s, that’s a serious problem.

MARC STEINER: It’s a very dangerous problem. And let me pull out some of that from what you were just saying, which you also wrote in your articles, when you quoted Mark Zuckerberg in 2016 saying editing content, that’s not us, right? So what changed in Zuckerberg? Was it his fear of the government? Was it his desire to cooperate with government entities? I mean, what do you think was the dynamic that moved all this around?

MATT TAIBBI: I don’t think there’s a whole lot of mystery there. After the fake news scandal and the associated controversy with the 2016 election, the Internet platforms were all hauled before Congress last year in groups. And individually, Zuckerberg appeared before the Senate last year. All these companies saw sharp drops in share price ahead of their meetings with legislators on the Hill because investors were afraid of increased regulation. And essentially what the people on the Hill said to these companies was we want you to draw up a plan to help prevent the sowing of discord. And the implied threat was if you don’t, we’re going to increase the amount of regulation on these firms. And so yes, they’re agreeing, they’re acceding to the request of the government to be more involved in this filtration process, because they don’t want to be regulated. They don’t want to be taxed. They don’t want to have greater oversight.

So unlike a news organization, which would vehemently resist any call by the government to be involved in what they do, these platforms are likely just going to roll over, because it has nothing to do with their commercial model whether they produce good journalism or not.

MARC STEINER: So what does your investigative sense say to you about what this has to do with the Atlantic Council? I mean, in a sense there is this kind of conflict going on between the U.S. government and Russia, and much of NATO and Russia. And this seems to be playing into it. And you talked about who the Atlantic Council is, and how many of them come out of NATO and out of the CIA. So, not that you can make suppositions you don’t have facts about, but what do you think the dynamic may be, or what’s it pointing to?

MATT TAIBBI: Well, look, if the United States government had an open contract with Facebook, then it would be a clear First Amendment violation. So it can’t be done in that way. The way it’s done now, if you have a private organization- and the Atlantic Council still is that, technically- working with that other private organization, which Facebook is, which Google is, which Twitter is, then there’s no First Amendment issue. Technically, at least not yet, because the First Amendment only speaks to the government suppressing or regulating speech. If it’s done on private property they can get around that entire issue. And that’s my concern with this, is that this is really an end run around the First Amendment. You take a bunch of quasi-governmental entities and you have them work with these duopolistic corporations, and you’re essentially controlling the flow of information. You know, perhaps on some legitimate national security grounds. But there’s a huge potential for abuse there as well.

MARC STEINER: And so again, let me come back to something else you wrote about, which has to do with Israel and China, and those two countries kind of leaning on Facebook and Google to do their will, whether it’s Israel censoring Palestinian activists, or whether it’s China just being able to watch what goes on and make sure they’re censoring the activity they don’t want there. But this is almost, in some senses- maybe I’m being really overly paranoid here.

MATT TAIBBI: No, you’re- I don’t think you are.

MARC STEINER: I mean, just, that this is great training ground for how to control things.

MATT TAIBBI: Well, right. I mean, this is the concern, right, is that you have these companies that are making an enormous amount of money. Their entire business model is based upon volume: getting as many people onto their platforms as possible, talking as much as possible, giving the companies as much information as possible so they can be targeted for advertising. The business model does not rely upon them being a trusted source for media, or on protecting their users’ First Amendment rights. So for them to get into markets like China, or into other perhaps not completely free countries as well- I mean, one could argue about Israel, but they have a more or less open relationship with the Israeli national security agencies too. You know, that’s what’s troubling, is you have these platforms that have a history of cooperating with governments that don’t really have any concern about speech rights.

And now we see the same kind of cooperation perhaps taking place in the United States. And we never had a public conversation about this. And I think that’s what’s really troubling, is that this could happen in a flash. We could have, you know, essentially a government censorship body, or something like that, without ever having talked about it. And that’s a real problem.

MARC STEINER: Well, Matt Taibbi, I really look forward to watching what you unfold next, and having more and more in-depth conversations. I appreciate the writing you’ve been doing and the investigative work you’ve been doing, as you’ve always done. Thank you so much for that, and I look forward to our conversation again soon.

MATT TAIBBI: Thanks very much, Marc. Take care.

MARC STEINER: And I’m Marc Steiner for The Real News Network. Thank you so much for watching. We will be covering this in greater depth. Take care.