September 18, 2019 10:53AM

A First Look at Facebook’s Oversight Board

Today Facebook released a Charter for its Oversight Board. This institution may well face insuperable difficulties and come to nothing. But it is possible, perhaps likely, that the Facebook board will significantly influence the future of speech on the internet. The charter announced today offers a kind of constitution for the Oversight Board. What does the Charter mean for free expression?

First, some context. Facebook maintains Community Standards that users agree to abide by when joining the platform. Facebook censures or removes users who violate those standards. Such “content moderation” goes back almost to the founding of Facebook. Facebook may suppress speech in this way because the First Amendment does not apply to privately owned forums like social media.

Facebook officials often say content moderation involves a tradeoff, or as the Charter notes, “Free expression is paramount, but there are times when speech can be at odds with authenticity, safety, privacy, and dignity.” This statement is both ominous and reassuring. It suggests ominously that free expression will frequently give way to other values. It is reassuring because free expression is “paramount,” which means “more important than anything else.”

If you read many of Facebook’s statements about this tradeoff, you might at times get the impression that free speech is just another value on par with safety and the other values mentioned above. But this statement (and others) indicates that free expression has a higher standing for the company than the other values, though it does not trump them in every instance. The Charter itself begins by saying free expression is a “fundamental human right.”

Mark Zuckerberg’s letter accompanying the Charter reinforces this view. The first paragraph states:

Facebook is built to give people a voice. Free expression is fundamental to who we are as a company, just as it is to a free, inclusive and democratic society. We believe the more people who have the power to express themselves, the more progress our society makes together. We want to make sure our products and policies support this.

Free expression comes first in the letter, and the CEO later says free expression is “paramount,” the same word that appears at the start of the Charter. Of course, the second paragraph of the letter deals with the values that limit free expression. But those values do not come first in the Charter or the letter.

Consider also that the Charter itself says “the purpose of the [Oversight] board is to protect free expression by making principled, independent decisions about important pieces of content and by issuing policy advisory opinions on Facebook's content policies.” That’s different from saying the Board seeks to attain the best tradeoff between free expression and other important values. This mention of free expression in the preamble to the Charter matters. What is not mentioned about the purpose of the Board – values to be balanced against free speech – also informs our understanding of Facebook and its Board project.

One hundred Facebook employees have been working on this Charter for many months. The rhetorical priority given free expression is unlikely to be an accident. And it need not just be empty rhetoric. Saying free expression is paramount for the Facebook community should matter to the interpretations that issue from the Facebook Board.

In the United States, two institutions matter most to free speech: the Constitution and the Court that interprets it. Facebook’s Community Standards play the part of the constitution. What does the Charter tell us about the makeup of its “court”?

Here are the qualifications to be a Board member according to the Charter:

Members must not have actual or perceived conflicts of interest that could compromise their independent judgment and decision-making. Members must have demonstrated experience at deliberating thoughtfully and as an open-minded contributor on a team; be skilled at making and explaining decisions based on a set of policies or standards; and have familiarity with matters relating to digital content and governance, including free expression, civic discourse, safety, privacy and technology. (emphasis added)

The last phrase is disappointing. Free expression is one “matter” among others. It would have been more consistent with the “paramount” status of free speech to say members “should have a strong commitment to free expression and familiarity with matters relating to digital content and governance, including civic discourse, safety, privacy and technology.”

But the disappointment does not last. Four sections later, the Charter states: “members will contribute towards building a board that, as an institution, upholds and advances free expression.” Since Facebook says elsewhere that it is seeking Board members dedicated to this institution, a commitment to free expression is a qualification for its members.

The Charter documents are not always wholly coherent. The Charter states, “When reviewing decisions, the board will pay particular attention to the impact of removing content in light of human rights norms protecting free expression.” This evokes international law on behalf of free speech. On the other hand, Zuckerberg’s letter states that the values that constrain speech are “guided by international human rights standards.” Indeed, Facebook’s documents reflect a tension in international law itself which both protects and limits free expression. Perhaps Facebook’s secondary values should be rooted in the company’s culture rather than international law.

The Facebook Board’s impact on free speech will be determined over time, decision by concrete decision. Free expression could have been treated as just another competing value in this Charter and related documents. It is more than that. Free expression is paramount. Now will the Facebook Board live up to its Charter?

January 10, 2019 12:19PM

Can Pluralism Work Online?

The Wall Street Journal reports that Facebook has consulted with conservative individuals and groups about its content moderation. Recently I suggested that social media managers would be inclined to give stakeholders a voice (though not a veto) on content moderation policies. Some on the left were well ahead in this game, proposing that the tech companies essentially turn over content moderation of “hate speech” to them. Giving voice to the right represents a kind of rebalancing of the play of political forces.

I argued earlier that looking to stakeholders had a flaw. These groups would be highly organized representatives of their members but not of most users of a platform. The infamous “special interests” of regular politics would thus come to dominate social media content moderation, which in turn would have trouble generating legitimacy with users and the larger world outside of the internet.

But another possibility exists which might be called “pluralism.” Both left and right are organized and thus are stakeholders. Social media managers recognize and seek advice from both sides about content moderation. But the managers retain the right to decide the “content” part of content moderation. The groups are not happy, but we settle into a stable equilibrium that over time becomes a de facto speech regime for social media.

A successful pluralism is possible. A lot will depend on the managers rapidly developing the political skills necessary to the task. They may be honing such skills. Facebook’s efforts with conservatives amount to more than hiring the usual suspects to get out of a jam. Twitter apparently followed conservative advice and verified a pro-gun Parkland survivor, an issue of considerable importance to conservative web pundits, given the extent of institutional support for the March for Our Lives movement. Note I am not saying the right will win out but rather that the companies may be able to manage a balanced system of oversight.

But there will be challenges for this model.

Spending decisions by Congress are often seen as a case of pluralist bargaining. Better organized or more skillful groups get more from the appropriations process; those who lose out can be placated with “side payments” to make legislation possible. Overall you get spending bills that no one completely likes, but everyone can live with until the next appropriations cycle. (I know that libertarians reject this sort of pluralism, but I am not discussing what should be but rather what is, as a way of understanding private content moderation.)

Here’s the challenge. The groups trying to affect social media content moderation are not bargaining over money. The left believes much of the rhetoric of the right has no place on any platform. The right notes that most social media employees lean left and wonders whether their effort to cleanse the platforms begins with Alex Jones and ends with Charles Murray (i.e., everyone on the right). The right is thus tempted to call in a fourth player in the pluralist game of content moderation: the federal government. Managing pluralist competition and bargaining is a lot harder in a time of culture wars, as Facebook and Google have discovered.

Transparency will not help matters. The Journal article mentioned earlier states:

For users frustrated by the lack of clarity around how these companies make decisions, the added voices have made matters even murkier. Meetings between companies and their unofficial advisers are rarely publicized, and some outside groups and individuals have to sign nondisclosure agreements. 

Murkiness has its value! In this case, it allows candid discussions between the tech companies and various representatives of the left and the right. Those conversations might build trust between the companies and the groups from the left and the right, and maybe even among the groups themselves. The left might stop thinking democracy is threatened online, and the right might conclude it is not eventually going to be pushed off the platforms. We might end up with rules for online speech that no one completely likes and yet are better than all realistic alternatives.

Now imagine that everything about private content moderation is made public. For some, allowing speech on a platform will become compromising with “hate.” (Even if a group’s leaders don’t actually believe that, they would be required to say it for political reasons). Suppressing harassment or threats will frighten others and foster calls for government intervention to protect speech online. Our culture wars will endlessly inform the politics of content moderation. That outcome is unlikely to be the best we can hope for in an era when most speech will be online.