Everyone dislikes the social media companies these days. Conservatives (that is, Republicans) decry alleged bias against their views. Liberals (that is, Democrats) complain about hate speech online and market domination. And that’s just the start for both sides. Of course, both sides believe “something must be done,” with proposals ranging from federal agencies controlling the platforms to breaking up the companies. Both see government as the solution for what is thought to ail social media. In his own way, Mark Zuckerberg agrees. He could not be more wrong.
Engineers and economists often focus on the tradeoffs inherent in a project. Zuckerberg has emphasized the tradeoff between the values of voice (speech) and safety (or protection from speech). At a recent appearance, he said most people do not want private companies making such fundamental tradeoffs by themselves. He continued, “I think we would be better off if we had a more robust democratic process setting the rules on how we want it to arbitrate and draw some of the tradeoffs between a lot of these values that we hold dear.”
What is a “more robust democratic process”? It could mean a process in which majorities, highly organized minorities, or elected and unelected officials determine the tradeoff between speech and safety. They might do so indirectly, as with Sen. Josh Hawley’s proposal to give the Federal Trade Commission life-or-death authority over tech companies and thereby over their policies. Few of us, Zuckerberg included, want government without constraints.
We live in a republic. The rule of the people comes with restraints on political power, not least protection for “the freedom of speech,” which places speech largely outside the reach of government decisions. The courts protect speech by interpreting and enforcing the First Amendment, and they generally favor free speech, with a few limited exceptions.
Zuckerberg wants democracy to help determine the tradeoff between speech and safety online. He may believe that public involvement would lend legitimacy to online content moderation, at least compared with efforts by private companies alone.
But what could the federal government actually do? The courts could apply the First Amendment to the tech platforms, though only by overturning or ignoring precedents; doing so would invalidate parts of Facebook’s Community Standards, including its rules against hate speech. If other parts of the republic – say, Congress – decided to draw a line between acceptable and unacceptable speech online, the courts would be obligated to strike down such rules. And even if the courts allowed Congress to act, one side or the other of our highly polarized polity would likely see the outcome as a win or a loss, leading to acrimony and distrust rather than consent and legitimacy. Government action in the tech space cannot foster legitimacy, at least not now.
Of course, the courts have generally found that the First Amendment does not apply to private companies like the social media platforms. If that status quo holds, we would be back to industry self-regulation. What’s a tech mogul to do?
Facebook is in fact moving forward with creating an appeals board for its content moderation. Zuckerberg notes, “We’re starting this as a project just for Facebook. But over time I could see this expanding to be something that more of the industry joins.” In other words, self-regulation by Facebook alone might not work, but a private effort across companies might. Keep in mind also that we are in the early days of such self-regulation; the Facebook board (or an industry-wide board) does not yet exist, so it can hardly be judged a failure. Could it gain legitimacy? Perhaps. Might it fail for many reasons? Of course. But the board is an attempt to deal with the immense and surprising challenges that have been evident since at least November 2016. It deserves a chance.
We know the other path forward. The concrete alternatives are the courts enforcing the First Amendment in private forums, or Congress, state officials, and the president drawing lines between permitted and suppressed speech, lines the courts would then (we hope) invalidate. Neither option seems likely or desirable. The leaders of social media may regret the political and social dangers and complexities of content moderation. The tradeoff among values is hard, no doubt. But their task cannot be avoided. No one else, least of all government, can or should do the job.
For more on the limits of government regarding online speech, see my recent policy analysis.