Meta’s Covid misinformation policies represent a state of exception, a temporary suspension of, or exception to, the rule of law, dictated by the sovereign in response to an emergency. This Policy Advisory Opinion is an opportunity to prevent Meta’s exceptional measures from becoming permanent. The Oversight Board should not allow this opportunity to pass. Continuing to allow the normal order of third-party fact checking to be supplanted by top-down truth arbitration saps third-party fact checking of legitimacy, encourages censorship creep, irresponsibly empowers fallible authorities, and implicates Meta in their failures. A permanent state of exception would be bad for Meta’s platforms’ governance.
Even before Covid, Meta retained ultimate control of its platforms’ governance. However, it formalized and standardized its content moderation by promulgating platform policies, moderation systems, and community standards. These community standards act as law for its platforms. One such system is Meta’s third-party fact checking program.
Instead of resolving factual questions itself, Meta maintains a stable of third-party fact checkers — mostly journalistic institutions — which can choose to evaluate and label disputed truth claims on Meta’s platforms. Meta takes action on labeled content, but it does not decide which content is labeled false. This allowed it to avoid becoming an “arbiter of truth.”
However, in response to Covid-19, Meta superseded this system with an alternative method for dealing with disputed factual claims about the novel virus. Meta has identified 80 distinct false claims about Covid-19 that it removes on sight. While Meta describes its process of false claim identification as “rely[ing] on reports and official statements from credible health organizations, such as the WHO and certain governmental health organizations,” it is Meta that decides which content includes these false claims.
Thus, for the past two years, Meta’s normal rules for handling factual disputes have, in this area, been supplanted by exceptional measures. Indeed, in its Policy Advisory Opinion Request, Meta describes the adoption of its Covid misinformation policy as an “extraordinary step”:
However, in January 2020, based on the rapidly unfolding COVID-19 pandemic, we took the extraordinary step of removing entire categories of misinformation about the pandemic from our platforms.1
If Meta maintains its present state of exception indefinitely, allowing the Covid emergency to permanently change its content moderation paradigms, emergencies will become a means of altering platform policy rather than extraordinary protective measures.
These measures reified Meta’s sovereignty at the expense of its moderation policies’ legitimacy. After all, in the classic Schmittian formulation, “Sovereign is he who decides on the exception.”
This sapping of legitimacy intensifies the longer these measures are maintained. Whatever we think of normal policies being unsuited to a moment of emergency, what are we to think of their apparent continued unsuitability even as the moment of crisis passes?
Creep and Permanence
Layering extraordinary measures on top of the normal rules not only delegitimizes the normal order, it also tends to make the emergency measures permanent. The longer exceptions persist, the stickier they become. They gain constituencies, status quo bias begins to work in their favor, and they are increasingly viewed as integral parts of the policy apparatus, rather than exceptions to it.
This is not a problem unique to Meta or social media governance. Pragmatic or temporary government necessities have a long track record of being used to permanently alter or circumvent the rule of law. In Napoleonic France, interwar Italy, Weimar Germany, and post-9/11 America, ostensibly temporary emergency measures permanently altered the rule of law.2 What were once viewed as extraordinary measures, such as demanding travelers remove their shoes at the airport, or indefinitely detaining foreign nationals suspected of terrorism, have become ordinary. Twenty-one years after Richard Reid’s failed shoe-bombing attempt, we still remove our shoes as we pass through airport security, and 36 prisoners remain at Guantanamo Bay.
Although the restrictions on liberty produced by Meta’s Covid misinformation policy are not nearly as severe, they represent just as drastic a turn away from Meta’s prior commitment to refrain from making determinations of fact and fiction.
Although Meta has taken emergency steps to remove health misinformation twice in the past, those interventions were geographically limited responses to particular rumors, minimizing opportunities for mission creep and marking them as clear exceptions to the global policy norms. In contrast, the current exception applies globally, and therefore runs a greater risk of being conflated with the normal order of moderation. One middle-ground solution might be to preserve emergency Covid misinformation removal policies only in locales surpassing some case-rate threshold, making them a local exception rather than the norm. As well, the breadth of claims included in the current exceptional policy makes it particularly vulnerable to scope creep. The policy has already expanded to cover non-Covid-specific claims about vaccines, which might discourage use of the Covid vaccine, and 5G anxieties that are often commingled with Covid misinformation.
Fallibility
The authorities that Meta relies upon are not infallible. Treating their official advice as a source of absolute truth threatens to enforce error at scale and quash vital dissent. It also puts Meta in the awkward position of having to override health officials’ judgment when it becomes clearly erroneous. In May 2021, Meta rescinded its nearly year-and-a-half-long prohibition on claims that Covid-19 was man-made or manufactured after a Wall Street Journal report lent new institutional credence to theories that the virus had leaked from a lab.3 How much speech about lab-leak theories was removed during this time? This incident puts the lie to Meta’s claim that, “As with its overall approach to removing misinformation from its platforms,” under the exceptional Covid misinformation policy, “Meta would not make its own truth or falsity assessments regarding harmful health misinformation.”4 In choosing to police claims about Covid-19 itself, Meta must either treat health authorities as infallible even in the face of clear errors, or exercise its sovereign authority to correct them, revealing itself as the true arbiter of truth.
Although the Oversight Board has heard two cases involving Meta’s Covid misinformation policy, in the first, Meta exercised its right to refrain from adopting the board’s policy recommendations, greatly limiting the impact of the board’s decision. In the second, the board upheld Meta’s decision to refrain from removing content, so that case presents no conflict of authority. Indeed, even this review has been prompted by Meta, rather than the Oversight Board. Nevertheless, it presents a valuable opportunity to return to the normal order of content moderation.
Fact Checking
The third-party fact checking program is not without its critics. It may need reform. However, the efficacy and fairness of the third-party fact checking program is not the issue before the Oversight Board today. Whatever its flaws, the third-party fact checking program is Meta’s normal process for handling factual disputes. It is a method of handling contested claims that accords with Mark Zuckerberg’s belief that “Facebook or internet platforms in general should not be arbiters of truth.”5 It was not implemented as an extraordinary measure in response to a recognized emergency.
Allowing the third-party fact checking program to handle disputed Covid claims would return Meta’s content moderation to its normal order. It also might reveal deficiencies in the program that have been harder to recognize or address while contentious claims are handled via extraordinary measures.
While Covid-19 persists, the emergency that necessitated, or at least justified, extraordinary action has waned. The Oversight Board should make use of this opportunity to formally bring Meta’s Covid-19 content moderation state of exception to a close, and to avoid the establishment of a “new normal” of platform responsibility for policing users’ truth claims.
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.