Senator Ted Cruz's (R-TX) staff on the Senate Commerce Committee recently published a report that investigates several instances of what it calls "Online Service Providers … silencing conservatives" by relying "upon biased left-wing organizations." While these specific examples may be viewed as mere anecdotes, the report dives deeper than many other accusations of political bias and tries to understand how exactly these companies reached their decisions to de-platform various conservatives.
The report concludes that "The ideal solution is for market forces to correct Online Service Providers' discriminatory policies." While the report also embraces some limited legislative fixes that do not live up to this ideal, its emphasis on understanding how content decisions are made and how the market can help address real or perceived bias is constructive. In this same vein, I will be publishing a paper next month that explains how social media content moderation works and how the market and civil society can support greater expression and greater choice online.
It's worth noting that it is and should be the right of such companies to set their own policies and deny their services to those with whom they disagree. But, the report argues, then we should dispense with the pretense that these are neutral companies trying to offer their services to most Americans so that the market can satisfy those no longer being served by these large tech companies. And unlike many other criticisms of tech company decisions, the report spoke directly with the companies to understand how and why these decisions were made.
The first highlighted decision was by Slack, a workplace and communications system, to suspend the right-wing social media influencer Libs of TikTok because of its various posts and reporting about what myriad LGBT organizations and activists were themselves posting online. The report notes that the deplatforming effectively caused Libs of TikTok employees to lose their prior communications on Slack, causing significant disruption to their organization's operations.
The report then looks at Eventbrite's cancellation of various events by groups like Young America's Foundation, College Republicans, and others hosting conservative commentator Matt Walsh or his documentary What is a Woman? for questioning a progressive view of sex and gender. Similarly, Eventbrite cancelled an event featuring Riley Gaines, an elite college swimmer, for her views regarding transgender athletes participating in female sports. These cancellations often caused significant disruption and costs to the event organizers.
The report also notes other instances of deplatforming, including the immigration-critical Foundation for American Immigration Reform (FAIR), the Independent Women's Forum, and other organizations that were excluded by these and other online service providers because of mainstream political and social views on important issues of our day.
The enforcement described in this report is different from your typical social media content moderation. When dealing with billions of pieces of content, AI-powered enforcement tools are necessary to weed through the sheer amount of material, often paired with large contingents of human moderators who must make relatively quick moderation decisions. This report describes more comprehensive reviews by expert and executive trust and safety teams that are looking at a broad range of factors.
This was one of my roles when I was a member of Meta's content policy team and was often reserved for the most difficult and socially or politically sensitive content. And while using highly trained and expert moderation for difficult decisions makes sense, it also risks allowing external and internal biases to influence the outcome.
For example, the committee report identified outside organizations that may exert significant influence over not just the writing of various policies but also how those policies are enforced. Eventbrite testified to the committee that it "relies on third-party sources, including Southern Poverty Law Center (SPLC) and the Anti-Defamation League, in determining whether an organization's event violates the Eventbrite Community Guidelines." Similarly, Slack told the committee that "it generally relies on 'third-party experts' and 'industry-recognized resources'" in enforcing its policies.
The way this works in practice is that external and generally progressive "experts" and "partners" frequently, persistently, and aggressively lobby for certain users, pieces, or types of content to be removed as hateful or otherwise harmful. It is rare for right-leaning or libertarian groups to be as trusted internally or to report as much content as these progressive "partners."
Often, the content did not violate the clear letter of the policies, but pressure was applied by these organizations through media campaigns, organized boycotts, or one-sided research designed to make companies appear complicit in some harmful content. The result of such external pressure, together with receptive internal teams, is that sometimes companies cave or willingly support the suppression of certain viewpoints. They invoke high-level principles or certain contexts rather than specific policy lines or may even change the policy to align with activist demands.
This can be seen at work in this committee report, in which the service providers often avoid detailing how a user specifically violated their policies but instead refer to vague principles or contextual information they believe to be important. For example:
- When asked if they were removing events featuring Riley Gaines because she made an X post about how women lack a Y chromosome, Eventbrite said the post "speaks for itself."
- "[W]hen asked multiple times if Eventbrite would remove another event concerning women's sports that featured Riley Gaines, Eventbrite dodged the question."
- Libs of TikTok was "problematic" on Slack because of its "specific audience."
- Eventbrite cited "the overall tone and message" of the trailer for the What is a Woman? documentary … "in combination with Matt Walsh's related public statements," rather than anything specifically violating in the film, for removing various organizations' viewing parties of the film or even unrelated events with Matt Walsh.
- FAIR was de-platformed by Slack for being "affiliated with a known hate group" (likely referencing lists maintained by groups like the SPLC).
As I noted earlier, these organizations can be as vague, imperfect, or biased as they want in creating or enforcing their rules. And as the report concludes, "First Amendment protections generally do not apply to actions by private companies, who have the freedom to associate and do business with the customers they choose." Therefore, "the ideal solution is for market forces to correct Online Service Providers' discriminatory policies. If Online Service Providers continue to cancel conservative organizations, it should create a new demand for Online Service Providers that service the conservative market."
This is exactly right. Nothing is stopping new service providers from entering these fields to provide these important services, except for the threat of additional tech regulation by those on the right and the left. Those who want more expression online should make the case to current and prospective service providers that policies limiting significant amounts of socially important speech may not be serving these companies or the broader society well. Those who want more restricted speech have no problem aggressively making their case for less speech, leveraging increasing sympathy in academia, government, and society with their views as various metrics point to increasing conflict over the value of free expression.
Those of us who want to see greater free expression must make the case not only for the First Amendment's legal protection from government censorship but also for a culture of free expression, including to the companies that are limiting vibrant discussions of important social issues.
The research done by the Senate report is part of this effort to promote a broader culture of free expression, but it goes a bit too far in seeking legislation that makes some significant demands of private companies. Specifically, the report calls for intrusive transparency from companies regarding their policies and enforcement actions. While many of these recommendations may be best practices that I think companies would benefit from employing, I also know that sometimes companies don't want to provide complete transparency in their rules. A common reason is to prevent adversarial actors from gaming their policies.
Similarly, requiring specific transparency around why users are being punished or deplatformed appeals to our desire for due process and fairness, but companies don't need a reason, much less a clear or good reason, to remove users from their services. Ultimately, their lack of transparency and due process are creating bad experiences with their products that, unchecked, will likely result in alternative solutions emerging in the marketplace, a reality that the Senate report elsewhere understands.
Judge Learned Hand famously stated, "Liberty lies in the hearts of men and women; when it dies there, no constitution, no law, no court can even do much to help it. While it lies there it needs no constitution, no law, no court to save it." We must remind our fellow Americans why a culture of free expression is so important and how the free market can address bias and discrimination.