A carefully posed photo of dangerous driving attracted some attention online in early May. Taken from the driver’s seat of a Nissan, the photo shows the photographer driving, doing 90 mph as he brandishes a handgun with his finger resting on the trigger. To make matters worse, an alcoholic cider is propped against the dash. This extensive set of unsafe behaviors was intended to outrage, offend, and attract attention, all goals it undoubtedly met. And such foolishness is an invitation to a lengthy imprisonment. But it would be a mistake to treat Nissan, Heckler & Koch, Angry Orchard Hard Cider, the driver’s cell phone manufacturer, and whatever platform he used to share the photo as responsible for his misbehavior.
Unfortunately, two ongoing lawsuits against Snapchat apply this logic to the app’s speed filter feature. Alongside other sensor-based filters, like altimeters and location-based geofilters, Snapchat provides a speedometer filter that superimposes the user’s current speed over a photograph. Passengers can use the filter safely in all manner of vehicles, from boats to airplanes. However, it can also be used dangerously by reckless drivers speeding on public roads in pursuit of a high speedometer reading.
In September 2015, teen driver Crystal McGee was allegedly traveling at over 100 miles per hour while using Snapchat when she struck Wentworth Maynard’s vehicle. McGee was later charged with causing serious injury by vehicle, but Maynard sued not only McGee but also Snapchat, attempting to hold the company responsible for McGee’s reckless driving.
In most cases, Section 230 of the Communications Decency Act immunizes providers of an “interactive computer service” against liability for users’ misuse of their publishing tools. The law prevents social media platforms from being treated as the “publisher or speaker” of user-generated content.
Indeed, the case was initially dismissed on Section 230 grounds, but this decision was reversed by the Georgia Court of Appeals. The court reasoned that because McGee did not actually post the photo, Snapchat was not being treated as the publisher of her speech, but rather as the creator of a dangerous product that had somehow, per Maynard’s complaint, “facilitated McGee’s excessive speeding.” The court allowed the case to go forward because the suit “seek[s] to hold Snapchat liable for its own conduct, principally for the creation of the Speed Filter and its failure to warn users that the Speed Filter could encourage speeding and unsafe driving practices.”
It’s hard to see how the existence of Snapchat’s speedometer encouraged Crystal McGee to drive at 113 miles per hour on a busy road. Snapchat doesn’t reward users for achieving high speedometer readings, and opening the filter triggers a popup warning: “Please, DO NOT Snap and drive.” Snapchat may have made it easier for her to record and share her behavior, but reckless drivers have long taken photos of their speed as displayed on the dash. One might just as easily claim that the existence of dashboard speedometers similarly encourages speeding. Arguably, driving fast might be less alluring without a way to determine how fast you’re actually going. The items in the photo above all contribute to its outrageousness, yet none of the companies represented are responsible for the reckless tableau.
In Lemmon v. Snap, a similar case dismissed with leave to amend in February, the District Court for the Central District of California found that Section 230 protected Snapchat from liability because the filter “is a neutral tool, which can be utilized for both proper and improper purposes. The Speed Filter is essentially a speedometer tool, which allows Defendant’s users to capture and share their speeds with others.” While a user might behave recklessly in pursuit of a high recorded speed, the decision is theirs and theirs alone. The court describes the recorded speed as content submitted by the user. “While a user might use the Speed Filter to Snap a high number, the selection of this content (or number) appears to be entirely left to the user,” the Court reasoned. Snapchat doesn’t play a role in selecting the user’s speed, making it a “neutral tool” protected by Section 230.
While Maynard and Lemmon may seem like instances of overly litigious ambulance-chasing, and Snapchat would likely win these cases even in the absence of Section 230, the suits’ sweeping theory of intermediary liability has supporters in Congress.
In a recent Federalist Society teleforum, Josh Divine, Deputy Counsel to Sen. Josh Hawley, argued that Snapchat should be held responsible for users’ misuse of the filter. Divine asserts that “most people recognize that this kind of tool is primarily attractive to reckless drivers and indeed encourages reckless driving,” ignoring both the varied, user-defined applications of the filter and its inbuilt warning. He contends that plaintiffs in the speed filter lawsuits are “complaining about a reckless platform design decision” rather than anything “specific to speech.” However, Maynard and similar suits hinge on platform design’s facilitation of user speech. Snapchat is being sued on the theory that it contributed to the plaintiffs’ injuries by providing a tool that allows speakers to easily tell others how fast they’re moving. Any remedy would involve limiting the sorts of speech that Snapchat can host.
Section 230 was intended to protect the creation and operation of communicative tools like Snapchat. In Maynard, litigants attempt to circumvent Section 230 by, in essence, suing over Snapchat’s non-use, alleging that Section 230 should not apply because McGee never actually published the photos taken before the crash. If merely creating a tool that can be used illegally or dangerously opens platforms to liability, Section 230 offers little real protection, and such a determination would imperil more than camera-speedometer amalgamations. Responsibility for one’s behavior, whether the dangerous acts pictured above or the reckless driving at issue in Lemmon and Maynard, should rest with the individual.