Family members of demonstrators killed in Kenosha have filed a lawsuit against Facebook, alleging that the company failed to remove a group event organized by a local militia, some of whose members allegedly went on to commit violence.
Before the lawsuit was filed, Facebook’s CEO Mark Zuckerberg acknowledged the company’s role in the deadly incident, citing an “operational mistake” that left the group’s promoted event (though not its page) active before, during, and after the shooting, despite more than 400 flags being issued against the group itself. According to Zuckerberg, “the contractors, the reviewers, who the initial complaints were funneled to, didn’t, basically didn’t pick this up.”
The lawsuit is likely to bring new pressure on Facebook to improve the way it monitors and moderates groups, messages, and behavior on its platform. Critics have long noted that content moderation is hard, expensive work that demands skilled, well-supported people, yet Facebook has chosen to outsource this vital function to a far-flung workforce of freelance contractors.
Will Oremus, who argues convincingly in a Medium post that Facebook has “abdicated its responsibility to both its users and society at large” by outsourcing content moderation, attributes the decision in part to an engineer-centric corporate culture in which moderation is regarded as lowly janitorial work unworthy of serious staff attention.
It remains to be seen whether the claim that Facebook deliberately under-resourced content moderation becomes a central pillar of the lawsuit. But it is clear that in transport, commerce, medicine, and other industries, “operational mistakes,” especially those that result in the loss of life, are evaluated and resolved by some entity outside the organization that made them. When an airplane or bus crashes, the NTSB is called in; when a doctor accidentally kills a patient, the medical boards weigh in, and often the legal system does too, in the form of a trial.
Facebook’s defense will most likely rest on the safe-harbor provisions of Section 230 of the Communications Decency Act of 1996, which immunizes sites from liability for content posted on their platforms, much as railroad or bus companies are not generally responsible for the statements of their passengers. But this broad defense may not shield the company from other serious forms of liability, including defective product liability. If, for example, it were demonstrated that a fatal airplane crash was caused, or contributed to, by a corporate decision to outsource maintenance to contractors in a way that compromised safety, serious liability would certainly follow.
Keep your eye on this lawsuit, especially as it enters the discovery phase, because much about Facebook’s internal operations, including past “operational mistakes,” may soon be in the public eye. I expect that lawyers will liken Facebook’s approach to allocating moderation duties to that of a hypothetical airline that chooses the cheapest, not necessarily the safest, way to conduct its maintenance operations.