Thursday, October 20, 2022

AG James Wants To Hold Platforms Liable For Mass Murders

There are black holes on the internet where few of us normies tread, and no doubt if we unwittingly stumbled into 4chan, our reaction would be shock, as the unpleasantness of certain unfiltered humanity would make us question how our species survived. And to be fair to New York Attorney General Letitia James, it can well be a festering, pus-filled sore that any decent person would want cured.

But as much as we may hope and pray such black holes didn't exist, and that our fellow humans wouldn't be this way, can it be stopped? Tish James calls for criminalizing the broadcasting and distribution of videos created by mass murderers on social media, and civil liability for the platforms that fail to prevent it from happening.

Fringe Platforms Fuel Radicalization: Anonymous, virtually unmoderated websites and platforms radicalized the shooter. By his own account, the Buffalo shooter’s path towards becoming a white supremacist terrorist began upon viewing on the 4chan website a brief clip of a mass shooting at a mosque in Christchurch, New Zealand. His radicalization deepened further through engagement with virulent racist and antisemitic content posted by users on 4chan. The anonymity offered by 4chan and platforms like it, and their refusal to moderate content in any meaningful way ensures that these platforms are and remain breeding grounds for racist hate speech and radicalization. In the wake of the Buffalo shooting, graphic video and images of the shooting proliferated through 4chan more than any other site viewed by my office. When discussing its policy on such content, a head moderator said that “it’s not even against the rules” because “the footage itself isn’t illegal, any more than footage of any act of violence is illegal.” In the absence of changes to the law, platforms like 4chan will not take meaningful action to prevent the proliferation of this kind of content on its site.

There are two versions of how video of these atrocities ends up online, the first being that the killer intentionally livestreams his crimes for glory, which, James argues, encourages other psychos to commit their own mass murders to gain similar recognition. This may involve only the shooter, or may involve someone else "acting in concert" with the shooter to capture the crimes.

As Eugene Volokh notes, there is a fairly good possibility that the second person would be found to be acting in concert, and thus bear criminal culpability for the underlying crime.

It’s certainly a crime to commit homicide (assuming the proposal would be limited to criminal homicide, and not to self-defense and the like), and to conspire with others to commit homicide. (Even if the only “acting in concert” is recording the images, that may well be viewed as purposeful aiding and abetting the killing, much as yelling words of encouragement during a crime can so qualify.) It’s not clear whether the First Amendment would allow the law to tack on extra punishment for photographing or videorecording the homicide. But certainly there are ample tools to punish such people for the homicide itself.

Of course, video-recording things that happen in public, whether police encounters, crimes or otherwise, is largely a right. And while the connection between the person doing the killing and the person doing the recording or livestreaming in furtherance of the killing could give rise to a claim of a content-based violation of speech, there may well be sufficient conduct involved that purposefully aids the killing so as to criminalize the conduct not for its recording or distribution, but for its empowerment and enabling of mass murder.

Which is why James offers a second prong: modifying Section 230 to allow for online platform liability for failing to prevent this content from being created and distributed.

Online Platforms Currently Lack Accountability: By his own account, the shooter’s path to radicalization was cemented by explicitly racist, bigoted, and violent content he viewed online on 4chan, Reddit, and elsewhere. He used the platform Discord to keep a private journal for months, where he wrote down his hateful beliefs and developed specific plans for equipping himself and perpetrating his massacre. He livestreamed his attack through both Twitch and Discord. In the wake of the attack, other users disseminated graphic video of his attack throughout the internet, everywhere from fringe websites to mainstream platforms like Facebook, Instagram, Twitter, and others. The First Amendment has no categorical exemption for hate speech; most of the content the shooter viewed is rankly offensive, but its creation and distribution cannot, constitutionally, be unlawful. Moreover, even when a user posts content that is unlawful, Section 230 of the Communications Decency Act of 1996 (CDA), codified at 47 U.S.C. § 230, largely insulates platforms from liability for claims related to their content moderation decisions.

James has a plan to “remedy” this insulation of online platforms from liability.

We also recommend imposing civil liability for the distribution and transmission of this content, including making liable online platforms that fail to take reasonable steps to prevent unlawful violent criminal content from appearing on the platform. Significant penalties, sufficient to realize the goal of deterrence, should be levied in cases where an online platform fails both to take such reasonable steps and to prevent the transmission of content that is captured by or created by the perpetrator of a homicide, or one working in concert with the perpetrator of a homicide, and that depicts a homicide.

Of course, “unlawful violent content” covers the murder of George Floyd as well as the Buffalo shooter. What would constitute “reasonable steps” will likely be whatever an armchair AG decides should have been done after the fact, even though content moderation at scale is essentially impossible given the volume of content and the limits of human oversight and algorithmic nuance.

And Volokh is quite right to note that should such duties be imposed on platforms, it will almost invariably mean that content will be removed with a heavy hand, since platforms have no desire to be held liable for the stuff others put on them.

The First Amendment protects people’s rights to convey images of crime, which often reveal important information about what happened, how it might have been prevented, and the like—whether those images were captured by criminals or by innocent witnesses. And the First Amendment protects people’s rights as recipients of information to view such images.

It’s hard to fault James for wanting to do something to prevent more mass murders. The problem is that some human beings are bad people who do bad things and, despite the rosiest of aspirations, won’t be stopped with a little empathy and validation. But shifting the burden to the periphery, such as gun makers and online platforms, implicates the rights of the rest of us as well as the bad dudes and mass murderers. It’s not that simple.
