Two pending Supreme Court cases interpreting a 1996 law could drastically alter the way we interact online. That law, Section 230 of the Communications Decency Act, is often disparaged as a handout to Big Tech, but that misses the point. Section 230 promotes free speech by removing strong incentives for platforms to limit what we can say and do online.
Under Section 230, platforms generally may not be held liable for the content posted by their users. Without this protection, important speech, such as communication about abortion, could be silenced, especially in states where abortion is outlawed. Movements like #MeToo and #BLM might never have caught on if platforms had worried that they’d be sued, even improperly, for defamation or other claims. People could have found their voices censored, especially when talking about ideas under political attack today: race and racism, sexuality, and gender justice. The internet as we know it would be a very different place.
Before Section 230, companies that cultivated online communities were legally responsible for what their users posted, while those that exercised no editorial control were not. The natural consequence was that some platforms limited conversations to only the most uncontroversial matters, while others had an incentive to host free-for-all spaces, tolerating pornographic, abusive, or otherwise unwanted content to avoid any legal responsibility. Congress wisely recognized that the internet could be so much more and passed Section 230.
While Section 230 immunizes online platforms from legal liability for the posts, comments, and other messages contributed by their users, it does not free platforms from liability for content that violates federal criminal law, intellectual property rights, or a few other categories of legal obligations. Section 230 also does not apply to platform conduct that falls outside the publication of others’ content, such as discriminatory targeting of ads for housing or employment on the basis of race or sex.
Nor does the law shield a platform’s own discrimination. Section 230 provides no safe harbor for platforms that give advertisers tools designed to target ads based on sex, race, or other statuses protected by civil rights laws, and no immunity from claims that a platform’s own ad-delivery algorithms are discriminatory. The ACLU recently explained why this conduct falls outside the scope of Section 230, and in these scenarios, where the alleged basis for liability is the platform’s own discrimination, it seeks to stop platforms from misusing or misinterpreting the law’s immunity.
Today, the internet enables people to communicate with one another at a previously impossible scale. It is one of the “principal sources for knowing current events, checking ads for employment, speaking and listening in the modern public square, and otherwise exploring the vast realms of human thought and knowledge,” as the Supreme Court recognized in Packingham v. North Carolina. At the same time, platforms remain free to manage user content, taking down problematic posts containing nudity, racist slurs, spam, or fraudulent information.
This term, the Supreme Court will consider the scope of the law’s protections in Twitter v. Taamneh and Gonzalez v. Google. Both cases were brought by family members of U.S. citizens killed in ISIS terrorist attacks. The suits allege that platforms, including Twitter and Google’s YouTube, “aided and abetted” the attacks by failing to adequately block or remove content promoting terrorism.
But Twitter and YouTube did not, and do not, have any intention of promoting terrorism. The videos the plaintiffs identified were posted by ISIS operatives and, while lawful, violated Twitter’s and YouTube’s terms of service; the companies would have removed them had they been flagged. Nor is there any allegation that the people behind the attacks were inspired by these videos.
The ACLU’s amicus brief in Twitter v. Taamneh argues that imposing liability under these circumstances would improperly chill speech. Of course, a platform could promote terrorism through its own policies and actions. But imposing liability merely for hosting content, without malicious intent or knowledge that a particular post furthered a specific criminal act, would squelch online speech and association. Overcautious takedowns already happen, as when Instagram confused a post about a landmark mosque with one about a terrorist group. Under such a liability regime, these errors would become the norm.
The Gonzalez case asks a different question: whether Section 230 immunity extends to amplified content. The plaintiffs argue that when platforms suggest content to users, as in “Up Next,” “You Might Like,” or “Recommended For You” features, those suggestions are not protected by Section 230. On this theory, a provider would remain immune for merely hosting content but could be held liable for highlighting it.
The ACLU filed an amicus brief in the Gonzalez case to explain why online platforms have no choice but to prioritize some content over other content, and why they should be immune from liability for those choices when the content comes from a third party. Given the vast amount of material posted every minute, platforms must select and organize content in order to display it in any usable manner. There is no way to present information visually to app or webpage users without making editorial choices that are, at the very least, implicit “recommendations.”
Moreover, organizing and recommending content helps us find what we are looking for, receive and create information, reach an audience, and build community. If Section 230 doesn’t apply to this kind of content organization, platforms will have an incentive to present information as a disorganized jumble and will feel pressure to host only the most innocuous content, the kind lawyers can be certain wouldn’t inspire anyone to sue.
Section 230 has allowed public expression on the internet to flourish. It has created space for social movements; enabled platforms to host the speech of activists and organizers; and allowed users and content creators on sites like Instagram, TikTok, and Twitch to reach an audience and make a living. Without it, the internet will be a far less hospitable place for human creativity, education, politics, and collaboration. If we lose Section 230, we stand to lose the internet as we know it.