What's this Section 230 thing that guy is talking about?
I hear it's getting kids addicted to fentanyl over the Internet

You may have seen a recent skeet by Senator Dick Durbin (D-IL) saying that Section 230 is causing kids to get addicted to fentanyl over the Internet and so it must be repealed.
Sounds pretty scary, right? Bad law that lets Facebook do whatever they want regardless of consequences?
Well...not quite. If you want to learn more about Section 230, here's a read.
If there's one thing every American politician claims to care about, it's "holding Big Tech accountable."
It's a good way to score points, especially alongside "won't somebody please think of the children." Republicans worry that the Internet lets their children know that trans people exist (by the way: Trans People Exist). Democrats worry that the Internet (rather than the state of the world in general) is making their children depressed and getting them addicted to drugs.
Or they get to score points through a "free speech" lens as well. Democrats worry that it's too difficult to control speech - including hate, lies, and misinformation - on the Internet. Republicans worry that their views (read: hate, lies, and misinformation) are being censored by major platforms.
We'll see how committed politicians are to these views now that the world's richest person - who also owns one of the world's largest social media platforms - is running the government and bringing in lots of his Silicon Valley and venture capital friends. And now that other social media platforms are gutting their content moderation teams and explicitly allowing certain types of hate speech.
These politicians have, over the years, had a huge number of horrible ideas for "fixing the Internet" and "holding Big Tech accountable" - partially because they don't really know what they're talking about and have a limited understanding of what the Internet even is.
We don't have time or space here to talk about all of them over the years. Or even all of them currently in-flight. But let's talk about the focal point of many of these proposed reforms: Section 230.
Section 230
A B̶r̶i̶e̶f̶ History
Section 230 - a section of US Code Title 47 (Telecommunications) entitled "Protection for private blocking and screening of offensive material" - is in some ways the lynchpin that holds the Internet together. Or certainly it has been integral in the development of the Internet so far, in ways both good and bad.
It was passed in response to a growing number of lawsuits over user-generated content posted to online services. Two cases in particular, Cubby, Inc. v. CompuServe Inc. (from 1991) and Stratton Oakmont, Inc. v. Prodigy Services Co. (from 1995), are cited as its inspiration. In both cases, a service provider was brought to court for libel and defamation due to content posted by an external party (a subcontracted content manager in the former, a user in the latter), and the courts attempted to answer the question of whether the service provider could be held liable for information posted to its service by an external party.
Ultimately, the question was: Are service providers that host content generated by external parties to be treated as publishers of that content (and therefore legally liable for it) or merely as distributors (without liability)?
Oddly, despite the significant parallels between the two cases and despite the later case citing the earlier as precedent, they came to two different conclusions in ways that proponents of the Internet found to be disturbing.
In the CompuServe case, CompuServe was ruled to only be a distributor of the content and could not be held liable for that content. The content generation and management had been contracted (and later subcontracted) out. Since CompuServe did not itself engage in any active review or moderation of the content, they had no reason to be aware of the defamatory content and could not be held liable for it.
In the Prodigy case, Prodigy was ruled to be a publisher of the content created by anonymous users while citing the CompuServe case as precedent. Weird, right? How did the court come to an opposite conclusion? The court held that, since Prodigy had a posted content policy for the forum and enforced that policy via content moderation, they were exercising editorial control over the content of the forum and could be held liable for any content present on the forum.
In the world created by these two rulings, any Internet system that hosted content created by an external party had two options:
- Make no effort to find and remove offensive content (and be absolved of any liability for that content)
- Make a good faith effort to find and remove offensive content (but be held liable if you miss any)
Option 1 results in a shitty user experience and option 2 opens providers up to liability for any content they do not remove.
This creates a perverse incentive: service providers are disincentivized from performing any content moderation on their services, because they can be held liable for any content that remains - including things like libel or defamatory remarks whose truthfulness the service provider is likely in no position to evaluate.
In such an environment, the only sane option is the implicit option 3: Do not host any user-generated content.
And that's why, to this day, there is no user-generated content available on the Internet. No wait that's not right. That's why Section 230 exists.
So what is it?
Section 230 offers "protection for good samaritan blocking and screening of offensive material." That is, it makes moderation of user-generated content possible by removing risk of liability.
No post would be complete, though, without the quote:
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
So Facebook (a provider of an interactive computer service) cannot be sued for defamatory remarks that you (an information content provider) post on the service. We'll talk more about other protected scenarios and what exactly that protection entails in a minute.
But it goes further than that, giving service providers explicit rights to enforce their own content policy:
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
Drastically simplified, this - among other things - is why Bluesky is not "censoring" you when they delete posts in accordance with their content policy even if the content is not strictly illegal. Nor is Instagram causing harm to you when they give you a 7-day ban because their automated systems thought a picture of your dinner contained sexual content.
This doesn't imply immunity for knowingly hosting content that violates criminal or copyright law, but does provide service providers the latitude to manage their own services and enforce their own community standards while still allowing for user-generated content.
And, well, that's the basics of it. If you're looking for a bit more reading, the Subsequent History and Case Law sections of its Wikipedia page are pretty voluminous, if you've got the time and inclination, but we'll be dipping into and summarizing some of that content in just a second.
What's the problem?
Without going over everything on that aforelinked Wikipedia page, let's just say that especially since 2016 (I wonder why) Section 230 has come under fire from all sides.
Some feel that the "distributor" protection has allowed for the easy spread of misinformation and disinformation about the 2016 and 2020 elections, lies about covid and vaccines, and I guess getting kids hooked on fentanyl(?). That a stronger obligation for platforms to remove certain types of speech would reduce certain harms they cause to society.
Some feel that the protection for "good faith" moderation is too broad and allows platforms to silence certain opinions and viewpoints. That many of these platforms function as a digital town square and should be required to allow any politically-protected speech.
"Holding Big Tech accountable" sounds like something I should be 100% behind, right? Let's repeal Section 230 and really stick it to Elon Musk, right?
Well...as usual, it's not that simple.
So I'm going to temporarily step off of my "Big Tech bad" soapbox and get over onto my "consolidation of the Internet bad" soapbox. Ultimately they're just different parts of the same soapbox, but I do need to be precise.
When you think of "the Internet," especially in the context of "user-generated content" you're probably thinking of Facebook, Instagram, X...megacorp social media sites. But that's not the entirety of the Internet. Let's think through a few more examples.
- Is DigitalOcean liable for any defamatory claims I make on this blog?
- Is the Ghost Foundation liable for my other blog?
- Is the operator of some random Mastodon server liable for posts federating in from other instances?
- Is a news site liable for comments submitted under its articles?
- Is a library hosting an informational webinar for its community liable for viewer-submitted questions?
Aside from a duty to remove content that violates criminal or copyright law - especially once notified of it - Section 230 says no.
And let's remember Section 230 also protects service providers moderating their service by removing constitutionally-protected content that is against the service's content policy.
Maybe you think Facebook should allow all constitutionally-protected speech, but should me deleting an offensive, spammy, or even just irrelevant comment under a post open me up to a lawsuit?
So how do we fix it?
Well it's that time again. The time where there are no easy answers.
Repealing 230 to "stick it to Big Tech" will, counterintuitively, further consolidate the Internet and cause immediate and immense harm to "the little guys."
You see, Section 230 can shield service providers from lengthy and expensive litigation. If a defendant can show that a case meets the Section 230 criteria, the case is summarily dismissed with no further fuss. If they cannot meet that bar, the case proceeds to litigation, where the defendant may or may not be found liable on other grounds.
Without Section 230 protections in place, service providers are open to lawsuits that, even if they end up winning, can drain a person or company of resources. Expensive litigation - or even the threat thereof - can allow rich and powerful companies and individuals to create a chilling effect on speech throughout the Internet and lock smaller competitors out of the market.
There have been several proposals for amendments targeting Big Tech for increased accountability without removing protections from smaller businesses, communities, or individuals - for example, thresholds based on number of employees, market cap, revenue, or user counts. Or removing protections only from monetized content.
Each of these approaches has pros and cons (see the below link on regulating services by size for more), but when evaluating them it's important to include experts from the impacted fields in the discussions, not just people who want to be able to post conspiracy theories.
Ultimately, and without adding another 2000 words to this post, it seems like Section 230 is delicate enough that it needs a deft hand if we're going to touch it. Or better yet, leave it alone and explore alternate solutions (most of which are a whole selection of different bad ideas for another post) to fix any issues you have with Big Tech.
Because killing Section 230 or thoughtlessly amending it isn't the quick win many politicians seem to think it is.
Further Reading
Lots of people much smarter than me have written much better articles about Section 230. Its importance, potential consequences of repeal or amendment, possible alternative solutions, etc. Please do give them a read if you want to learn more.
- EFF on Section 230
- TechDirt "Hello! You've Been Referred Here Because You're Wrong About Section 230 Of The Communications Decency Act"
- ITIF on Section 230
- Library Futures "More than a Big Tech Issue"
- TechDirt on a proposed amendment from a few years ago
- A paper on Regulating Internet Services by Size (found linked from here)
- The Twenty-Six Words That Created the Internet by Jeff Kosseff
- And of course the Wikipedia page