About Section 230


What you need to know to form an opinion about Section 230, the “safe harbor” law in the US for tech platforms.

Featuring Tom Merritt.

MP3

Please SUBSCRIBE HERE.

A special thanks to all our supporters–without you, none of this would be possible.

Thanks to Kevin MacLeod of Incompetech.com for the theme music.

Thanks to Garrett Weinzierl for the logo!

Thanks to our mods, Kylde, Jack_Shid, KAPT_Kipper, and scottierowland on the subreddit

Send us email at [email protected]

Episode transcript:

Two cases are before the US Supreme Court regarding protections provided by Section 230 of the US Communications Decency Act. Gonzalez v. Google claims that a platform, in this case YouTube, should be liable for content it recommends to users. Twitter v. Taamneh argues that Twitter provided unlawful material support to terrorists by failing to remove their accounts from its platform.
A lot of people are talking about these cases. And a lot of well-intentioned and well-informed people are going to make arguments based on misunderstandings of Section 230. So in this special episode I want to cover just what Section 230 is and what it isn’t. In other words I’ll help you Know a Little More about Section 230.

We covered the history and meaning of Section 230 in depth in the episode About Safe Harbor in July 2020. So if you want the deep dive please listen to that.
This episode will focus on how to properly explain and think about Section 230 no matter what argument you may or may not be trying to make. You may think Section 230 promotes censorship. You may think it protects big tech companies from responsibility. You may think it should be repealed. Those are all reasonable positions to take. But I often hear people argue these sorts of positions from a starting point that is wrong. I just want to give you the correct starting point from which you can make your argument.
So let’s start with the folks who say we should just get rid of it. There is a misconception that if we got rid of Section 230, companies would have to take responsibility for the content on their platforms, or that they would have to stop censoring. Neither of those things is assured.
Without Section 230, ANY platform, and it’s worth pointing out this applies to a forum you might run on your own website as well as to Facebook, would be seen in the eyes of the law as either a publisher of information or a distributor. A publisher is responsible for what it publishes. A distributor is not responsible for the contents of what it distributes.
The easiest way to think about this is a brick-and-mortar bookstore. The publishers of the books and magazines it sells are responsible for what’s in the books and magazines. The bookstore is just the distributor. In fact a 1959 Supreme Court case, Smith v. California, ruled that a bookstore owner cannot reasonably be expected to know the content of every book the store sells. The owner should only be liable if they knew or should have known that selling something was specifically illegal. Otherwise the publisher is liable for what’s in the book or magazine.
Now let’s think about that for a minute. The bookstore can decide what magazines to carry. But it’s not deciding what’s in the magazines. And it still isn’t allowed to sell magazines that it knows are illegal. Also of note: letters to the editor published in a magazine are still the responsibility of the publisher. Just because a reader wrote the letter doesn’t free the publisher from liability. Because the magazine’s publisher chose to publish it. It exercised editorial control.
So the bookstore gets protection because it’s not exercising editorial control of what’s in the books.
Fast forward to the 1990s. CompuServe and Prodigy are vibrant new parts of the internet where people are talking to each other like never before.
It’s April 1990. Sinead O’Connor’s new song “Nothing Compares 2 U” (written by Prince) tops the Billboard charts.
Robert Blanchard’s company Cubby Inc. has developed Skuttlebut – with a K – a database for TV news and radio gossip. It’s a new competitor for CompuServe’s Rumorville. Rumorville is published by Don Fitzpatrick Associates on CompuServe’s Journalism Forum. Skuttlebut and Rumorville are in stiff competition for the burgeoning online audience that wants TV and radio news industry gossip. This is FIVE YEARS before the Drudge Report, mind you.
In the heat of the competition Rumorville USA posts that Skuttlebut has been getting info from a back door at Rumorville, and that Skuttlebut’s owner, Robert Blanchard, got “bounced” by WABC. And – and you don’t do this, folks – it described Skuttlebut as a “scam.”
Blanchard and Cubby Inc. sued Don Fitzpatrick Associates, but also sued CompuServe as the publisher. But here’s the thing. CompuServe did not review Rumorville’s content. Once it was uploaded it was available. CompuServe also didn’t get any money from Rumorville. The only money it made was from subscribers to CompuServe itself, whether they read Rumorville or not.
In Cubby, Inc. v. CompuServe, the judge ruled that CompuServe was not a publisher. It was a distributor. It could not reasonably know what was in the thousands of publications it carried on its service. Therefore, like a bookstore, CompuServe was not liable for what was published in Rumorville USA.
Reminder. This is without Section 230. The platform was not exercising control over the content so it was not liable for what was in it.
On to October 1994. Boyz II Men’s “I’ll Make Love to You” is dominating the charts in a long run at number one.
Prodigy’s Money Talk message board is still awash in talk about the bond market crisis. An anonymous user posts that securities investment firm Stratton Oakmont committed criminal and fraudulent acts in connection with a stock IPO. Stratton Oakmont files a lawsuit against Prodigy alleging the company is the publisher of the information.
So you’d think, given the CompuServe case, that Prodigy is in good shape. It didn’t publish the comments; the commenter did.
Except. It’s been a few years and a few raging internet flame wars, and Prodigy, like many other platforms, has developed some Content Guidelines for users to follow. It also has Board Leaders who are charged with enforcing those guidelines. And Prodigy even uses some automated software to screen for offensive language. This is all good community moderation practice, right? A clear set of guidelines. Consequences if you violate them. And even some automated ways to keep some of the bad stuff from ever showing up.
The court looked at that and said, well, looks to us like you’re exercising editorial control. You’re deciding who gets to post what. That feels a lot more like the letters to the editor than it does the bookstore. The court wrote “Prodigy’s conscious choice, to gain the benefits of editorial control, has opened it up to a greater liability than CompuServe and other computer networks that make no such choice.”
In Stratton Oakmont v. Prodigy, the court ruled in favor of Stratton Oakmont.
After that case, the law stood like this: courts will give you the protection of a distributor as long as you don’t moderate. If you moderate the content, you’re on the hook for it.
So in other words before Section 230, you could either leave everything up or you’d have to be responsible for everything, meaning you’d have to pre-screen all posts. Your choice is either zero moderation or prior restraint.
Republican Chris Cox and Democrat Ron Wyden both thought this was not an ideal situation. So they wrote Section 230 of the Communications Decency Act, which reads: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
Those are the 26 words usually cited as Section 230. But that’s just paragraph (1) of subsection (c). There are a lot of other subsections covering definitions, the findings behind the law and so on. But there’s a second paragraph of subsection (c) which is also important. It’s titled “Civil liability.” It reads:
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
In other words, even if it’s protected free speech, the platform can take down content it finds objectionable and not lose its protections from liability for other content.
All of this is a long way to say: if the platform didn’t create the content, it’s not responsible for it… with a few exceptions.
This is another part of the discussion of Section 230 that gets left out. Section 230 specifically says that it has no effect on federal criminal law, intellectual property law, communications privacy law or sex trafficking law. So the DMCA, for example, still has to be followed. You have to respond to copyright takedown notices.
So back to the two Supreme Court cases Gonzalez v. Google and Twitter v. Taamneh.
Section 230 does not let Facebook publish whatever it wants without being responsible. It just means Facebook isn’t on the hook for what I post merely because it removes other posts. It’s an interesting question whether recommendations count as content created by the platform or not. Recommending content would certainly have counted as editorial control before Section 230, but Section 230 was put in place specifically to allow a measure of editorial control without having to take responsibility for all posts.
It’s also an interesting question whether “terrorist” content qualifies as criminal content, which Section 230 does not protect. And should Twitter have known about it and removed the accounts?
Bearing on both those questions is one more case that tested Section 230 shortly after it became law.
It’s April 25, 1995. Montell Jordan’s “This Is How We Do It” tops the charts.
And someone has posted a message on an AOL bulletin board called “Naughty Oklahoma T-Shirts” advertising shirts with offensive and tasteless slogans about the Oklahoma City bombing, which had happened six days before. The posting listed the phone number of Kenneth Zeran of Seattle, Washington, who had no knowledge of the posting. He then received a high volume of calls, mostly angry about the post. Some were death threats. Zeran called AOL, which said it would remove the post. However, a new post appeared the next day, and more followed over the next four days. One of the posts was picked up by a radio announcer at KRXO in Oklahoma City, who encouraged listeners to call the number. Zeran required police protection and sued KRXO and, separately, AOL.
In its decision, the United States Court of Appeals for the Fourth Circuit wrote “It would be impossible for service providers to screen each of their millions of postings for possible problems. Faced with potential liability for each message republished by their services, interactive computer service providers might choose to severely restrict the number and type of messages posted. Congress considered the weight of the speech interests implicated and chose to immunize service providers to avoid any such restrictive effect.”
It also wrote that Section 230 “creates a federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service. Thus, lawsuits seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial functions — such as deciding whether to publish, withdraw, postpone or alter content — are barred.”
Zeran argued that even if AOL wasn’t a publisher, it was a distributor, and under the 1959 case a distributor would still be responsible for speech it knew was defamatory. And Zeran argued AOL knew, because he called them about it after the first post. The court, however, said that AOL is a publisher, not a distributor, plain and simple, and Section 230 shields it from the liability normally attached to a publisher. So you can’t just redefine it as a distributor.
This ended up providing stronger protection than the 1959 case gave distributors. Instead of having to take content down once they knew about it, internet services were given a broader shield.
And that became the principal justification for CDA 230.
And if the Supreme Court follows that precedent, it might also consider recommendations to be publishing behavior and therefore protected. That’s not the only way it could rule, but it is a possibility.
In the end, what I want folks to take away is that Section 230 doesn’t free a tech platform to do whatever it wants. It frees a platform to choose to moderate and exercise editorial control over the posts of others without having to assume responsibility for the thousands, and now millions, of posts made every day.
It’s reasonable to argue that perhaps there are some responsibilities that should be restored to tech platforms through legislation. I think it’s worth pointing out that repealing Section 230 altogether would not necessarily achieve that.
So I hope you now have a firmer basis for your opinion, whatever it is. In other words, I hope you know a little more about Section 230.