After right-wing extremists attacked the Capitol on Jan. 6, and Twitter banned then-President Donald Trump from further postings on its platform, some wondered why the social network waited so long to impose the ban. Others asked why Twitter silenced Trump and not other provocateurs, and raised questions about agenda-driven enforcement. Still others asked what right Twitter has, as a supposedly neutral platform for the exchange of ideas and information, to ban anyone at all. All good questions.
It’s hard to believe now, but there was a time when early online forums and content providers had to read and approve everything before it went on their sites. Quite simply, they were responsible for the content they published. The internet was a small place, and content control was manageable. Then came the internet revolution, which changed the way we communicate. Online speech, internet platforms and related responsibility for content moved front and center, and needed to be addressed.
Twenty-five years ago, Congress passed the Communications Decency Act of 1996. Section 230 of that law provides immunity for website publishers for third-party content. Under the terms of the law: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The law goes on to provide a form of Good Samaritan protection from civil liability for online operators who remove or moderate third-party materials they deem obscene or offensive, even if the speech is otherwise constitutionally protected, so long as the removal or moderation is done in good faith.
Section 230’s broad protections fueled the growth of social media and the wide sharing of information and opinions. It has provided protection for users to upload videos on YouTube and Vimeo, for Facebook and Twitter to offer social networking to millions of internet users, for bloggers to host the comments of others on their sites, for Amazon and Yelp to publish user reviews, and for services like Craigslist to publish all sorts of classified ads. But there is a tension between Section 230 and the First Amendment. While private companies can create rules to restrict speech on platforms under their control, government is prohibited from restricting most forms of speech, and cannot require tech companies to do so.
After its 25-year run, Section 230 is ready for reform. But what kind remains the question. “The debate about Section 230 shows that people of all political persuasions are unhappy with the status quo,” Facebook Chief Executive Mark Zuckerberg told Congress last October. “People want to know that companies are taking responsibility for combating harmful content — especially illegal activity — on their platforms.”
We agree. But social media can’t have it both ways. If the operators of social media sites have the right to regulate content on their sites, they should also be answerable for what they publish. Any other approach allows for abuse and is unfair.