In the run-up to last week’s U.S. Supreme Court hearing in the closely watched case of Gonzalez v. Google, we were told that the court’s decision had the potential to wreak havoc on the internet, as the justices were called on to decide how online platforms are supposed to handle speech and content moderation.
The case focuses on Section 230 of the Communications Decency Act, which provides protection from liability to tech platforms for most content contributed to their sites by third parties. That means that when users post defamatory tweets or inciting comments on Twitter, Facebook, YouTube and similar platforms, the platforms themselves aren’t deemed legally responsible for that content. But what happens when the platform’s own algorithm promotes offensive tweets, comments or videos? Does the Section 230 protection continue, or does it disappear?
The Gonzalez case was brought by the family of a 23-year-old American college student who was killed in a Paris restaurant attack by Islamic State followers. The family argues that YouTube bears some responsibility for her death, since YouTube’s “Up Next” algorithm promoted radicalizing material to viewers who had watched similar content, a process the family says further influenced those viewers to engage in terror attacks.
The Gonzalez theory has appeal. It makes sense to distinguish material that a platform merely hosts from material that the platform itself promotes. But if platforms were held liable for algorithmically “recommended” content, a process that simply surfaces links based on what a viewer has already chosen to watch, the ramifications could be monumental. It would change the way the internet operates and would likely cause platforms to abandon any systems that recommend or prioritize material based on a user’s queries or viewing history. Instead, as one of Google’s lawyers asserted, the internet would be reduced to a useless jumble.
In the nearly three hours of argument on the case, we saw very little of the “activist” court so many have complained about. None of the justices seemed eager to take this one on. Instead, we heard practical, thoughtful and challenging questions, along with palpable frustration at being called upon to decide an issue that requires legislative clarification and guidance.
Although we sympathize with the Gonzalez family and others who have been so brutally harmed by horrific internet posts, it is not the responsibility of the courts to rewrite legislation. Section 230, and the 26 words at the heart of the debate in the Gonzalez case, was enacted by Congress in 1996, when the internet was in its infancy. A lot has changed since then, including the advent of the algorithms at issue in the case and the developing artificial intelligence that shapes our internet interactions and our daily lives.
Congress needs to address these issues in a comprehensive manner, developing new laws, standards and guidance to deal with today’s ever-expanding technological development and use. That is the job of the legislature, not the courts. ■