Are platforms responsible for user content? The Supreme Court can reset the rules.

Much of the social media ecosystem — love it or hate it — was made possible by a 1996 federal law.

It’s called the Communications Decency Act. Section 230 of this law protects online publishers like Facebook, Twitter and YouTube from liability for much of the content posted on their platforms.

This week, the Supreme Court announced that it would hear challenges to this law. One of the cases, Gonzalez v. Google LLC, asks whether Section 230 protects platforms that use algorithms to recommend content to users.

Marketplace’s Meghan McCarty Carino spoke with Eric Goldman, a law professor at Santa Clara University School of Law. He said there were several ways the decision could go.

Below is an edited transcript of their conversation.

Eric Goldman: Algorithmic recommendations are a fancy way of saying that services prioritize and curate content for their audiences. The way the plaintiffs framed the Gonzalez v. Google case invites the court to do one of three things. It could say that algorithmic recommendations are like other services provided by social media platforms and other publishers, and are therefore covered by Section 230. It could say that none of this is covered by Section 230. Or, the third option, it could say that some of what social media services do qualifies for Section 230, but algorithmic recommendations do not.

Meghan McCarty Carino: And what is at stake here? How could a decision change the way we interact with the internet?

Goldman: One possibility is that the Supreme Court would say that not all recommendations made by social media services fall under Section 230. It might say that services can host content without fear of liability for what users say, but that they can't promote or amplify that content. The internet would then look a lot like Google Drive or Dropbox. Someone who wants to share content online could upload it, and there the services would stop. At that point, individuals would be responsible for finding their own audiences.

McCarty Carino: And what could this mean for the business of tech companies? I mean, many have really relied on algorithmic recommendations for their business model.

Goldman: If the Supreme Court does circumscribe Section 230, I think what will likely happen is that some companies will migrate to more professionally produced content. Instead of letting users talk to each other, the services will choose a small number of voices, pay them to create content, and then share it with their audiences. The most likely way to pay for this is with paywalls, and that will exacerbate the digital divides that already exist.

Eric Goldman follows the hundreds of cases that involve Section 230 in one way or another, and you can read his thoughts on many of them on his blog.

Now, I said the Supreme Court is considering multiple challenges to Section 230. Another case, Twitter v. Taamneh, asks whether online platforms should be held responsible for failing to detect and prevent terrorist groups from using their services.

Twitter is appealing a ruling from the 9th U.S. Circuit Court of Appeals that said it could be held liable.

Professor Goldman pointed out that Section 230's protections already exclude matters of federal criminal law, which could conceivably include terrorism.