Both Google and Twitter are appearing before the Supreme Court this week to defend Section 230 protections.
- Two cases, Gonzalez v. Google and Twitter v. Taamneh, are scheduled for oral argument before the Supreme Court on Tuesday, February 21, and Wednesday, February 22. Both concern Section 230 protections, which grant social-media networks immunity from liability for illicit or harmful content posted by users.
- Google is defending itself against the family of Nohemi Gonzalez, who was killed in a 2015 ISIS attack. The family accuses Google's algorithms of recommending YouTube videos that encouraged terrorism and radicalized her attackers.
- Twitter faces a similar allegation from the family of Nawras Alassaf, who accuse the company of aiding and abetting individuals radicalized by ISIS content.
- The court agreed to hear both cases in October and could issue a ruling by the summer of 2023, potentially weakening liability protections for social media, SCOTUSblog reports.
Why It’s Important
The court's decision will have a significant impact on the internet going forward. It could make social-media networks liable under the Anti-Terrorism Act and expose them to additional lawsuits, with further far-reaching consequences.
Both Republicans and Democrats agree that Section 230 needs to change, but the parties differ on how: Republicans accuse the provision of enabling censorship on platforms, while Democrats accuse corporations of allowing harmful misinformation to spread.
As SCOTUSblog notes, striking down Section 230 could place social-media networks in an uncomfortable position, and these terrorism cases are difficult ones on which to set precedent. Additionally, the court is considering two other cases, NetChoice v. Paxton and Moody v. NetChoice, that could heavily restrict social-media companies' ability to take material down.
Google and Twitter could thus be caught between precedents that punish them both for taking down too much content and for not taking down enough.
“YouTube abhors terrorism and, over the years, has taken increasingly effective actions to remove terrorist and other potentially harmful content. But Section 230 forecloses petitioners’ claims. YouTube provides a website that publishes third-party videos, using algorithms to sort and list related videos that may interest viewers so that they do not confront a morass of billions of unsorted videos,” says Google.
“The combined effect of the Ninth Circuit’s errors creates a statute of impossible breadth. Any provider of widely available services that can be exploited by terrorists risks treble damages for unrelated terrorist attacks if a jury later determines that it should have done more to root out such exploitation,” says Twitter.