Should YouTube, Twitter Be More Responsible For Dangerous Content? Supreme Court Considers Tech Critics

Topline

The Supreme Court this week considers how responsible major social media platforms—especially Twitter, Facebook and YouTube—are for their most dangerous posts, challenging broad legal protections that tech companies say are necessary to keep the Internet from turning into a bleak wasteland, but that critics argue go too far.

Key Facts

The Supreme Court will hear oral arguments Monday in Gonzalez v. Google, in which family members of a victim of the 2015 Paris terrorist attacks sued Google, alleging YouTube (which Google owns) should be held liable after its algorithm recommended ISIS recruitment videos to potential supporters, and on Wednesday in Twitter v. Taamneh, which takes similar aim at social media companies over their role in a 2017 terrorist attack in Turkey.

The first case asks whether YouTube can be held liable for the recommendations it makes under Section 230 of the Communications Decency Act of 1996, which shields social media platforms and other Internet companies from liability by providing that they are not legally responsible for third-party content posted on their platforms.

Tech platforms including Google, Meta, Twitter, Microsoft, Yelp, Reddit, Craigslist and Wikipedia have argued in court filings that a ruling holding YouTube liable would have disastrous consequences, forcing online platforms either to broadly restrict any content that could conceivably be considered legally objectionable, or to take the opposite approach and leave everything up with no filtering of obviously problematic content.

First Amendment advocacy groups including the ACLU and the Knight Foundation have warned such restrictions could chill free speech, and Google argued that if tech platforms are forced to abandon recommendation algorithms, the Internet could turn into a “disorganized mess and a litigation minefield.”

The Twitter case, which also involves Facebook and Google, doesn’t concern Section 230; instead, it asks whether social media companies can be held responsible under the Anti-Terrorism Act, which allows lawsuits against anyone who “aids and abets” an act of international terrorism.

After a lower court found that merely knowing terrorists were among a company’s users could be grounds for a lawsuit, Twitter argued a ruling against it would result in “particularly broad liability” for social media companies. Facebook and Google suggested that liability could extend to other organizations that must work, even indirectly, with terrorists, including humanitarian groups operating on the ground in countries like Syria.

Chief Critic

In a brief to the court, the plaintiffs who sued Google rejected the tech companies’ dire predictions as overbroad and “largely unrelated to the specific issues” in the case. “Predictions that a particular decision of this Court will have dire consequences are easy to make, but often difficult to evaluate,” the petitioners argued, noting that while social media companies still have other legal safeguards, like the First Amendment, to protect them, there’s “no denying that the materials being promoted on social media sites have in fact caused serious harm.”

Contra

The Biden Administration has argued the Supreme Court should narrow the scope of Section 230 to make it easier to sue social media platforms, warning against an “overly broad reading” of the statute that could “undermine the importance of other federal statutes.” The White House argued Section 230 does not shield YouTube from lawsuits over the harmful recommendations its algorithm makes, since those recommendations are created by the company rather than by third parties. Supporters of the plaintiffs have also suggested a ruling against Google could push social media platforms to clean up algorithms that have produced harmful recommendations for minors, with the Electronic Privacy Information Center arguing companies take advantage of the statute’s broad interpretation and “use Section 230 as a shield instead of making their products safer.”

Crucial Quote

“Denying Section 230(c)(1)’s protection to YouTube’s recommendation display could have devastating spillover effects,” Google argued in its brief to the court, warning that gutting Section 230 “would upend the internet and perversely encourage both wide ranging suppression of speech and the proliferation of more offensive speech.”

What To Watch For

Rulings in the two cases are expected by the time the Supreme Court’s term wraps up in late June or early July. It’s also possible the court won’t issue a sweeping ruling on when social media companies can be held liable under Section 230: Google argued that if the court throws out the Twitter case on the grounds that the victim’s family didn’t have grounds to sue, it could dismiss the Google case for the same reason without reaching Section 230 at all.

Key Background

The Google case reaches the Supreme Court after both a federal district court and an appeals court sided with the platform, ruling it’s protected by Section 230 and can’t be sued. The Ninth Circuit Court of Appeals heard the case together with the Twitter case, but ruled against the social media platforms in the latter, holding that Twitter, Facebook and Google could all be held liable under anti-terrorism laws even as it separately upheld Section 230’s protections. The cases arrive as Big Tech’s growing power and the platforms’ failure to successfully moderate harmful content have drawn fire from both sides of the political aisle, and the Supreme Court took them up after conservative-leaning Justice Clarence Thomas suggested the court should consider the issue of Section 230.

Tangent

Republican lawmakers have taken particular aim at Section 230 and sought to hold social media companies more legally accountable, accusing the platforms of chilling conservative speech. Sen. Ted Cruz (R-Texas) led 11 GOP lawmakers in filing a brief urging the Supreme Court to narrow the statute’s scope, arguing that under its broad interpretation, social media companies have not been “shy about restricting access and removing content based on the politics of the speaker.”

Further Reading

Supreme Court To Consider Whether Tech Companies—Like Google, Twitter—Can Be Held Liable For Content Recommendations (Forbes)

Everything you need to know about Section 230 (The Verge)

These 26 words ‘created the internet.’ Now the Supreme Court may be coming for them (CNN)

Source: https://www.forbes.com/sites/alisondurkee/2023/02/20/should-youtube-twitter-be-more-responsible-for-dangerous-content-supreme-court-considers-tech-critics/