Topline
The Supreme Court will consider to what extent tech companies can be held legally liable for the content published on their platforms, as the court announced Monday it will take up a case concerning whether Google was in the wrong for recommending YouTube videos that helped encourage ISIS recruitment, along with a separate case brought by Twitter over similar content.
Key Facts
The court agreed to take up Gonzalez v. Google, which was brought by the father of a woman killed in the 2015 terrorist attack in Paris and alleges Google's "recommend[ing] ISIS videos to users" was "critical to the growth and activity of ISIS" and that the company should be held legally accountable.
Social media companies have so far been shielded from legal liability for content that users publish on their platforms under Section 230 of the Communications Decency Act, which states no computer service provider "shall be treated as the publisher or speaker of any information" published by another content provider, meaning the platforms' users.
The case asks the Supreme Court whether Section 230's protections extend to targeted video recommendations on social media platforms, or only to the hosting of content published on those platforms.
Reynaldo Gonzalez, who brought the case, argued platforms' legal protections should be limited to "traditional editorial functions" like "whether to publish, withdraw, postpone or alter content," and should not extend to recommendations, while Google argues its recommendations are protected under Section 230.
District and appeals courts both previously sided with Google in the case, though other appeals courts have ruled that tech companies can be held liable for recommendations.
The Supreme Court also announced it will take up Twitter, Inc. v. Taamneh, a related case brought against Twitter, Facebook and YouTube that seeks to hold them liable for extremist content published on their platforms following a 2017 terrorist attack in Turkey; Twitter had asked the Supreme Court to take up the case if it also took up Gonzalez v. Google.
Crucial Quote
“Interactive computer services constantly direct such recommendations, in one form or another, at virtually every adult and child in the United States who uses social media,” Gonzalez’s attorneys wrote in his petition to the Supreme Court in the case. “Application of section 230 to such recommendations removes all civil liability incentives for interactive computer services to eschew recommending … harmful materials, and denies redress to victims who could have shown that those recommendations had caused their injuries, or the deaths of their loved ones.”
Chief Critic
If the court rules that YouTube recommendations can’t be shielded from legal liability, “section 230 would be a dead letter,” Google argued in a court filing to the Supreme Court. “This Court should not lightly adopt a reading of section 230 that would threaten the basic organizational decisions of the modern internet.” The company has not yet responded to a request for comment on the Supreme Court’s decision Monday.
Key Background
The Supreme Court decided to take up the case Monday after Justice Clarence Thomas previously suggested the court should weigh in on Section 230, saying in 2020 as part of a separate case that when a more “appropriate” one comes up, the justices “should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms.” YouTube has drawn widespread scrutiny for how its algorithm pushes videos on extremist or partisan topics to users, including misinformation, with a July 2021 Mozilla Foundation study finding that 70% of the objectionable videos participants flagged had been surfaced by the platform’s recommendation system. Current and former YouTube engineers told the Wall Street Journal in 2018 that while YouTube wasn’t consciously trying to recommend extremist content, the platform’s algorithm highlights videos that are “already drawing high traffic and keeping people on the site,” which tend to be “sensationalistic.”
Tangent
While unrelated to the specific complaints in Gonzalez v. Google, Republicans have also railed against Section 230 in recent years over what they perceive as tech companies’ “bias” against conservatives, and have called for the statute to be reformed to open tech companies up to more legal liability. Democrats have also called for reforms to the statute as a way to hold platforms accountable for misinformation and hate speech, which is closer to what the Supreme Court case aims to address, with President Joe Biden calling for the statute to be “revoked” in a 2020 interview with the New York Times.
Further Reading
Twitter, Google, Facebook Mostly Immune to ISIS Attack Lawsuits (Bloomberg)
Republicans and Democrats both want to repeal part of a digital content law, but experts say that will be extremely tough (Texas Tribune)
A guide for conceptualizing the debate over Section 230 (Brookings Institution)
Source: https://www.forbes.com/sites/alisondurkee/2022/10/03/supreme-court-to-consider-whether-tech-companies-like-google-facebook-can-be-held-liable-for-content-recommendations/