snitzoid

The Supremes to examine whether Google should have any liability for what it....?

I say let them do and say whatever the hell they want, no matter who gets thrown under the bus. That's how it works at the Spritzler Report!

ISIS, YouTube and Section 230 at the Supreme Court

The Justices take up a case about Google’s immunity from liability.

By The Editorial Board, WSJ

Updated Feb. 13, 2023 6:58 pm ET

The Supreme Court on Feb. 21 is taking up Gonzalez v. Google, a case on the Section 230 immunity enjoyed by internet platforms. This dispute probably won’t produce the blockbuster ruling that critics of Silicon Valley want, but it might illustrate why the questions surrounding Section 230, which will keep coming, are best answered by legislators, not the judiciary.

Under Section 230, internet platforms can’t be “treated as the publisher or speaker” of information provided by their users. If a restaurant is defamed on Yelp or Facebook or Twitter, the party who is legally liable is the author of the malicious review, not the website. This makes sense, given that such services are open bulletin boards where enforcement of standards is done by moderators after the fact.

But are internet sites liable for the algorithms they use to sort and present content? The petitioners in Gonzalez v. Google say yes. The case was brought by the family and estate of Nohemi Gonzalez, a 23-year-old American student who was killed in a 2015 ISIS attack in Paris. Their argument is that YouTube, which is part of Google, aided and abetted the terrorist group, because its algorithms “recommended ISIS videos to users,” which helped spread its message.

The family says YouTube’s recommendation engine isn’t covered by Section 230. There are a few problems, though. Google’s brief says the petitioners “did not allege that any Paris attacker saw any ISIS video” or “that YouTube played any role in bringing about the Paris attack.” A related case the Justices will hear Feb. 22, Twitter v. Taamneh, asks whether aiding-and-abetting laws even cover “generic, widely available services” that aren’t connected to a specific terrorist act.

If the internet is going to be usable, platforms need some way to sift the deluge created by the online masses. About 720,000 hours of video are posted to YouTube each day. Its algorithms collate relevant videos based on “thousands of inputs, including factors like a viewer’s YouTube search and watch history, location, and time of day,” Google says. The company says this conduct is akin to publishing, and Section 230 says YouTube isn’t legally liable as the publisher of user videos.

The petitioners try to distinguish YouTube from search engines such as Google, but the effort is unconvincing because the functionality is similar. That isn’t to say Section 230 always reaches as far as the tech companies want. Courts are probably stretching the law if they apply its immunity to content that a platform knew was illegal or had a direct hand in creating.

One reason for these complexities is that Section 230 passed in 1996, two years before Google was founded, three years before the word “blog” was invented, and when Mark Zuckerberg was 11 years old. That’s why it seems to talk past today’s controversies. Do social-media sites have immunity for fact-checks they append to disputed posts? What if search engines use language models to directly answer user queries, with text synthesized from the web?

It’s hard to see how the internet as we know it would function without the core liability protection of Section 230, and any GOP attempt to create a Fairness Doctrine to monitor speech on the web would be a grave mistake.

But lawmakers could mandate more transparency about how moderation policies are enforced. They could set rules to stop government officials from secretly jawboning platforms into censorship. They could also clarify how a law from the AOL era applies to an AI age that was unimaginable in 1996.
