By David G. Bjornstrom

Will the Supreme Court Hold Google Accountable?

October 28, 2022
Column: Catching Air
The Supreme Court has agreed to hear a new case, Gonzalez v. Google, seeking to hold Google and its subsidiary, YouTube, legally responsible for injuries to victims of terrorism promoted on their sites. This case has implications far beyond terrorism, potentially making these companies responsible whenever their web search algorithms steer viewers to dangerous content.
A ruling in this case could also affect the other big internet platforms such as Twitter and Meta (formerly Facebook) as they increasingly influence the flow of information to the public.
Google is not the neutral web platform it pretends to be. Internet search results are driven by complicated algorithms designed to help users find what they want, but those results can be “moderated” by the company to promote or restrict what the user sees.

Despite rather noble-sounding policies such as “no dangerous content or hate speech,” those policies may be applied unevenly, and their implementation may be affected by the personal biases of company management and programmers.

Web search algorithms also take into account the individual user’s personal search history, so two people entering the same search query will not necessarily get the same results. In that way, Google and YouTube target users with information and videos according to their individual profiles.

Why this case, Gonzalez v. Google?

The Google case stems from a lawsuit by the parents of an American woman killed by ISIS terrorists in Paris. It alleges that Google (through its YouTube service) aided and abetted potential terrorists, in violation of the Anti-Terrorism Act, by directing web searches to ISIS recruitment videos. Google allegedly did this through automated search algorithms that recommend or restrict content to amplify a user’s existing point of view. According to the lawsuit, young men with an inclination toward terrorism were knowingly directed by YouTube to the ISIS sites.
 
Some argue that it is not fair to blame Google and YouTube for terrorist murders when all they did was create a system to direct viewers to whatever interests them. But Google should have known it was promoting terrorism when its web search algorithms sent susceptible users to ISIS recruiting sites. That is like telling kids where to get illegal guns after they express an interest in school shootings.

What is at stake in this case?

The Supreme Court will be limiting its inquiry to an abstract legal issue, but one with far-reaching consequences.
The issue is whether Google and YouTube should be categorically exempt from liability by virtue of their status as internet “platforms” that simply publish other people’s content, like a digital bulletin board, or whether they should be legally responsible as directors of information when they actively steer users to specific content based on individual user profiles.
If Google loses at the Supreme Court, it could open the floodgates to other cases. How about liability for school shootings when YouTube search algorithms refer disturbed young men to sites that glorify senseless violence? Or for teen suicides when the algorithms send young people suffering from depression to suicide “how-to” videos?

Why do Google and YouTube do this?

Google and YouTube have a huge financial incentive to direct user web searches. They make most of their money selling ad space, and advertisers pay based on how often a site is viewed. Unfortunately, violence and extremism sell, so Google and YouTube make money directing viewers to those sites.

What about Google’s and YouTube’s right to free speech?

Free speech has its limits and one of those limits is that it must not incite people to likely and imminent crime or violence such as terrorism or human trafficking.
Nor does free speech insulate a business from liability for intentional defamation, fraud or false advertising. Is Google engaged in false advertising when it promotes Covid vaccines while censoring respected medical professionals warning that the vaccines are not so “safe and effective”?

Section 230

Google argues for a special exemption from the normal rules of liability since it is an interactive social media platform under section 230 of the Communications Decency Act. Section 230 was enacted by Congress to encourage free speech by protecting internet platforms from legal liability when they act as mere publishers or distributors of other people’s content. The question, however, is whether Google is really just a web host conduit for other people’s information or whether it is actually conveying its own message when it makes “targeted recommendations.”
Section 230 allows digital publishers to exercise traditional editorial functions, like deciding whether to display or withdraw content created by others. But Google’s business model goes far beyond traditional editorial functions when it refers content to users based on their individual profiles.
Google also argues that its modern search engine, unlike the old internet bulletin boards and chat rooms, must necessarily pick and choose, analyze, search, organize and translate content in order to be useful to users, filtering out irrelevant or unwanted information. If that is really true, however, the logical conclusion might be that the industry has simply outgrown section 230 and can no longer claim its benefits.

What next?

This case is just the tip of the legal iceberg as the Court appears poised to review other broad limits on big tech website practices, including laws in Florida and Texas seeking to prevent Google, Twitter, Facebook and YouTube from censoring political viewpoints.
Justice Clarence Thomas has suggested a narrow reading of section 230 to pare back the sweeping immunity that other courts have erroneously read into that section. In light of their dominant market position, he has also suggested treating the big internet platforms as public utilities, like the phone company or a digital public square. As public utilities, they would be insulated from liability only so long as they serve all members of the public fairly as an open platform for legitimate free speech.
What is the right balance? Should Google and YouTube be required to exercise more control over their sites, like blocking terrorist propaganda, or should they be required to exercise less control, forced to keep their sites open to all viewpoints? Perhaps there is a middle ground, avoiding unfair censorship while also refraining from actively promoting violence, intentional defamation, fraud and false advertising.
These are big issues for the future of the internet with enormous potential liability risk for the big internet platforms. Depending on the Court’s decision, these companies may need to change their business models.

David G. Bjornstrom is a member of the U.S. Supreme Court bar and retired California attorney at law with 38 years specializing in business, estate and...
