Is YouTube A Murderer?

It seems like only yesterday that Instagram filled my feed with young influencers and their cat-eye and meal-prep tutorials. Today Instagram sends me bleary-eyed moms wrestling with "childproofing windows" and "staying up at night with baby cold symptoms." The algorithm never missed a beat; I truly am a new mother. But what happens when Instagram's algorithm turns on me, spreads defamatory lies about my family, steers my kids toward pro-anorexia content, or worse?

In Gonzalez v. Google, heard by the Supreme Court on Tuesday, February 21, the question is whether a website (YouTube, Facebook, Twitter, etc.) can be sued over its recommendation features, such as YouTube's "Up Next" function. The family of a woman killed in the 2015 Paris terrorist attacks says YouTube's recommendations aided terrorism; in other words, that YouTube shares responsibility for her murder. But before the family can even make that case (which would likely fail), YouTube argues that federal law bars the suit from proceeding at all.

YouTube is right.

The law, Section 230 of the Communications Decency Act, says that websites cannot be sued as "publishers" of information provided by users. The goal, when it was adopted in 1996, was to let the internet flourish. If websites could be sued every time someone uploaded false, illegal, or obscene material, websites (except perhaps the very richest) would almost certainly stop hosting user content at all. Goodbye TripAdvisor, Truth Social, Netflix, Rumble, Yelp, and the rest.

Congress locked out the plaintiffs' lawyers, at least here, in exchange for a staggering amount of content: "Users upload more than 500 hours of video to YouTube every minute," according to YouTube's brief.

Of course, the victim's family isn't trying to wipe out the internet. The family agrees that Section 230 protects YouTube when it merely hosts bad videos, but argues that the law does not protect YouTube when it takes on a more active role, such as generating the "Up Next" queue or thumbnails of uploaded content. At argument, the family's lawyer said YouTube essentially creates and distributes a catalog of information. That catalog, the family contends, is fair game for a lawsuit under current law.

It was a complicated argument that seemed to lose the justices, perhaps because a YouTube thumbnail is so far removed from advocating terrorism that the whole exercise felt futile.

But the family's distinction would not save the internet as we know it. Any good website must organize and present information in a way users can digest. Even an innocuous sorting mechanism, such as newest or most popular videos first, involves choices by the website. If every algorithmic decision exposed a website to a lawsuit, the tech giants would face too much risk in allowing third-party content in the first place. Should the Supreme Court usher in that world?

Justice Elena Kagan warned about this. She said she could "imagine a world where none of this survived." In other words, lawsuits galore! Maybe that's a good thing, because "every other industry has to internalize the costs" of its conduct. But maybe not; these suits could crash the internet. "We really don't know about these things. You know, these are not, like, the nine greatest experts on the internet," Justice Kagan joked.

Maybe a conservative will shrug and say Big Tech's algorithms are so slanted that endless litigation could right the ship. After all, Google employees direct 88% of their political donations to Democrats, Netflix employees 98%, and Facebook employees 77%. On that view, any damage done to Big Tech and its algorithms will hit the left the hardest.

But it is far from clear that inviting more censorship, which is what this lawsuit would do, is the best way forward.

Conservatives generally argue that Big Tech's censorship is already excessive. Twitter and Facebook blocked newsworthy information about possible corruption by then-candidate Joe Biden. YouTube has pulled countless COVID-19 discussions, including one featuring Governor Ron DeSantis. And it recently removed a Project Veritas video raising serious legal and ethical questions about Pfizer.

A ruling against YouTube could make that problem worse. As websites feel pressure to "remove third-party content that could lead to litigation," they will be less likely to allow "political (including conservative) commentary about current events," YouTube's brief argues. The warning may be self-serving, but it is grounded in practical reality.

Ultimately, the Gonzalez case raises the question of where to draw the line. In some cases, an algorithm may show Big Tech acting with real culpability; in others, it simply helps users post and find content. That line will be hard to draw, and the 1996 law, written before today's algorithms, is not up to the task. But the line-drawing belongs with elected officials, not unelected judges. Fortunately, things seem to be moving in that direction.

May Mailman is a senior legal fellow at the Independent Women's Law Center.

