GOP senator urges SCOTUS to rein in Big Tech's content censorship, says platforms' position defies 'logic'

Sen. Josh Hawley is weighing in on a landmark Supreme Court case that could reshape the legal landscape for Big Tech companies and how they censor user content.

FIRST ON FOX - Senator Josh Hawley, R-Mo., is urging the Supreme Court not to buy into arguments from Big Tech platforms that they should have First Amendment protections to censor user content while simultaneously demanding immunity from legal liability for content posted on their platforms. 

Next month, the Supreme Court will hear arguments in a set of cases that question whether state laws limiting Big Tech companies’ ability to moderate content on their platforms curb the companies’ First Amendment liberties.

The Missouri Republican filed an amicus brief in the cases Tuesday arguing that the platforms are attempting to have their cake and eat it, too: they want to keep the liability protections Congress granted them for content on their sites while simultaneously claiming an unfettered, First Amendment-protected right to censor that content.

The court "should not bless the platforms’ contradictory positions, much less constitutionalize them," Hawley argued, and that "doing so would effectively immunize the platforms from both civil liability in tort and regulatory oversight by legislators."

GOP AGS ASK SUPREME COURT TO PEEL BACK CONTENT MODERATION FROM BIG TECH IN LANDMARK FIRST AMENDMENT CASE

The cases before the high court stem from two separate laws passed in Florida and Texas that would require major tech companies like X, formerly Twitter, and Facebook to host third-party communications and prevent those businesses from blocking or removing users' posts based on political viewpoints. 

A federal appeals court had ruled for the tech industry in the Florida case, saying that, as private entities, those companies were "engaged in constitutionally protected expressive activity when they moderate and curate the content that they disseminate on their platforms." But the Fifth Circuit upheld the similar law in Texas, creating a circuit split ripe for the nine justices to take up. 

Hawley in his brief explains that in the 1990s, following the advent of the internet, Congress and the courts had to reconcile the new medium with the longstanding principle in American publication law that "individuals who play an active role in disseminating others’ speech are liable for any unlawful harm that speech causes."

The result was Section 230 of the Communications Decency Act, which broadly insulates platforms from civil liability for hosting user-generated content. 

"At the time, Section 230 was justified on the theory that platforms could not exercise publisher-level control over the speech generated by third-party users," says Hawley. 

"Despite decades arguing for this position, today the tech platforms take precisely the opposite line. They claim that their content hosting and curation decisions are in fact expressive—expressive enough that they enjoy First Amendment protection," the lawmaker’s brief states. 

JAN 6 RIOTERS, ABORTION, GUN RIGHTS: A LOOK AHEAD AT LANDMARK CASES SCOTUS WILL HEAR IN 2024

In an interview with Fox News Digital, Hawley charged that the social media giants "always have some excuse as to why the law doesn't apply to them."

"It doesn't matter that they've made exactly opposing arguments in court. They don't care about that. All they care about is preserving their ability to control speech and censor at-will, he said. 

The platforms told the Supreme Court that the state laws in Florida and Texas "openly abridge" their "First Amendment right to exercise editorial judgment over what content to disseminate on their websites via requirements that are speaker-based, content-based, and viewpoint-discriminatory."

But Hawley says the platforms’ argument "completely undercuts the logic of Section 230," which the platforms have long sought to keep in place despite bipartisan pressure to repeal some if not all of that statute. 

"Extending an historical blanket immunity to this sector will have real-world consequences. To invoke a frighteningly realistic hypothetical, nothing could stop a web platform’s algorithm from promoting content designed to addict and harm young people," Hawley writes in his brief. 

SUPREME COURT APPEARS READY TO REEL IN ADMINISTRATIVE STATE IN LANDMARK CHALLENGE FROM EAST COAST FISHERMEN

"Take, as an example, content promoting eating disorders (a shockingly common phenomenon on modern social media). Companies could choose to affirmatively undermine the mental and physical health of America’s youth, while enjoying the protections of Section 230. While teens starved and parents looked on, no private action would lie. And then, when the government stepped in, the platforms could simply invoke their First Amendment immunity. Promoting eating disorders could be, after all, an editorial choice," he argues. 

"Nestled in a comfortable fissure between legal doctrines, the platforms could look on as their algorithms—or affirmative curation decisions—devastated a generation," he added. 

The court will hear arguments in the cases, Moody v. NetChoice, LLC and NetChoice, LLC v. Paxton, on Monday, Feb. 26. 

Data & News supplied by www.cloudquote.io
Stock quotes supplied by Barchart
Quotes delayed at least 20 minutes.
By accessing this page, you agree to the following
Privacy Policy and Terms and Conditions.