SCOTUS to hear content moderation cases that could change social media


The Supreme Court on Friday agreed to hear two cases that stand to have sweeping consequences for the First Amendment rights of online platforms and their ability to moderate content. 

The cases—NetChoice v. Paxton and Moody v. NetChoice—challenge the constitutionality of two separate laws passed in Florida and Texas, both of which sought to limit the type of content that online platforms could remove or downrank. In Florida, the law specifically prohibits these companies from “deplatforming” politicians or prioritizing or deprioritizing posts “by or about” political candidates, among other things. In Texas, the law bans platforms from moderating on the basis of “viewpoint.” 

Two tech industry groups, NetChoice and CCIA, challenged both laws in court, arguing that they violated the platforms’ own First Amendment rights to moderate content. But the lower courts were divided in their rulings. While the 11th Circuit blocked the Florida law from going into effect, the 5th Circuit upheld the Texas law, creating just the sort of circuit split that the Supreme Court is designed to resolve. 

Now, the Court is poised to do just that. 

“We are pleased the Supreme Court agreed to hear our landmark cases,” Chris Marchese, NetChoice’s litigation director, said in a statement. “Online services have a well-established First Amendment right to host, curate and share content as they see fit. The internet is a vital platform for free expression, and it must remain free from government censorship.”

There are two key questions the Court will have to answer: The first is whether the laws’ restrictions on content moderation comply with the First Amendment. The second is whether the First Amendment conflicts with provisions in both laws that require platforms to provide users with individualized explanations of their content moderation decisions. 

While this will be the first time the Court has issued a decision on these questions, it’s not the first time the justices have had to weigh in on these laws. After the 5th Circuit lifted the injunction on the Texas law, which would have allowed it to go into effect, NetChoice and CCIA filed an emergency application with the Supreme Court, asking it to take up the case on its shadow docket. In May 2022, the Supreme Court blocked the law temporarily in a 5-4 decision, with Justice Samuel Alito writing a dissent joined by Justices Neil Gorsuch and Clarence Thomas. In that dissent, Alito hinted that it wouldn’t be the last the Court heard of the case: “This application concerns issues of great importance that will plainly merit this Court’s review.” 

The Court has also heard other cases recently that deal with online platforms’ liability for harmful content. Earlier this year, in Gonzalez v. Google, the Court was asked to decide whether Section 230 protects platforms from liability for the recommendations that their algorithms serve to users. The Court ultimately vacated the lower court’s judgment and sent the case back, forgoing the question altogether. But even that case was only concerned with the way that online platforms themselves recommend content. It had little to do with the questions now before the Court regarding what limits the government can impose on online platforms when they’re moderating other people’s content.

This is not a trivial question: In recent years, bills seeking to prohibit content moderation in various ways have been introduced in dozens of states. The Court’s decision in these cases will determine whether any of those bills can survive. “These cases provide the Court the opportunity to clarify the constitutional limits on legislatures’ power to regulate social media,” Scott Wilkens, senior counsel at the Knight First Amendment Institute, said in a statement. “How the Court deals with the cases will have broad implications for speech online and for democracy.”


