Judge Denies Elon Musk’s Legal Bid To Stop California Law Over Social Media Moderation

Elon Musk’s X, formerly known as Twitter, lost a bid to temporarily halt a California law demanding social media platforms share their terms of service and provide semiannual reports to the state government over how content is moderated.
After the passage of AB 587, a law requiring social media companies to disclose their terms of service and submit reports on how their content moderation addresses hate speech, racism, extremism, disinformation, and harassment, X sued California Attorney General Rob Bonta (D). The company claimed that the law violated the First Amendment by compelling the moderation of speech.
“The legislative record is crystal clear that one of the main purposes of AB 587 — if not the main purpose — is to pressure social media companies to eliminate or minimize content that the government has deemed objectionable,” X’s legal complaint states, according to The Hollywood Reporter.
However, U.S. District Judge William Shubb on Thursday denied the company's motion to block the law, finding that its reporting requirements were not "unjustified or unduly burdensome within the context of First Amendment law," according to The Hollywood Reporter. The judge called the state's mandates "uncontroversial," noting they require only public disclosure of existing content moderation practices.
“The required disclosures are also uncontroversial,” Shubb wrote in his ruling. “The mere fact that the reports may be ‘tied in some way to a controversial issue’ does not make the reports themselves controversial.”
Shubb also rejected arguments that the law is preempted by Section 230 of the Communications Decency Act, which has long shielded social media platforms from liability for content posted by third parties.
“AB 587 only contemplates liability for failing to make the required disclosures about a company’s terms of service and statistics about content moderation activities, or materially omitting or misrepresenting the required information,” the judge added. “It does not provide for any potential liability stemming from a company’s content moderation activities per se.”