Sep 26, 2025, 12:00 AM

California targets social media platforms for bias and threats liability

Highlights
  • California lawmakers have passed SB 771, a bill that would hold large social media platforms liable when their algorithms relay user content that violates state civil rights laws.
  • The measure would impose substantial civil penalties for violations, particularly those involving hate crimes and discriminatory conduct.
  • This legislation emphasizes platform responsibility in managing algorithms to mitigate harmful content while aiming to protect civil rights.
Story

California lawmakers have passed a significant measure, SB 771, which now awaits Governor Gavin Newsom's approval. The bill would create a new statute imposing liability on social media platforms that generate more than $100 million in annual revenue. If such a platform violates specified sections of the California Civil Code through algorithms that propagate user content, or aids and abets discriminatory conduct, it could face substantial civil penalties: up to $1 million for knowing violations and $500,000 for reckless violations, with the amounts doubled when the plaintiff is a minor.

The bill reflects growing concern about online safety, particularly for historically marginalized communities. Lawmakers cited an alarming rise in hate crimes and bias incidents, including a 31 percent increase in anti-immigrant hate incidents and a 52.9 percent rise in anti-Jewish bias incidents over the past year, through 2023. The legislation seeks to make clear that while it does not regulate free speech, social media platforms cannot knowingly facilitate content that breaches state civil rights laws.

A key feature of SB 771 is its focus on platform algorithms. Under the new statute, deploying an algorithm to relay content independently constitutes an action by the platform, irrespective of the content being shared. Lawmakers are thus placing greater responsibility on social media companies to monitor and manage how their algorithms behave, on the premise that the companies must understand how these systems operate. A broader consequence could be that platforms adopt more precautionary content filtering to avoid legal exposure, potentially removing a wider range of material that appears problematic.
By clarifying the scope of platform liability, the California bill would set a precedent for future debates over tech companies' responsibility for user-generated content and safety on their platforms.
