Image Credits: Jerod Harris/Getty Images for Vox Media / Getty Images
11:56 AM PDT · September 13, 2025
California’s state legislature gave final approval early Saturday morning to a major AI safety bill setting new transparency requirements on large companies.
As described by its author, state senator Scott Wiener, SB 53 “requires large AI labs to be transparent about their safety protocols, creates whistleblower protections for [employees] at AI labs & creates a public cloud to expand compute access (CalCompute).”
The bill now goes to California Governor Gavin Newsom to sign or veto. He has not commented publicly on SB 53, but last year, he vetoed a more expansive safety bill also authored by Wiener, while signing narrower legislation targeting issues like deepfakes.
At the time, Newsom acknowledged the importance of “protecting the public from real threats posed by this technology,” but criticized Wiener’s previous bill for applying “stringent standards” to large models regardless of whether they were “deployed in high-risk environments, [involved] critical decision-making or the use of sensitive data.”
Wiener said the new bill was influenced by recommendations from a policy panel of AI experts that Newsom convened after his veto.
Politico also reports that SB 53 was recently amended so that companies developing “frontier” AI models while bringing in less than $500 million in annual revenue will only need to disclose high-level safety details, while companies making more than that will need to provide more detailed reports.
The bill has been opposed by a number of Silicon Valley companies, VC firms, and lobbying groups. In a recent letter to Newsom, OpenAI did not mention SB 53 specifically but argued that to avoid “duplication and inconsistencies,” companies should be considered compliant with statewide safety rules as long as they meet federal or European standards.
And Andreessen Horowitz’s head of AI policy and chief legal officer recently claimed that ”many of today’s state AI bills — like proposals in California and New York — risk” crossing a line by violating constitutional limits on how states can regulate interstate commerce.
a16z’s co-founders had previously pointed to tech regulation as one of the factors leading them to back Donald Trump’s bid for a second term. The Trump administration and its allies subsequently called for a 10-year ban on state AI regulation.
Anthropic, meanwhile, has come out in favor of SB 53.
“We have long said we would prefer a federal standard,” said Anthropic co-founder Jack Clark in a post. “But in the absence of that this creates a solid blueprint for AI governance that cannot be ignored.”
Anthony Ha is TechCrunch’s weekend editor. Previously, he worked as a tech reporter at Adweek, a senior editor at VentureBeat, a local government reporter at the Hollister Free Lance, and vice president of content at a VC firm. He lives in New York City.