California’s new AI safety law shows regulation and innovation don’t have to clash
SB 53, the AI safety and transparency bill that California Gov. Gavin Newsom signed into law this week, is proof that state regulation doesn’t have to hinder AI progress.
Rebecca Bellan
Published on October 5, 2025 · Updated October 5, 2025

Adam Billen of Encode AI on SB 53 and the Future of AI Legislation
So says Adam Billen, vice president of public policy at the youth-led advocacy group Encode AI, on today’s episode of Equity. “The reality is that policymakers themselves know that we have to do something, and they know from working on a million other issues that there is a way to pass legislation that genuinely does protect innovation — which I do care about — while making sure that these products are safe,” Billen told TechCrunch.
At its core, SB 53 is the first bill of its kind in the U.S. requiring large AI labs to be transparent about their safety and security protocols — specifically regarding how they prevent their models from being used for catastrophic risks, such as cyberattacks on critical infrastructure or the creation of bioweapons. The law also mandates that companies adhere to those protocols, with enforcement by California's Office of Emergency Services.
“Companies are already doing much of what we ask them to do in this bill,” Billen told TechCrunch. “They perform safety testing on their models. They release model cards. Are some companies starting to cut corners in certain areas? Yes. And that’s why bills like this are important.”
Billen also noted that some AI firms have policies allowing them to relax safety standards under competitive pressure. OpenAI, for example, has publicly stated that it may “adjust” its safety requirements if a rival AI lab releases a high-risk system without similar safeguards. Billen argues that legislation can help hold companies to their existing safety promises, preventing them from cutting corners under competitive or financial stress.
While public opposition to SB 53 was muted compared to its predecessor, SB 1047 — which Governor Newsom vetoed last year — the rhetoric in Silicon Valley and among most AI labs remains that nearly any AI regulation is an obstacle to progress and will ultimately hinder the U.S. in its race against China. That’s why companies like Meta, venture capital firms like Andreessen Horowitz, and influential figures such as OpenAI president Greg Brockman are collectively pouring hundreds of millions into super PACs to back pro-AI politicians in state elections. It’s also why these same forces earlier this year pushed for an AI moratorium that would have barred states from regulating AI for ten years.
Encode AI led a coalition of more than 200 organizations to help defeat that proposal, but Billen says the fight isn’t over. Senator Ted Cruz, who championed the moratorium, is now pursuing a new strategy to achieve the same goal — federal preemption of state laws. In September, Cruz introduced the SANDBOX Act, which would allow AI companies to apply for waivers to temporarily bypass certain federal regulations for up to ten years. Billen also anticipates a forthcoming bill establishing a federal AI standard that would be pitched as a “middle-ground” solution but would, in reality, override state-level legislation.
He warned that narrowly scoped federal AI laws could “delete federalism for the most important technology of our time.”
“If you told me SB 53 was the bill that would replace all the state bills on everything related to AI and all of the potential risks, I would tell you that’s probably not a very good idea, and that this bill is designed for a particular subset of issues,” Billen said.
Adam Billen, vice president of public policy, Encode AI
Image Credits: Encode AI
While he agrees that the AI race with China matters and that policymakers must enact regulation supporting American progress, Billen says killing state bills — which mainly focus on deepfakes, transparency, algorithmic discrimination, children’s safety, and government use of AI — is not the right approach.
“Are bills like SB 53 the thing that will stop us from beating China? No,” he said. “I think it’s genuinely intellectually dishonest to claim that this is what will stop us in the race.”
He added: “If what you care about is beating China in the AI race — and I do care about that — then what you would push for are things like export controls in Congress. You’d make sure American companies have the chips. But that’s not what the industry is pushing for.”
Legislative efforts like the Chip Security Act aim to prevent the diversion of advanced AI chips to China through export controls and tracking mechanisms, while the existing CHIPS and Science Act seeks to boost domestic chip production. However, some major tech companies, including OpenAI and Nvidia, have expressed reluctance or opposition to parts of these efforts, citing concerns about effectiveness, competitiveness, and security risks.
Nvidia has its own reasons — it has a strong financial incentive to continue selling chips to China, which has historically represented a significant portion of its global revenue. Billen speculates that OpenAI might be holding back on chip export advocacy to stay in the good graces of key suppliers like Nvidia.
The Trump administration has also sent mixed messages. Three months after expanding an export ban on advanced AI chips to China in April 2025, the administration reversed course, allowing Nvidia and AMD to sell certain chips to China in exchange for 15% of the revenue.
“You see lawmakers on Capitol Hill moving toward bills like the Chip Security Act that would impose export controls on China,” Billen said. “In the meantime, there’s going to continue to be this propping up of the narrative to kill state bills that are actually quite light-touch.”
Billen added that SB 53 is an example of democracy in action — of industry and policymakers working together to produce a version of a bill that everyone can live with. It’s “very ugly and messy,” he said, but “that process of democracy and federalism is the entire foundation of our country and our economic system, and I hope that we continue to do it successfully.”
“I think SB 53 is one of the best proof points that that can still work,” he said.