President Donald Trump signed an executive order Thursday evening that directs federal agencies to take aim at state AI laws, arguing startups need relief from a "patchwork" of rules. But legal experts and startups say the order could prolong uncertainty, sparking court battles that leave young companies navigating shifting state requirements while waiting to see if Congress can agree on a single national framework.
The order, titled "Ensuring a National Policy Framework for Artificial Intelligence," directs the Department of Justice to set up a task force within 30 days to challenge certain state laws on the grounds that AI is interstate commerce and should be regulated federally. It gives the Commerce Department 90 days to compile a list of "onerous" state AI laws, an assessment that could affect states' eligibility for federal funds, including broadband grants.
It also asks the Federal Trade Commission and Federal Communications Commission to explore federal standards that could preempt state rules and instructs the administration to work with Congress on a single AI law.
The order lands amid a broader push to rein in state-by-state AI rules after efforts in Congress to pause state regulation stalled. Lawmakers in both parties have argued that without a federal standard, blocking states from acting could leave consumers exposed and companies largely unchecked.
"This David Sacks-led executive order is a gift for Silicon Valley oligarchs who are using their power in Washington to shield themselves and their companies from accountability," said Michael Kleinman, Head of U.S. Policy at the Future of Life Institute, which focuses on reducing extreme risks from transformative technologies, in a statement.
Sacks, Trump's AI and crypto policy czar, has been a leading voice behind the administration's AI preemption push.
Even supporters of a national framework concede the order doesn't create one. With state laws still enforceable unless courts block them or states pause enforcement, startups could face an extended transition period.
Sean Fitzpatrick, CEO of LexisNexis North America, U.K., and Ireland, tells TechCrunch that states will defend their consumer protection authority in court, with cases likely escalating to the Supreme Court.
While supporters argue the order could reduce uncertainty by centralizing the fight over AI regulation in Washington, critics say the legal battles will create immediate headwinds for startups navigating conflicting state and federal demands.
"Because startups are prioritizing innovation, they typically do not have…robust regulatory governance programs until they reach a scale that requires a program," Hart Brown, lead author of Oklahoma Gov. Kevin Stitt's Task Force on AI and Emerging Technology recommendations, told TechCrunch. "These programs can be costly and time-consuming to meet a very dynamic regulatory environment."
Arul Nigam, co-founder at Circuit Breaker Labs, a startup that performs red-teaming for conversational and mental health AI chatbots, echoed those concerns.
"There's uncertainty in terms of, do [AI companion and chatbot companies] have to self-regulate?" Nigam told TechCrunch, noting the patchwork of state AI laws does hurt smaller startups in his field. "Are there open-source standards they should adhere to? Should they continue building?"
He added that he is hopeful Congress could now move more quickly to pass a better federal framework.
Andrew Gamino-Cheong, CTO and co-founder of AI governance company Trustible, told TechCrunch the EO will backfire on AI innovation and pro-AI goals: "Big Tech and the large AI startups have the funds to hire lawyers to help them figure out what to do, or they can simply hedge their bets. The uncertainty does hurt startups the most, especially those that can't get billions of funding almost at will," he said.
He added that legal ambiguity makes it harder to sell to risk-sensitive customers like legal teams, financial firms, and healthcare organizations, increasing sales cycles, system work, and security costs. "Even the perception that AI is unregulated will reduce trust in AI," which is already low and threatens adoption, Gamino-Cheong said.
Gary Kibel, a partner at Davis + Gilbert, said businesses would welcome a single national standard, but "an executive order is not necessarily the right vehicle to override laws that states have duly enacted." He warned that the current uncertainty leaves open two extremes: highly restrictive rules or no action at all, either creating a "wild west" that favors big tech's ability to absorb risk and wait things out.
Morgan Reed, president of The App Association, meanwhile, urged Congress to quickly enact a "comprehensive, targeted, and risk-based national AI framework. We can't have a patchwork of state AI laws, and a lengthy court fight over the constitutionality of an Executive Order isn't any better."