The fixer’s dilemma: Chris Lehane and OpenAI’s impossible mission


Chris Lehane is one of the best in the business at making bad news disappear. Al Gore's press secretary during the Clinton years, Airbnb's chief crisis manager through every regulatory nightmare from here to Brussels – Lehane knows how to spin. Now he's two years into what might be his most impossible gig yet: as OpenAI's VP of global policy, his job is to convince the world that OpenAI genuinely gives a damn about democratizing artificial intelligence while the company increasingly behaves like, well, every other tech giant that's ever claimed to be different.

I had 20 minutes with him on stage at the Elevate conference in Toronto earlier this week – 20 minutes to get past the talking points and into the real contradictions eating away at OpenAI's carefully constructed image. It wasn't easy or wholly successful. Lehane is genuinely good at his job. He's likable. He sounds reasonable. He admits uncertainty. He even talks about waking up at 3 a.m. worried about whether any of this will actually benefit humanity.

But good intentions don't mean much when your company is subpoenaing critics, draining economically depressed towns of water and electricity, and bringing dead celebrities back to life to assert your market dominance.

The company's Sora problem is really at the root of everything else. The video generation tool launched last week with copyrighted material seemingly baked right into it. It was a bold decision for a company already getting sued by the New York Times, the Toronto Star, and half the publishing industry. From a business and marketing standpoint, it was also brilliant. The invite-only app soared to the top of the App Store as people created digital versions of themselves; of OpenAI CEO Sam Altman; of characters like Pikachu, Mario, and Cartman of "South Park"; and of dead celebrities like Tupac Shakur.

Asked what drove OpenAI's decision to launch this newest version of Sora with these characters, Lehane gave me the standard pitch: Sora is a "general purpose technology" like electricity or the printing press, democratizing creativity for people without talent or resources. Even he – a self-described creative zero – can make videos now, he said on stage.

What he danced around is that OpenAI initially "let" rights holders opt out of having their work used to train Sora, which is not how copyright law typically works. Then, after OpenAI noticed that people really liked using copyrighted images (of course they did), it "evolved" toward an opt-in model. That's not really iterating. That's testing how much you can get away with. (And by the way, though the Motion Picture Association made some noise last week about legal threats, OpenAI appears to have gotten away with quite a lot.)

Naturally, the situation brings to mind the aggravation of publishers who accuse OpenAI of training on their work without sharing the financial spoils. When I pressed Lehane about publishers getting cut out of the economics, he invoked fair use, the American legal doctrine that's supposed to balance creator rights against public access to knowledge. He called it the secret weapon of U.S. tech dominance.


Maybe. But I'd recently interviewed Al Gore – Lehane's old boss – and realized anyone could simply ask ChatGPT about it instead of reading my piece on TechCrunch. "It's 'iterative,'" I said, "but it's also a replacement."

For the first time, Lehane dropped his spiel. "We're all going to need to figure this out," he said. "It's really glib and easy to be here on stage and say we need to figure out new economic revenue models. But I think we will." (We're making it up as we go, in short.)

Then there's the infrastructure question nobody wants to answer honestly. OpenAI is already operating a data center campus in Abilene, Texas, and recently broke ground on a massive data center in Lordstown, Ohio, in partnership with Oracle and SoftBank. Lehane has likened access to AI to the advent of electricity – saying those who accessed it last are still playing catch-up – yet OpenAI's Stargate project is seemingly targeting some of those same economically challenged places as spots to set up facilities with their massive appetites for water and electricity.

Asked during our sit-down whether these communities will benefit or simply foot the bill, Lehane went to gigawatts and geopolitics. OpenAI needs about a gigawatt of energy per week, he noted. China brought on 450 gigawatts last year, plus 33 nuclear facilities. If democracies want democratic AI, they have to compete. "The optimist in me says this will modernize our energy systems," he said, painting a picture of a re-industrialized America with transformed power grids.

It was inspiring. But it was not an answer to whether people in Lordstown and Abilene are going to watch their utility bills spike while OpenAI generates videos of John F. Kennedy and The Notorious B.I.G. (Video generation is the most energy-intensive AI out there.)

Which brought me to my most uncomfortable example. Zelda Williams had spent the day before our interview begging strangers on Instagram to stop sending her AI-generated videos of her late father, Robin Williams. "You're not making art," she wrote. "You're making disgusting, over-processed hotdogs out of the lives of human beings."

When I asked how the company reconciles this kind of intimate harm with its mission, Lehane answered by talking about processes, including responsible design, testing frameworks, and government partnerships. "There is no playbook for this stuff, right?"

Lehane showed vulnerability in some moments, saying that he wakes up at 3 a.m. every night, worried about democratization, geopolitics, and infrastructure. "There's tremendous responsibilities that come with this."

Whether or not those moments were designed for the audience, I believe him. Indeed, I left Toronto thinking I'd watched a master class in political messaging – Lehane threading an impossible needle while dodging questions about company decisions that, for all I know, he doesn't even agree with. Then Friday happened.

Nathan Calvin, a lawyer who works on AI policy at a nonprofit advocacy organization, Encode AI, revealed that at the same time I was talking with Lehane in Toronto, OpenAI had sent a sheriff's deputy to his house in Washington, D.C., during dinner to serve him a subpoena. They wanted his private messages with California legislators, college students, and former OpenAI employees.

Calvin is accusing OpenAI of intimidation tactics around a new piece of AI regulation, California's SB 53. He says the company weaponized its legal battle with Elon Musk as a pretext to target critics, implying Encode was secretly funded by Musk. In fact, Calvin says he fought OpenAI's opposition to California's SB 53, an AI safety bill, and that when he saw the company claim it "worked to improve the bill," he "literally laughed out loud." In a social media thread, he went on to call Lehane specifically the "master of the political dark arts."

In Washington, that might be a compliment. At a company like OpenAI, whose mission is "to build AI that benefits all of humanity," it sounds like an indictment.

What matters much more is that even OpenAI's own people are conflicted about what they're becoming.

As my colleague Max reported last week, a number of current and former employees took to social media after Sora 2 was released, expressing their misgivings. They included Boaz Barak, an OpenAI researcher and Harvard professor, who wrote that Sora 2 is "technically amazing but it's premature to congratulate ourselves on avoiding the pitfalls of other social media apps and deepfakes."

On Friday, Josh Achiam – OpenAI's head of mission alignment – tweeted something even more remarkable about Calvin's accusation. Prefacing his comments by saying they were "possibly a risk to my whole career," Achiam went on to write of OpenAI: "We can't be doing things that make us into a frightening power instead of a virtuous one. We have a duty to and a mission for all of humanity. The bar to pursue that duty is remarkably high."

That's . . . something. An OpenAI executive publicly questioning whether his company is becoming "a frightening power instead of a virtuous one" isn't on a par with a rival taking shots or a reporter asking questions. This is someone who chose to work at OpenAI, who believes in its mission, and who is now acknowledging a crisis of conscience despite the professional risk.

It's a crystallizing moment. You can be the best political operative in tech, a master at navigating impossible situations, and still end up working for a company whose actions increasingly conflict with its stated values – contradictions that may only intensify as OpenAI races toward artificial general intelligence.

It has me thinking that the real question isn't whether Chris Lehane can sell OpenAI's mission. It's whether others – including, critically, the other people who work there – still believe it.
