
7:00 AM PST · February 5, 2026
An AI laboratory called Fundamental emerged from stealth on Thursday, offering a new foundation model to solve an old problem: how to draw insights from the vast quantities of structured data produced by enterprises. By combining the older systems of predictive AI with more modern tools, the company believes it can reshape how large enterprises analyze their data.
“While LLMs have been great at working with unstructured data, like text, audio, video, and code, they don’t work well with structured data like tables,” CEO Jeremy Fraenkel told TechCrunch. “With our model Nexus, we have built the best foundation model to handle that kind of data.”
The idea has already drawn significant interest from investors. The company is emerging from stealth with $255 million in funding at a $1.2 billion valuation. The bulk of it comes from a new $225 million Series A round led by Oak HC/FT, Valor Equity Partners, Battery Ventures, and Salesforce Ventures; Hetz Ventures also participated in the Series A, with angel funding from Perplexity CEO Aravind Srinivas, Brex co-founder Henrique Dubugras, and Datadog CEO Olivier Pomel.
Called a Large Tabular Model (LTM) rather than a Large Language Model (LLM), Fundamental’s Nexus breaks from modern AI practices in a number of important ways. The model is deterministic, meaning it will give the same answer every time it is asked a given question, and it doesn’t rely on the transformer architecture that defines models from most modern AI labs. Fundamental calls it a foundation model because it goes through the usual steps of pre-training and fine-tuning, but the result is something profoundly different from what a customer would get when partnering with OpenAI or Anthropic.
Those differences matter because Fundamental is chasing a use case where modern AI models often falter. Because transformer-based AI models can only process data that fits within their context window, they often have trouble reasoning over extremely large datasets, such as a spreadsheet with billions of rows. But that kind of enormous structured dataset is common within large enterprises, creating a significant opportunity for models that can handle the scale.
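To see why a context window is the bottleneck here, a back-of-envelope estimate helps. The figures below (column count, tokens per cell, context size) are illustrative assumptions, not vendor specifications:

```python
# Rough estimate (assumed numbers, not vendor specs): why a billion-row table
# cannot be fed to a transformer as raw serialized text.

ROWS = 1_000_000_000      # a billion-row table, per the example above
COLS = 20                 # assumed column count
TOKENS_PER_CELL = 3       # rough assumption for one serialized cell value

table_tokens = ROWS * COLS * TOKENS_PER_CELL
context_window = 200_000  # typical frontier-LLM context size (assumption)

print(f"table ≈ {table_tokens:,} tokens")
print(f"fits in context: {table_tokens <= context_window}")
print(f"overflow factor: ~{table_tokens // context_window:,}x")
```

Even under these generous assumptions, the table overflows the context window by several orders of magnitude, which is why tabular-scale analysis has traditionally fallen to databases and purpose-built statistical models rather than LLM prompting.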
As Fraenkel sees it, that’s a huge opportunity for Fundamental. Using Nexus, the company can bring modern techniques to Big Data analysis, offering something more powerful and flexible than the algorithms currently in use.
“You can now have one model across all of your use cases, so you can now massively expand the number of use cases that you tackle,” he told TechCrunch. “And on every one of those use cases, you get better performance than what you would otherwise be able to do with an army of data scientists.”
That promise has already brought in a number of high-profile deals, including seven-figure contracts with Fortune 100 clients. The company has also entered into a strategic partnership with AWS that will allow AWS users to deploy Nexus directly from existing instances.
Russell Brandom has been covering the tech industry since 2012, with a focus on platform policy and emerging technologies. He previously worked at The Verge and Rest of World, and has written for Wired, The Awl, and MIT’s Technology Review. He can be reached at russell.brandom@techcrunch.com or on Signal at 412-401-5489.














