
SAP Just Bought Prior Labs. Europe Has a Frontier AI Lab Now.

Marcus Chen · 7 min read

SAP signed a definitive agreement on May 4 to acquire Prior Labs, an 18-month-old German AI startup based in Freiburg. The headline number is the four-year capital commitment: more than 1 billion euros to scale Prior Labs into what SAP and the founders are calling a globally leading frontier AI lab in Europe.

That phrase has been wishful thinking on the continent for two years. As of this week, it has a balance sheet behind it. Europe's most valuable listed company just used its market cap to buy a frontier lab outright, anchored to a German university town and pointed at a category nobody else is racing for.

The play is not LLMs. It is tabular foundation models. And the more time I spent with the deal, the more it started to look like a smarter bet than the obvious one.

What Prior Labs Actually Builds

Prior Labs was founded in late 2024 by Frank Hutter, a longtime AutoML researcher at the University of Freiburg and ELLIS Tübingen, alongside Noah Hollmann and Sauraj Gambhir. Their flagship is the TabPFN model series: transformer-based foundation models pre-trained on roughly 130 million synthetic tabular datasets. The original paper, "Accurate predictions on small data with a tabular foundation model," ran in Nature in 2025.

The benchmark result that put TabPFN on the map is also the one that explains why SAP cares. On the standard small-data tabular setting, TabPFN beats an ensemble of XGBoost, CatBoost, and LightGBM baselines that were given four hours of tuning compute. TabPFN gets there in 2.8 seconds with no tuning at all. The current flagship, TabPFN-2.6, sits at the top of TabArena, the leading tabular benchmark.

That speed and accuracy delta matters because of what tabular data actually is. It is the rows and columns that run businesses: transactions, supplier records, GL entries, customer accounts, claims tables, parts inventories, lab results. SAP is the company that stores most of those tables for the Fortune 500. The Prior Labs team, by SAP's own description, was recruited from Google, Apple, Amazon, Microsoft, G-Research, Jane Street, Goldman Sachs, and CERN. That is a tabular-AI roster you cannot rebuild from scratch in under three years.

Why This Category, Not LLMs

Here is the thing nobody on AI Twitter says out loud. Large language models are extraordinary at unstructured text and code, and surprisingly bad at structured business data. They have a rudimentary statistical understanding of tables, they hallucinate arithmetic on long columns, and the "just put it in the context" pattern collapses past a few thousand rows. That is why every enterprise AI deployment of the last 24 months has ended with the same architecture diagram: an LLM out front, a SQL warehouse and a forecasting library doing the actual numerical work in back.
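The "collapses past a few thousand rows" claim is easy to sanity-check with back-of-envelope arithmetic. The figures below are illustrative assumptions, not vendor specs: roughly 8 tokens per serialized cell and a 200k-token context window are round numbers chosen for the sketch.

```python
# Back-of-envelope: why pasting a table into an LLM prompt stops scaling.
# Both constants are illustrative assumptions, not measured vendor figures.

TOKENS_PER_CELL = 8        # a numeric or short-text cell, serialized as CSV
CONTEXT_WINDOW = 200_000   # tokens available for the whole prompt

def max_rows(n_columns: int, overhead: int = 2_000) -> int:
    """Rows of an n-column table that fit after reserving prompt overhead."""
    tokens_per_row = n_columns * TOKENS_PER_CELL
    return (CONTEXT_WINDOW - overhead) // tokens_per_row

for cols in (10, 30, 80):
    print(cols, "columns ->", max_rows(cols), "rows")
# 10 columns -> 2475 rows
# 30 columns -> 825 rows
# 80 columns -> 309 rows
```

Under these assumptions even a narrow 10-column table tops out around 2,500 rows, and a wide ERP-style table overflows the window before 400. Real business tables run to millions of rows, which is why the context-stuffing pattern fails.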

Tabular foundation models attack the back of that diagram directly. They eat the SQL output and the forecasting library both. Once a single model can do classification, regression, anomaly detection, and time series across heterogeneous tables without per-task training, an enormous chunk of enterprise data work collapses into a single API call.
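The interface shape that makes "a single API call" plausible can be sketched with a toy stand-in. Everything below is hypothetical illustration, not Prior Labs code: a trivial 1-nearest-neighbour rule plays the role of the model. The point is the contract, which mirrors how TabPFN is described in its paper: fit() merely stores labelled rows as context, and predict() is a single inference pass with no per-task training loop.

```python
# Interface-shape sketch only. A 1-nearest-neighbour rule stands in for a
# tabular foundation model; the real thing replaces the distance rule with
# a transformer forward pass over the context set. Class is hypothetical.
import math

class InContextTabularModel:
    def fit(self, X, y):
        # "Training" is just caching the labelled context rows.
        self.X, self.y = X, y
        return self

    def predict(self, X_new):
        # One pass over the context per query row; no gradient steps.
        preds = []
        for row in X_new:
            dists = [math.dist(row, ctx) for ctx in self.X]
            preds.append(self.y[dists.index(min(dists))])
        return preds

model = InContextTabularModel().fit(
    X=[[1.0, 0.2], [0.9, 0.1], [5.0, 4.8]],
    y=["low_risk", "low_risk", "high_risk"],
)
print(model.predict([[4.7, 5.1]]))  # nearest context row is the high_risk one
```

Swap the toy rule for a pre-trained transformer and the same two-method contract covers classification, regression, and anomaly scoring across heterogeneous tables, which is the collapse the paragraph above describes.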

SAP's framing is honest about it. Their press release says LLMs struggle to make accurate predictions on structured business data because they have only a rudimentary understanding of tables, numbers, and statistics. That is the pitch. Prior Labs is the model that does what GPT-5.5 and Claude Opus 4.7 cannot.

The Numbers on the Deal

Terms were not disclosed, but the publicly committed pieces tell a coherent story.

Metric | Value | Notes
Capital commitment | 1B+ euros | Over 4 years, post-close
Prior Labs prior funding | 9M euros | Single pre-seed, Feb 2025, led by Balderton
Time from pre-seed to exit | ~15 months | Among the fastest in European tech
Independence post-close | Yes | Operates as independent entity under SAP
Headquarters | Freiburg, Germany | Plus Berlin and New York City offices
Expected close | Q2 or Q3 2026 | Subject to regulatory approval

The 9 million to 1 billion-plus arc is the part that will get repeated. A pre-seed-only startup, less than 18 months from first check, hits a billion-euro programmatic commitment from the continent's most valuable public company. European venture partners (Balderton Capital, Atlantic Labs, XTX Ventures, Hector Foundation) booked a return that, even with terms undisclosed, almost certainly beats anything else in their portfolios this cycle.

More importantly for the ecosystem, Prior Labs stays independent. That is the structure that lets a research lab keep operating like a research lab inside a 70-year-old enterprise software vendor. It is the same shape Google used with DeepMind for its first decade, and the same shape Anthropic and OpenAI maintain inside their respective cloud relationships. Independence is the load-bearing legal phrase.

SAP's Two-Acquisition Stack

Prior Labs did not come alone. SAP also disclosed an agreement to acquire Dremio, the open data lakehouse company, in the same announcement. Read together, the two deals describe an enterprise AI architecture in plain language.

Dremio is the unified query layer over the data lake. Prior Labs is the foundation model that operates on the result. SAP's own ERP and HANA install base is the demand side. Joule, SAP's existing assistant, is the surface where customers interact with the stack. The architecture is: open data, open compute, frontier tabular model, embedded into SAP's enterprise products.
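The stack above can be made concrete with a few stub functions. To be clear about assumptions: none of these function names or data shapes are real SAP, Dremio, or Prior Labs APIs; this is hypothetical plumbing that shows how the three announced layers compose, with a toy rule standing in for the model call.

```python
# Hypothetical plumbing only: query layer -> tabular model -> assistant surface.
# No function here corresponds to a real SAP, Dremio, or Prior Labs API.

def query_lakehouse(sql: str) -> list[dict]:
    # Stand-in for the Dremio-style unified query layer over the data lake.
    return [{"supplier": "ACME", "days_late": 12, "open_invoices": 4}]

def score_rows(rows: list[dict]) -> list[float]:
    # Stand-in for a tabular foundation model call: one inference pass,
    # no per-task training. A toy linear rule replaces the real model.
    return [min(1.0, r["days_late"] / 30 + 0.05 * r["open_invoices"]) for r in rows]

def assistant_answer(question: str) -> str:
    # Stand-in for the Joule-style surface where the customer sees the result.
    rows = query_lakehouse("SELECT * FROM supplier_risk")
    risk = score_rows(rows)[0]
    return f"{rows[0]['supplier']}: predicted delivery risk {risk:.2f}"

print(assistant_answer("Which suppliers are likely to slip next quarter?"))
# ACME: predicted delivery risk 0.60
```

The design point is that the model layer sits between the open query layer and the assistant, which is exactly the slot SAP just bought rather than rented.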

The competitive read is that this is SAP looking at Salesforce Agentforce, ServiceNow Now Assist, Microsoft Copilot, and the emerging Anthropic-finance-vendor stack we covered last week, and deciding that the way to outcompete in the next decade is not to ship a chatbot on top of someone else's model. It is to own the model layer for the data shape SAP customers care about most.

The European Sovereignty Read

European AI policy has spent two years trying to manufacture exactly this outcome. Sovereign-cloud awards, the AI Act, the Mistral national-champion narrative, and the increasingly explicit push from European leaders for capital deployment at scale inside the EU all pointed at the same gap. There was no European frontier lab with continental capital, continental headquarters, and a research mandate at frontier scale.

Mistral, with respect, is not it at this scale. The French lab is excellent and shipping (we covered Mistral Medium 3.5 last week), but it is also nowhere close to 1 billion euros of patient, dedicated R&D capital from a single committed industrial backer with a captive customer base. SAP just wrote that check.

The one caveat I would put on the sovereignty narrative: Prior Labs already has a New York office and the SAP deal is structured for global research output, not regional protectionism. That is almost certainly the right call for actually doing frontier work, and it should defuse anyone trying to spin this as a closed European garden. The lab will compete globally on output. It just happens to be headquartered in Freiburg.

What This Pressures

A few obvious second-order moves to watch over the next two quarters.

Salesforce and Oracle now have to make a tabular AI play of their own. Salesforce has the Tableau acquisition and Einstein, but no frontier tabular foundation model and nobody publicly leading one. Oracle is in roughly the same position. Both companies sit on arguably as much structured data as SAP and have spent the past 18 months telling investors AI is integral to the next leg. They will either acquire, partner, or quietly de-emphasize the structured-AI narrative in favor of the LLM-on-top story.

The frontier labs (OpenAI, Anthropic, Google) probably do not move on this directly. Tabular foundation modeling is a narrow research program with a different training distribution than the web-scale-text recipe their entire stack is built around. They can partner, license, or fold tabular into a larger multimodal frame, but they are unlikely to go heads-down on it the way Prior Labs has. The economics for them are better at the LLM frontier.

The interesting pressure is on Databricks. Databricks owns the data surface, has been investing heavily in MosaicML and bespoke training, and has the most direct overlap with the SAP-Dremio-Prior Labs combined stack. If anyone in the US replies in kind, it is almost certainly Databricks plus an academic lab acquisition.

Our Take

For three years the running joke about European AI was that the continent had the regulators and the GPUs and the universities, and was still missing the lab. The SAP-Prior Labs deal does not solve everything (compute is still the long pole, and a four-year ramp is slower than the US frontier cadence), but it changes the answer to the question. Europe has a frontier AI lab. It is funded for the decade. It is targeting the AI category that is most economically relevant to its largest customer base. And it is independent inside a parent that needs the research output to remain competitive.

The other thing worth saying: the fact that the deal is for tabular foundation models, not another LLM, is what makes it strategically interesting. The LLM race is crowded, expensive, and arguably commoditizing at the frontier. Tabular is wide open, expensive, and stickier in the enterprise. SAP picked the harder, narrower problem and put a billion euros behind solving it. That is a better bet on the substance than another European LLM ever could have been.

We are adding TabPFN-2.6 to our models tracker today and will be watching the next-generation TabPFN release cadence closely. Frank Hutter has historically published openly and shipped reference implementations on GitHub. Whether that posture survives full SAP integration is the single biggest open question in this deal, and the one we will be writing about again the next time Prior Labs ships.