Fundamental raises $255 million Series A with a new take on big data analysis | TechCrunch
Briefly

"While LLMs have been great at working with unstructured data, like text, audio, video, and code, they don't work well with structured data like tables,"
"With our model Nexus, we have built the best foundation model to handle that type of data."
"Called a Large Tabular Model (LTM) rather than a Large Language Model (LLM), Fundamental's Nexus breaks from contemporary AI practices in a number of significant ways. The model is deterministic - that is, it will give the same answer every time it is asked a given question - and doesn't rely on the transformer architecture that defines models from most contemporary AI labs."
An AI lab named Fundamental unveiled Nexus, a Large Tabular Model (LTM) designed to analyze structured enterprise data such as tables. Fundamental raised $255 million, including a $225 million Series A led by Oak HC/FT, Valor Equity Partners, Battery Ventures, and Salesforce Ventures, with participation from Hetz Ventures and angel investors. Nexus departs from transformer-based architectures and is deterministic, producing the same answer every time it is asked the same question. Like other foundation models, it undergoes pre-training and fine-tuning, but it behaves differently from typical LLMs. The design targets scenarios where transformer-based LLMs struggle, particularly reasoning over very large structured datasets that exceed context-window limits.
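
To make the context-window point concrete, here is a back-of-the-envelope sketch in Python. All figures below (row count, tokens per row, window size) are illustrative assumptions, not numbers from the article; the point is only how quickly a table serialized as text outgrows an LLM's context.

    ROWS = 10_000_000            # rows in a hypothetical large enterprise table (assumed)
    TOKENS_PER_ROW = 20          # rough token cost to serialize one row as text (assumed)
    CONTEXT_WINDOW = 1_000_000   # a generous LLM context window, in tokens (assumed)

    table_tokens = ROWS * TOKENS_PER_ROW
    print(f"Serialized table: ~{table_tokens:,} tokens")
    print(f"Context window:    {CONTEXT_WINDOW:,} tokens")
    print(f"Overflow factor:  ~{table_tokens / CONTEXT_WINDOW:.0f}x")

At these assumed sizes the serialized table is roughly 200 times larger than the context window, which is the kind of gap the article says a tabular foundation model like Nexus is meant to address.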
Read at TechCrunch