Despite widespread adoption of large language models across enterprises, companies building LLM applications still lack the right tools to meet complex cognitive and infrastructure needs, often resorting to stitching together early-stage solutions available on the market. The challenge intensifies as AI models grow smarter and take on more complex workflows, requiring engineers to reason about end-to-end systems and their real-world consequences rather than judging business outcomes by examining individual inferences. TensorZero addresses this gap with an open-source stack for industrial-grade LLM applications that unifies an LLM gateway, observability, optimization, evaluation, and experimentation in a self-reinforcing loop. The platform enables companies to optimize complex LLM applications based on production metrics and human feedback while supporting the demanding requirements of enterprise environments, including sub-millisecond latency, high throughput, and complete self-hosting. TensorZero's repository hit the #1 trending spot globally on GitHub, and the platform already powers cutting-edge LLM products at frontier AI startups and large organizations, including one of Europe's largest banks.
AlleyWatch sat down with TensorZero CEO and Founder Gabriel Bianconi to learn more about the business, its future plans, recent funding round, and much, much more…
Who were your investors and how much did you raise?
We raised a $7.3M Seed round from FirstMark, Bessemer Venture Partners, Bedrock, DRW, Coalition, and angel investors.
Tell us about the product or service that TensorZero offers.
TensorZero is an open-source stack for industrial-grade LLM applications. It unifies an LLM gateway, observability, optimization, evaluation, and experimentation.
What inspired the start of TensorZero?
When we started TensorZero, we asked ourselves what LLM engineering would look like in a few years. Our answer is that LLMs will have to learn from real-world experience, just like humans do. The analogy we like here is, “If you take a really smart person and throw them at a completely new job, they won’t be great at it at first, but they will likely learn the ropes quickly from instruction or trial and error.”
This same process is very challenging for LLMs today. It will only get more complex as more models, APIs, tools, and techniques emerge, especially as teams tackle increasingly ambitious use cases. At some point, you won’t be able to judge business outcomes by staring at individual inferences, which is how most people approach LLM engineering today. You’ll have to reason about these end-to-end systems and their consequences as a whole. TensorZero is our answer to all this.
How is TensorZero different?
TensorZero enables you to optimize complex LLM applications based on production metrics and human feedback.
TensorZero supports the needs of industrial-grade LLM applications: low latency, high throughput, type safety, self-hosting, GitOps, customizability, and more.
TensorZero unifies the entire LLMOps stack, creating compounding benefits. For example, the results of LLM evaluations, including those from AI judges, can feed back into fine-tuning models.
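To make that compounding loop concrete, here is a minimal sketch in plain Python: inferences logged with downstream feedback scores are filtered, and the high-scoring ones are reused as a fine-tuning dataset. The data shapes, field names, and threshold are illustrative assumptions, not TensorZero's actual API.

```python
# Hypothetical sketch of the observability -> optimization loop:
# logged inferences with strong feedback become fine-tuning examples.

def build_finetuning_dataset(inferences, min_score=0.8):
    """Keep prompt/completion pairs whose feedback score clears a threshold."""
    dataset = []
    for inf in inferences:
        if inf["feedback"]["score"] >= min_score:
            dataset.append({
                "messages": [
                    {"role": "user", "content": inf["input"]},
                    {"role": "assistant", "content": inf["output"]},
                ]
            })
    return dataset

# Example: three logged inferences, two with strong feedback.
logged = [
    {"input": "Summarize this invoice.", "output": "Total: $120.", "feedback": {"score": 0.95}},
    {"input": "Classify this email.", "output": "spam", "feedback": {"score": 0.40}},
    {"input": "Extract the due date.", "output": "2025-06-01", "feedback": {"score": 0.88}},
]
dataset = build_finetuning_dataset(logged)
print(len(dataset))  # 2 examples survive the filter
```

The same logged data could, in principle, also score AI judges or backtest new prompts, which is where the "compounding" framing comes from.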
What market does TensorZero target and how big is it?
Companies building LLM applications, which in the future will include every large company.
What’s your business model?
Pre-revenue/open-source.
Our vision is to automate much of LLM engineering. We’re laying the foundation for that with open-source TensorZero. For example, with our data model and end-to-end workflow, we will be able to proactively suggest new variants (e.g. a new fine-tuned model), backtest it on historical data (e.g. using diverse techniques from reinforcement learning), enable a gradual, live A/B test, and repeat the process.
With a tool like this, engineers can focus on higher-level workflows — deciding what data goes in and out of these models, how to measure success, which behaviors to incentivize and disincentivize, and so on — and leave the low-level implementation details to an automated system. This is the future we see for LLM engineering as a discipline.
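The gradual, live A/B test described above can be sketched as weighted traffic routing between a baseline variant and a new candidate, with the candidate's share ramped up as evidence accumulates. The variant names and weights below are illustrative assumptions, not TensorZero's implementation.

```python
import random

def choose_variant(weights, rng=random):
    """Pick a variant name in proportion to its traffic weight."""
    names = list(weights)
    shares = list(weights.values())
    return rng.choices(names, weights=shares, k=1)[0]

# Start the candidate (e.g., a new fine-tuned model) at 5% of traffic;
# an automated system could raise this weight if production metrics hold.
weights = {"baseline": 0.95, "candidate_finetune_v1": 0.05}

rng = random.Random(0)  # seeded for reproducibility
picks = [choose_variant(weights, rng) for _ in range(10_000)]
share = picks.count("candidate_finetune_v1") / len(picks)
print(round(share, 2))  # roughly 0.05
```

In the vision described above, the engineer only sets the metric and the rollout policy; proposing candidates, backtesting them, and adjusting these weights would be automated.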
How are you preparing for a potential economic slowdown?
YOLO (we are AI optimists).
What was the funding process like?
Easy: the VCs reached out to us. It landed in our laps, realistically. Thankful for the AI cycle!
What are the biggest challenges that you faced while raising capital?
None.
What factors about your business led your investors to write the check?
Our founding team’s background and vision. When we closed, we had a single user.
What are the milestones you plan to achieve in the next six months?
Continue to grow the team (to ~10 people) and onboard more businesses.