International Finance

Start-up of the Week: SambaNova ignites global AI buzz

SambaNova claims its AI solution has been designed to empower enterprises to control the trajectory of their data and AI future

California-based SambaNova Systems, established in 2017, recently raised USD 350 million in a new funding round and struck a partnership with Intel, as it looks to capitalise on surging demand for inference chips used in artificial intelligence (AI) applications. Inference chips run trained AI models and power real-time decisions, and the technology has attracted intense investor interest as AI companies seek faster, more efficient hardware.

The funding round was led by private equity firms Vista Equity Partners and Cambium Capital, with Intel’s investment arm, Intel Capital, also participating. According to Reuters, the proceeds will fund the rollout of SambaNova’s new SN50 AI chip, scale the start-up’s SambaCloud platform and, most importantly, deepen enterprise software integrations. SoftBank Corp, Japan’s telecommunications and internet giant, will be the first customer to deploy the SN50 chip in its AI data centres in the country.

SambaNova and Intel also signed a multi-year agreement to deliver cost-effective AI inference solutions for AI-native companies, complementing the Lip-Bu Tan-led chipmaker’s existing data centre GPU commitments. The two companies had earlier explored a merger: Tan, who also serves as SambaNova’s executive chairman, had previously discussed acquiring the start-up for roughly USD 1.6 billion, including debt, but the talks stalled.

Creating The Next-gen AI Infrastructure

SambaNova claims its AI solution has been designed to empower enterprises to control the trajectory of their data and AI future. According to Kunle Olukotun, the venture’s co-founder and chief technologist, “Foundation models represent a paradigm shift in AI and deep learning – truly transforming the value organisations can derive from AI. We’re innovating at every layer of the AI stack to deliver the fully integrated AI platform that will serve as the technology backbone for this next generation of AI computing and innovation.”

And talking about innovation, SambaNova’s activities since 2025 have been a string of industry firsts. Take February 2026, when, alongside the Intel collaboration and the new funding, the company introduced its SN50 AI chip. The chip offers enterprises a 3X lower total cost of ownership and a powerful foundation to scale fast inference and bring autonomous AI agents into full production. The SN50 will ship to customers by the year-end.

Then, in November 2025, European cloud leader OVHcloud announced SambaNova as a building block to complement its portfolio of inference solutions, with a focus on ultra-low latency inference. The tie-up could be a game-changing one, as OVHcloud believes organisations building next-generation AI workloads will face increasingly sharp constraints: sequential LLM calls introducing latency bottlenecks, user-facing applications requiring immediate responses, and operational pipelines that must scale to millions of inferences with strict performance guarantees on time to first token and time per output token.
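The two latency guarantees mentioned here, time to first token (TTFT) and time per output token (TPOT), are standard metrics for streaming LLM inference. A minimal sketch of how they are measured against any token stream (the simulated stream below is purely illustrative, not a SambaNova or OVHcloud API):

```python
import time

def measure_stream_latency(token_stream):
    """Measure time to first token (TTFT) and average time per
    output token (TPOT) for an iterable of streamed tokens."""
    start = time.perf_counter()
    first_token_at = None
    count = 0
    for _ in token_stream:
        now = time.perf_counter()
        if first_token_at is None:
            first_token_at = now  # arrival of the first token
        count += 1
    end = time.perf_counter()
    ttft = first_token_at - start
    # TPOT averages the inter-token gaps after the first token.
    tpot = (end - first_token_at) / max(count - 1, 1)
    return ttft, tpot

# Hypothetical stream: ~50 ms to first token, then ~10 ms per token.
def fake_stream():
    time.sleep(0.05)
    yield "Hello"
    for _ in range(9):
        time.sleep(0.01)
        yield "tok"

ttft, tpot = measure_stream_latency(fake_stream())
print(f"TTFT ~ {ttft * 1000:.0f} ms, TPOT ~ {tpot * 1000:.0f} ms")
```

TTFT dominates perceived responsiveness in user-facing chat, while TPOT bounds sustained generation throughput, which is why providers quote guarantees on both.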

The OVHcloud and SambaNova partnership will serve a wide range of use cases in sectors such as financial trading, cybersecurity, industrial automation, logistics optimisation and monitoring, where slow inference can translate into operational blind spots or a degraded user experience.

About a month ago, SambaNova announced a partnership with InfercomAI to launch Europe’s first sovereign Inference-as-a-Service platform. The agreement will see Infercom deploy the SambaManaged platform across strategic European locations before expanding to additional sites across the region. Around the same time, Scotland-based Argyll Data Development announced a strategic partnership with SambaNova to deliver the United Kingdom’s first renewable-powered AI inference cloud. The deployment will anchor the “Killellan AI Growth Zone,” a 184-acre green digital campus on Scotland’s Cowal Peninsula, creating a blueprint for nations looking to combine AI sovereignty, energy independence and sustainability.

SambaNova’s innovation has made its presence felt in Australia as well, with the start-up launching the country’s first ASIC-based sovereign AI cloud, a breakthrough platform engineered to empower regional governments and businesses with secure, high-performance and onshore AI infrastructure. This new alliance, announced in 2025, marked a milestone in Australia’s pursuit of digital independence and sustainable technology leadership.

The Products

Powered by SambaNova’s RDU chip, SambaCloud delivers fast inference on the best and largest models. All inference speeds are independently benchmarked and reported by “Artificial Analysis,” and the solution follows strict data-privacy principles, protecting clients’ sensitive business information through robust security controls (backed by cloud providers like Amazon and Google), rigorous risk management processes, and ongoing monitoring and improvement initiatives.

SambaCloud also supports a range of leading open-source models, such as DeepSeek, Llama and Qwen, covering the text, image and audio processing stages of AI application building. Next is SambaStack, which offers an industry-leading hardware and software stack purpose-built for AI inference. With the flexibility to deploy on-premises or in the cloud, organisations now have a potent option to accelerate their AI innovation with dedicated SambaNova infrastructure.

SambaStack delivers “chip-to-model intelligence,” powered by SambaRack, which the start-up claims is the “most efficient rack for AI.” Drawing an average of 10 kW, it is designed to extract more intelligence from every joule of energy. SambaStack also delivers the fastest inference on the best AI models, including DeepSeek and Llama. The same chip-to-model approach helps tech companies set up a data centre in weeks and start processing millions of tokens in their private clouds. OVHcloud’s collaboration with SambaNova provides Europe’s most advanced AI inference platform, featuring the speed, scalability and efficiency essential for mission-critical generative AI workloads.

SambaNova’s Reconfigurable Dataflow Units (RDUs) enable OVHcloud to deliver unprecedented throughput per watt. Each SambaRack SN40L-16 runs large models at an average of 10 kW, outperforming legacy solutions while minimising the carbon footprint. OVHcloud’s “Premium AI Endpoints” meet developers’ need for fast inference on open-source AI models while delivering four times better energy efficiency than traditional GPUs, reducing physical footprint and power consumption and making AI sustainable. SambaNova’s three-tier memory architecture also enables OVHcloud to serve more models with less hardware: multiple models can be hosted per SambaRack and hot-swapped at runtime with very low switching times.
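"Hot-swapping" here means replacing the model a serving endpoint runs without restarting it or dropping in-flight requests. A toy sketch of the idea at the API level, under the assumption of a single serving slot guarded by a lock; all names and the dictionary-backed "models" are hypothetical, not SambaNova or OVHcloud interfaces:

```python
import threading

class ModelSlot:
    """Toy serving slot: requests keep flowing while the served
    model is replaced behind a lock (illustrative only)."""

    def __init__(self, model):
        self._model = model
        self._lock = threading.Lock()

    def swap(self, new_model):
        # Replace the served model in place; no server restart.
        with self._lock:
            self._model = new_model

    def infer(self, prompt):
        # Grab a consistent reference, then run outside the lock.
        with self._lock:
            model = self._model
        return model(prompt)

# Stand-in "models" are plain callables here.
slot = ModelSlot(lambda p: f"llama:{p}")
print(slot.infer("hi"))   # llama:hi
slot.swap(lambda p: f"deepseek:{p}")
print(slot.infer("hi"))   # deepseek:hi
```

In a real rack the expensive part is moving weights into device memory, which is why the article highlights the three-tier memory architecture: keeping several models staged close to the chips is what makes the switch time low.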

Mention must be made of “SambaManaged,” through which the start-up helps its tech sector clients build their own AI inference cloud (powered by the SambaNova SN40L RDU chip) within existing data centre infrastructure, with minimal modifications. The chip delivers lightning-fast inference on the best open-source models with an extremely efficient power footprint. Launched in July 2025, the solution was the industry’s first inference-optimised data centre product offering, deployable in just 90 days rather than the typical 18 to 24 months.

The breakthrough was a timely one. With global AI inference demands soaring, traditional data centres were grappling with lengthy deployment timelines of 18-24 months, extensive power requirements, and costly facility upgrades. SambaManaged now addresses these critical barriers, enabling organisations to quickly launch profitable AI inference services leveraging existing power and network infrastructure.

It has already notched a few firsts. Apart from setting a new industry benchmark for performance per watt, maximising return on investment and reducing total cost of ownership, the 90-day timeline for standing up a fully managed AI inference service has helped the start-up’s clients minimise integration challenges and accelerate time to value.

Another prominent feature of SambaManaged is lightning-fast inference with leading open-source models, ensuring no vendor lock-in and future-proof operations. Businesses can scale from small to large deployments with ease, up to a 1 MW “Token Factory” (100 racks, or 1,600 chips) or larger that grows with evolving operational needs.
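The Token Factory figures quoted in the article are self-consistent with the 10 kW per-rack average cited earlier. A quick arithmetic check (all numbers taken from the article, nothing independently measured):

```python
# Figures quoted in the article.
racks = 100
rack_power_kw = 10      # average SambaRack draw cited earlier
total_chips = 1600

total_power_mw = racks * rack_power_kw / 1000
chips_per_rack = total_chips // racks

print(total_power_mw)   # 1.0  -> the "1 MW" Token Factory
print(chips_per_rack)   # 16   -> matches the SN40L-16 rack naming
```

So 100 racks at 10 kW each account for exactly the 1 MW headline figure, with 16 chips per rack, consistent with the SN40L-16 designation.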

Apart from unlocking new revenue streams by ensuring that existing infrastructure delivers leading-edge AI inference services without increasing energy consumption, clients get another advantage: SambaNova’s cloud experts provide technical support from day one.
