d-Matrix, a specialist in generative AI inference for data centres, has raised $275 million in a Series C round, lifting its valuation to $2 billion and total funding to $450 million.
The capital will accelerate product rollout, global expansion and large-scale deployments of what the company claims is the world’s highest-performing and most energy-efficient inference platform for hyperscalers, enterprises and sovereign AI customers.
The oversubscribed round was co-led by BullhoundCapital, Triatomic Capital and Temasek, with new investors including Qatar Investment Authority and EDBI.
Existing backers M12 (Microsoft’s venture arm), Nautilus Venture Partners, Industry Ventures and Mirae Asset also joined the round.
d-Matrix’s full-stack inference system integrates compute and memory on a single platform, paired with high-speed networking and inference-focused software. The company says the system delivers 10× the performance of GPU-based systems at one-third the cost, with 3–5× better energy efficiency.
Powered by its Corsair™ accelerators, JetStream™ NICs and Aviator™ software, the platform can generate 30,000 tokens per second at 2 ms per token on a Llama 70B model, and run 100B-parameter models within a single rack.
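The throughput and latency figures imply a level of concurrency the announcement does not spell out. A minimal sanity-check sketch, assuming (the article does not say) that 2 ms is the per-stream inter-token latency and 30,000 tokens per second is aggregate throughput across streams:

```python
# Back-of-the-envelope check of the quoted Corsair figures.
# Assumptions (not stated in the article): 2 ms is the per-stream
# inter-token latency, and 30,000 tokens/s is aggregate throughput
# across concurrent streams.

latency_ms = 2            # claimed time per token for one stream
aggregate_tps = 30_000    # claimed platform-wide tokens per second

per_stream_tps = 1000 / latency_ms            # 500 tokens/s per stream
implied_streams = aggregate_tps / per_stream_tps

print(f"{per_stream_tps:.0f} tokens/s per stream")   # 500 tokens/s per stream
print(f"~{implied_streams:.0f} concurrent streams")  # ~60 concurrent streams
```

Under those assumptions, the headline number corresponds to roughly 60 streams served in parallel at the quoted per-token latency.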
The gains come as the industry faces rising sustainability pressures. d-Matrix claims one of its data centre deployments can match the output of ten GPU-based centres, offering a path to lower global AI power consumption while improving economics for enterprise AI applications.
“From day one, we focused on inference,” said CEO and co-founder Sid Sheth. “We knew that once models had to run continuously at scale, existing infrastructure wouldn’t keep up. We’ve spent six years building an architecture for the Age of AI Inference.”
Investor appetite also reflects the company’s expanding customer base and ecosystem partnerships, including the recently announced SquadRack™ open-standards reference architecture with Arista, Broadcom and Supermicro. Upcoming 3D memory-stacking innovations and a customer-driven go-to-market strategy further cement d-Matrix’s position in the emerging AI infrastructure stack.