Company Research Report: d-Matrix
Company Overview
- Name: d-Matrix
- Mission: Transforming AI from unsustainable to attainable by advancing compute capabilities for modern AI workloads.
- Founded: 2019
- Founders: Sid Sheth (Founder & CEO) and Sudeep Bhoja (Founder & CTO)
- Key People:
  - Sid Sheth, Founder & CEO
  - Sudeep Bhoja, Founder & CTO
  - Ranganathan “Suds” Sudhakar, Chief Development Officer
  - Peter Buckingham, Senior Vice President, Software
  - Leon Bezdikian, Vice President of Human Resources
  - Sree Ganesan, Vice President of Product
  - PJ Jamkhandi, Vice President of Finance & Accounting
  - Richard Ogawa, General Counsel
  - Jerry Qubain, Vice President of Manufacturing, Operations
- Board of Directors: includes Sasha Ostojic, Jeff Huber, Michael Stewart, and Connie Sheng
- Headquarters: Santa Clara, California
- Number of Employees: Not publicly disclosed
- Revenue: Not publicly disclosed
- Known For: An innovative AI inference computing platform, in particular its product Corsair™, which the company describes as the world’s most efficient AI inference platform for datacenters.
Products
- Corsair™ AI Inference Platform
  - Description: Corsair™ is a high-efficiency AI inference platform designed for datacenters. It is built around d-Matrix’s Digital In-Memory Compute (DIMC) architecture, which tightly integrates compute with memory to relieve the memory-bandwidth bottleneck that limits conventional inference hardware.
  - Key Features (see the back-of-the-envelope sketches after this list):
    - Performance: 60,000 tokens/sec at 1 ms/token latency for Llama3 8B in a single server, and 30,000 tokens/sec at 2 ms/token latency for Llama3 70B in a single rack.
    - Architecture: Chiplet-based design, manufactured by Taiwan Semiconductor Manufacturing Company (TSMC).
    - Efficiency: Claimed to deliver significantly better energy efficiency and cost-performance than conventional GPUs.
    - Scalability: Scale-up/scale-out architecture that can be deployed at sizes ranging from a single card to full rack-scale systems.
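To make the memory-bottleneck argument concrete, the sketch below works through the standard batch-1 decode bound: when weights live in off-chip memory, every generated token has to stream the whole model through the memory interface. All numbers here are our own illustrative assumptions, not d-Matrix or GPU specifications.

```python
# Back-of-the-envelope ceiling on batch-1 decode throughput when weights
# must be streamed from off-chip memory for every generated token.
# All values below are illustrative assumptions, not published specs.

model_params = 8e9            # e.g. a Llama3-8B-class model
bytes_per_weight = 1          # assume 8-bit weights
weight_bytes = model_params * bytes_per_weight

memory_bandwidth = 3.0e12     # assume ~3 TB/s of off-chip bandwidth

# Each decoded token re-reads all weights, so bandwidth caps throughput.
tokens_per_sec_ceiling = memory_bandwidth / weight_bytes
print(f"batch-1 decode ceiling: ~{tokens_per_sec_ceiling:.0f} tokens/s")
# -> roughly 375 tokens/s, which is why keeping weights next to the compute,
#    as in-memory compute architectures aim to do, matters for low-latency inference.
```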
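The quoted Corsair throughput and latency figures can also be read together through Little’s law (tokens in flight = throughput × per-token latency). The arithmetic below uses only the numbers quoted above; the interpretation as roughly 60 concurrent generation streams is our own reading, not a d-Matrix statement.

```python
# Little's law reading of the quoted Corsair figures:
# tokens in flight = throughput (tokens/s) x per-token latency (s).
quoted = {
    "Llama3 8B, single server": (60_000, 1e-3),
    "Llama3 70B, single rack":  (30_000, 2e-3),
}
for config, (tokens_per_sec, latency_s) in quoted.items():
    in_flight = tokens_per_sec * latency_s
    print(f"{config}: ~{in_flight:.0f} token streams generating concurrently")
# Both configurations work out to roughly 60 concurrent streams,
# each producing a token every 1-2 ms.
```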
Recent Developments
- Product Launch:
  - d-Matrix recently launched the Corsair AI inference platform, which forgoes traditional GPU architectures in favor of their DIMC approach to compute.
- Innovations:
  - Developed industry-first solutions in silicon, software, chiplet packaging, and interconnect fabrics for accelerating AI inference, marked by innovations like DMX Link™ and DMX Bridge™.
  - Introduced support for block floating point numerical formats, enhancing inference efficiency (see the illustrative sketch after this list).
  - Presented advancements at NeurIPS, describing techniques for efficient AI inference and post-training quantization of large language models.
- Partnerships:
  - Collaborated with companies such as Supermicro, GigaIO, and Liqid to integrate Corsair into their systems, providing high-performance, scale-up infrastructure for next-generation AI workloads.
- Funding & Growth:
  - Secured significant funding, raising more than $160 million in total, backed by major venture capital investors such as Microsoft’s M12.
- Recognition:
  - Featured as one of CRN’s 10 Hottest Semiconductor Startups of 2024.
- Tech Innovations:
  - Spotlighted at industry conferences and in publications for their DIMC architecture and its potential applications, including research presentations at NeurIPS on quantization methods and efficient inference architectures.
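To give a feel for what block floating point means in practice, here is a minimal sketch of the idea: a block of values shares one power-of-two exponent while each value keeps its own small integer mantissa, so matrix math can run on cheap integer units. This is a generic illustration under our own assumptions (NumPy, 8-bit mantissas, per-block max scaling); it is not d-Matrix’s actual format or implementation.

```python
import numpy as np

def bfp_quantize(block, mantissa_bits=8):
    """Quantize a 1-D block of floats to block floating point:
    one shared power-of-two exponent plus signed integer mantissas.
    Illustrative only; real formats differ in block size and rounding."""
    max_abs = float(np.max(np.abs(block)))
    if max_abs == 0.0:
        return np.zeros(block.shape, dtype=np.int32), 0
    shared_exp = int(np.floor(np.log2(max_abs)))        # exponent shared by the block
    scale = 2.0 ** (shared_exp - (mantissa_bits - 2))   # maps floats onto the mantissa range
    qmax = 2 ** (mantissa_bits - 1) - 1
    mantissas = np.clip(np.round(block / scale), -qmax, qmax).astype(np.int32)
    return mantissas, shared_exp

def bfp_dequantize(mantissas, shared_exp, mantissa_bits=8):
    scale = 2.0 ** (shared_exp - (mantissa_bits - 2))
    return mantissas.astype(np.float32) * scale

# Round-trip a small block of "weights" to see the quantization error.
weights = np.array([0.031, -0.17, 0.52, 0.004, -0.9, 0.27], dtype=np.float32)
m, e = bfp_quantize(weights)
print("max abs error:", np.max(np.abs(weights - bfp_dequantize(m, e))))
```

Because every value in a block shares the exponent, a dot product over the block reduces to integer multiply-accumulates plus a single exponent adjustment, which is what makes formats of this kind attractive for inference hardware.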
In summary, d-Matrix is pioneering AI inference technology with their Corsair platform, which delivers high-performance inference while addressing the energy and cost challenges of deploying AI at scale, supported by system-vendor partnerships and backing from leading venture capital investors.