Cerebras vs d-Matrix
Side-by-side comparison
Overall Winner: Cerebras (Score: 79)
Cerebras — founded by Andrew Feldman (USA) — Awaira Score: 79
d-Matrix — founded by Sid Sheth (USA) — Awaira Score: 68
| Metric | Cerebras | d-Matrix |
|---|---|---|
| Valuation | $4B (winner) | $2B |
| Total Funding | $720M (winner) | $450M |
| Founded | 2016 | 2019 (winner) |
| Stage | Series F | Series C |
| Employees | 400 | 150 |
| Country | USA | USA |
| Category | AI Infrastructure | AI Infrastructure |
| Awaira Score | 79 (winner) | 68 |
Frequently Asked Questions
Is Cerebras bigger than d-Matrix?
Yes, Cerebras has a higher valuation ($4B) compared to d-Matrix ($2B).
Which company raised more funding, Cerebras or d-Matrix?
Cerebras raised $720M while d-Matrix raised $450M.
Which company has a higher Awaira Score?
Cerebras has the higher Awaira Score: 79 versus d-Matrix's 68.
What does Cerebras do vs d-Matrix?
Cerebras: Cerebras Systems designs and manufactures specialized processors for artificial intelligence and machine learning. Founded in 2016, the company develops custom silicon optimized for training and inference of large language models and deep learning workloads. Its flagship product, the Cerebras Wafer Scale Engine (WSE), is one of the largest computer chips ever built, integrating more than a trillion transistors on a single wafer to deliver high compute density and memory bandwidth for AI workloads. The WSE architecture prioritizes parallel processing and reduced latency for neural network training, distinguishing it from the traditional GPU-based approach of competitors such as NVIDIA. Cerebras addresses the infrastructure layer of AI computing, targeting cloud service providers and enterprises training large-scale models. The company has secured $720 million in total funding and holds a $4.0 billion valuation as of its Series F round, indicating strong investor confidence in custom AI chip development. In short, Cerebras builds wafer-scale processors engineered for large language model training, offering an alternative architecture to GPU-based AI computing infrastructure, though adoption remains limited compared to established GPU manufacturers.
d-Matrix: d-Matrix is an AI infrastructure company founded in 2019 that develops hardware and software optimized for generative AI and large language model inference. The company designs specialized processors and system architectures to improve the efficiency and cost-effectiveness of deploying AI models at scale, focusing on reducing latency and power consumption, two key bottlenecks in data center operations.
d-Matrix has raised $450 million across funding rounds and reached a $2.0 billion valuation as of its Series C stage. It competes in the AI infrastructure market alongside other companies developing custom silicon and inference acceleration platforms, targeting enterprises, cloud providers, and data center operators that need to run large-scale inference workloads economically while maintaining performance. In short, d-Matrix specializes in inference-optimization hardware designed to reduce the computational and energy costs of deploying large language models at scale.
Which company was founded first?
Cerebras was founded first in 2016. d-Matrix was founded in 2019.