
vLLM vs CoreWeave

Side-by-side on valuation, funding, investors, founders & more

Comparison updated: April 2026

CoreWeave is valued at $49B; vLLM, as an open-source project, has no disclosed valuation to compare.

Head-to-Head Verdict

CoreWeave leads on 3 of 3 metrics

vLLM

0 wins

Trails on: Awaira Score, Team Size, Experience

CoreWeave

3 wins

Leads on: Awaira Score, Team Size, Experience

Key Numbers

Metric | vLLM | CoreWeave
Valuation | N/A | $49B
Total Funding | N/A | $2.4B
Awaira Score | 45/100 | 95/100
Employees | 1-50 | 1800
Founded | 2023 | 2017
Stage | Bootstrapped | Public
vLLM

🇺🇸 United States · Woosuk Kwon

Bootstrapped · AI Infrastructure · Est. 2023

Valuation

N/A

Total Funding

N/A

Awaira Score: 45/100

1-50 employees

Full vLLM Profile →
Winner
CoreWeave

🇺🇸 United States · Michael Intrator

Public · AI Infrastructure · Est. 2017

Valuation

$49B

Total Funding

$2.4B

Awaira Score: 95/100

1800 employees

Full CoreWeave Profile →
Market Context

vLLM and CoreWeave are both AI Infrastructure players based in the United States, making this a direct domestic rivalry. The stage gap, with vLLM at Bootstrapped and CoreWeave at Public, shapes how each allocates capital and talent.


Analyst Summary

Built from real data · Updated April 2026

Companies

vLLM and CoreWeave both operate in AI Infrastructure, though their strategies diverge significantly. vLLM is an open-source, high-throughput, memory-efficient inference and serving engine for large language models, developed initially at UC Berkeley and widely adopted in production AI deployments. CoreWeave is a specialized AI infrastructure provider founded in 2017 that has become a major player in GPU cloud computing.

Funding & Valuation

CoreWeave carries a disclosed valuation of $49B, while vLLM, as an open-source project, has no disclosed valuation. CoreWeave has raised $2.4B in disclosed funding.

Growth Stage

vLLM is the younger company by 6 years, having launched in 2023 compared to CoreWeave's 2017 founding. Stage-wise, vLLM is classified as Bootstrapped and CoreWeave as Public, reflecting divergent fundraising histories. Team sizes also differ: vLLM employs 1-50 people versus CoreWeave's 1800.

Geography & Outlook

Both headquartered in the United States, vLLM and CoreWeave draw from the same local ecosystem of talent and capital. On Awaira's 0-100 scale, CoreWeave leads decisively at 95 compared to vLLM's 45. vLLM, led by Woosuk Kwon, and CoreWeave, led by Michael Intrator, each bring distinct leadership visions to the AI sector.

Funding Velocity

vLLM

Total Rounds: 1
Avg. Round Size: $1.7M

CoreWeave

Total Rounds: 5
Avg. Round Size: $448.2M
Funding Span: 4.1 yrs

Funding History

vLLM has completed 1 funding round, while CoreWeave has gone through 5. vLLM's most recent round was a Seed of $1.7M, compared to CoreWeave's IPO ($1.5B). vLLM is at Bootstrapped while CoreWeave is at Public — different points in their growth trajectory.

Team & Scale

CoreWeave has the bigger team at roughly 1800 people, at least 36 times the size of vLLM's 1-50. CoreWeave also has a 6-year head start, founded in 2017 vs vLLM's 2023. Both are based in the United States.

Metrics Comparison

Metric | vLLM | CoreWeave
💰 Valuation | N/A | $49B
📈 Total Funding | N/A | $2.4B
📅 Founded | 2023 | 2017
🚀 Stage | Bootstrapped | Public
👥 Employees | 1-50 | 1800
🌍 Country | United States | United States
🏷️ Category | AI Infrastructure | AI Infrastructure
Awaira Score | 45 | 95

Key Differences

📅 Market experience: CoreWeave has 6 more years (founded 2017 vs 2023)

🚀 Growth stage: vLLM is Bootstrapped vs CoreWeave at Public

👥 Team size: vLLM has 1-50 employees vs CoreWeave's 1800

⚔️ Direct competitors: both operate in the AI Infrastructure market segment

Awaira Score: CoreWeave scores 95/100 vs vLLM's 45/100

Which Should You Choose?

Use these signals to make the right call


Choose vLLM if…

  • You need an open-source, high-throughput, memory-efficient inference and serving engine for large language models, proven in production AI deployments

Choose CoreWeave if…

Top Pick
  • Higher Awaira Score — 95/100 vs 45/100
  • More established by valuation ($49B)
  • Stronger investor backing — raised $2.4B
  • More market experience — founded in 2017
  • You need a specialized AI infrastructure provider with large-scale GPU cloud capacity

Funding History

vLLM has one disclosed round (a $1.7M seed); CoreWeave raised $2.4B across 5 rounds.

vLLM

Seed

Jan 2023

$1.7M

CoreWeave

IPO

Mar 2025

$1.5B

Series B

Apr 2023

Lead: Sapphire Ventures

$221M

Series D

Jan 2023

Lead: Sapphire Ventures

$300M

Series C

Jun 2022

Lead: Sapphire Ventures

$200M

Series A

Mar 2021

Lead: Bessemer Venture Partners

$20M

Investor Comparison

No shared investors detected between these two companies.

Unique to CoreWeave

Sapphire Ventures · Bessemer Venture Partners · Zetta Venture Partners · Benchmark · Goldman Sachs


FAQ — vLLM vs CoreWeave

Is vLLM bigger than CoreWeave?
CoreWeave has a disclosed valuation of $49B and employs about 1800 people. vLLM, an open-source project, has no disclosed valuation, making a direct size comparison difficult.
Which company raised more funding — vLLM or CoreWeave?
CoreWeave has raised $2.4B in disclosed funding across 5 known rounds. vLLM's only disclosed round is a $1.7M seed.
Which company has a higher Awaira Score?
CoreWeave leads with an Awaira Score of 95/100, while vLLM sits at 45/100. That 50-point gap reflects real differences in funding, scale, and traction — it's not a vanity metric.
Who founded vLLM vs CoreWeave?
vLLM was founded by Woosuk Kwon in 2023. CoreWeave was founded by Michael Intrator in 2017. Visit each company's profile on Awaira for a full founder biography.
What does vLLM do vs CoreWeave?
vLLM: vLLM is an open-source, high-throughput, memory-efficient inference and serving engine for large language models, developed initially at UC Berkeley and widely adopted in production AI deployments. The project introduced PagedAttention, a memory management technique that significantly increases GPU utilization during LLM inference by managing key-value cache memory analogously to how operating systems manage virtual memory pages.

The engine is used in production by AI infrastructure teams at major technology companies, AI labs, and cloud providers who need to maximize the number of concurrent LLM requests served per GPU. vLLM benchmarks consistently demonstrate throughput improvements of 10 to 20 times over naive inference implementations, translating directly into lower cost per inference query at scale. The project is maintained by a community of contributors from both academia and industry.

High-throughput LLM serving infrastructure is foundational to the economics of AI deployment. As inference costs represent an increasing share of AI operating budgets, the performance characteristics of the serving engine directly determine the financial viability of AI-powered products. vLLM's dominant position in open-source LLM serving gives it deep adoption among infrastructure engineers and makes it a reference implementation against which commercial serving solutions are measured.

CoreWeave: CoreWeave is a specialized AI infrastructure provider founded in 2017 that has become a major player in GPU cloud computing. The company operates a global network of data centers optimized for artificial intelligence and machine learning workloads, offering on-demand access to high-performance GPUs and compute resources. CoreWeave's platform enables enterprises and AI developers to train large language models, run inference workloads, and deploy machine learning applications without building proprietary infrastructure.

The company serves organizations across industries including enterprise AI, research institutions, and cloud-native startups requiring flexible, scalable compute capacity. CoreWeave distinguishes itself through customized infrastructure solutions tailored to GPU-intensive applications, offering various processor configurations from NVIDIA and AMD architectures. The company went public in 2025 and currently carries a valuation of $49 billion with total funding of $2.4 billion, reflecting substantial investor confidence in AI infrastructure demand. CoreWeave competes directly with hyperscalers like AWS, Google Cloud, and Microsoft Azure in the GPU compute space, alongside specialized competitors such as Lambda Labs and Crusoe Energy. The company's growth trajectory reflects accelerating demand for accessible GPU computing as organizations scale their AI capabilities. Its business model capitalizes on the infrastructure bottleneck in AI deployment, positioning it as a critical enabler of AI adoption across enterprise sectors. CoreWeave's public status and $49B valuation reflect recognition of GPU infrastructure as fundamental to AI scaling, distinct from traditional cloud computing markets.
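The virtual-memory analogy behind PagedAttention can be sketched in plain Python. This is a toy illustration of the paging idea only, not vLLM's actual implementation: the KV cache is carved into fixed-size physical blocks, each sequence keeps a "block table" mapping its logical positions to physical blocks (like an OS page table), and blocks are allocated on demand instead of reserved up front. All names here (`PagedKVCache`, `BLOCK_SIZE`) are invented for the sketch.

```python
BLOCK_SIZE = 16  # tokens per physical cache block (toy value)

class PagedKVCache:
    """Toy paged KV-cache allocator, illustrating the idea only."""

    def __init__(self, num_blocks: int):
        self.free_blocks = list(range(num_blocks))  # pool of physical block ids
        self.block_tables = {}  # seq_id -> list of physical block ids
        self.seq_lens = {}      # seq_id -> tokens written so far

    def append_token(self, seq_id: int) -> int:
        """Reserve cache space for one new token; returns the physical block id."""
        table = self.block_tables.setdefault(seq_id, [])
        length = self.seq_lens.get(seq_id, 0)
        if length % BLOCK_SIZE == 0:  # current block full (or first token)
            if not self.free_blocks:
                raise MemoryError("KV cache exhausted; a sequence must be preempted")
            table.append(self.free_blocks.pop())  # allocate a block on demand
        self.seq_lens[seq_id] = length + 1
        return table[-1]

    def free_sequence(self, seq_id: int) -> None:
        """Return a finished sequence's blocks to the pool for reuse."""
        self.free_blocks.extend(self.block_tables.pop(seq_id, []))
        self.seq_lens.pop(seq_id, None)

cache = PagedKVCache(num_blocks=8)
for _ in range(20):                 # 20 tokens need ceil(20/16) = 2 blocks
    cache.append_token(seq_id=0)
print(len(cache.block_tables[0]))   # 2
cache.free_sequence(0)
print(len(cache.free_blocks))       # 8
```

Because a sequence only holds the blocks it has actually filled, memory that a naive allocator would reserve for the maximum possible sequence length stays free for other concurrent requests, which is where the throughput gains described above come from.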
Which company was founded first?
CoreWeave got there first, launching in 2017, six years before vLLM arrived in 2023. In AI infrastructure, that kind of head start means more deployed capacity, deeper customer relationships, and a bigger talent moat.
Which company has more employees?
vLLM has about 1-50 employees; CoreWeave has about 1800. A bigger team usually means more revenue or heavier VC backing, but in AI, small teams can build at massive scale.
Are vLLM and CoreWeave competitors?
Yes — they're direct rivals. Both vLLM and CoreWeave compete in AI Infrastructure, targeting many of the same buyers. If you're evaluating one, you should be looking at the other.

Bottom Line

CoreWeave has a clear lead here — Awaira Score of 95 vs vLLM's 45. The difference comes down to funding depth and team scale.

Who Should You Watch?

CoreWeave is in the stronger position — better score and deeper pockets. But vLLM has room to surprise, especially if they land a marquee investor. Follow both profiles on Awaira to track funding rounds, team changes, and score updates.
