I got a comment privately that I want to post:
>> This contrasts the inflation-adjusted $37B spent on the Manhattan project around 1941 to the $400B+ AI bet in 2025, implying it's 10x bigger than the Manhattan project.
> However, if measured as a % of GDP, it's similar (roughly 1-2%). Which is still notable, but 1x vs 10x is an important difference. I think this is important to point out given significant population and TFP growth since 1941.
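A rough back-of-envelope behind the "roughly 1-2%" figure, using approximate numbers (Manhattan Project total spend of ~$2B nominal, US GDP of ~$225B in 1944 and ~$29T in 2025; all figures are estimates, not from the essay):

```python
# Back-of-envelope: each project's spend as a share of one year's US GDP.
# All inputs are approximate, in billions of nominal dollars.
manhattan_nominal_b = 2        # ~$2B total Manhattan Project spend, 1942-1946
gdp_1944_b = 225               # US GDP ~$225B in 1944
ai_spend_2025_b = 400          # ~$400B+ AI capex bet in 2025
gdp_2025_b = 29_000            # US GDP ~$29T in 2025

manhattan_share = manhattan_nominal_b / gdp_1944_b   # ~0.9%
ai_share = ai_spend_2025_b / gdp_2025_b              # ~1.4%
print(f"Manhattan: {manhattan_share:.1%}, AI 2025: {ai_share:.1%}")
```

Both land in the same ~1% ballpark, which is the point of the comment: the 10x gap from the inflation-adjusted comparison mostly disappears once you normalize by the size of the economy.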
I got a comment privately that I want to post:
> I would point out that Nvidia is distinct from the old telcos in an important way, which is that it has 75%+ margins, so if it keeps the fraction of its sales in this kind of deal low it's not really an existential risk for them.
> Also "Altman proposes soon somehow speeding up this process 100x" regarding data centers taking 2 years seems a little misleading - he's not talking about building one from scratch in a week, presumably, but rather starting 50 projects a year such that one is finishing every week I guess
Very interesting. The bottleneck is the power generation itself, as you spoke about. I’m sure your prediction is correct that gigawatts-per-facility will increase, but the amount of available energy doesn’t add up and neither does the projected energy. The new builds (nuclear and otherwise) are on a much longer timeline.
Yeah. There are a lot of plans to get the requisite energy, none with a guaranteed chance of success. It’s hard to project.
Not enough to fulfill all of the promises, in all likelihood. Timelines will have to be pushed back.
The power transformers have a 24-month+ lead time for GW scale generation plants, let alone the labor and other plant equipment.
>NVIDIA competitors catch up
The irony is that if AGI *does* arrive on schedule, NVIDIA competitors are quite likely to catch up.
NVIDIA's moat is in software, and automating software engineers is a major focus of the industry. Not only that, but the software NVIDIA writes is the sort which is easier to automate: it has to pass a big test suite with high performance. There's little UI or human factors component. It's a fairly objective evaluation function, akin to the sort of work the AI industry is currently doing around RL with verifiable rewards.
I'm just a spectator here. I don't have deep knowledge of this industry. But from the outside, NVIDIA looks like a cartoon character with a chainsaw, furiously destroying the branch which supports their weight.
Short NVIDIA could be the trade of the century, you just need to time it right. I'm envisioning a scenario like the following: Google makes use of its existing work on verifiable rewards and autoformalization to design a provably correct, superfast TPU using AI. They don't need to replicate the breadth of customers NVIDIA serves. They initially focus on serving their internal AI needs, and then start reaching out to a few other major AI players, massively undercutting NVIDIA prices to steal market share. They take the 80/20 approach of targeting the 20% of configurations which power 80% of AI servers, instead of serving a big variety of configurations like NVIDIA. As soon as Google produces credible evidence that it can succeed with this strategy, you'd be a fool to hold on to your NVIDIA shares. They've got a long way to fall.
From an AI safety perspective, the above scenario seems mildly positive. Race dynamics decrease if Google achieves a commanding lead / leverage over other AI players. And decreasing profits in the industry means that AI leaders are more likely to be in it for humanity rather than getting rich quick.
Are you using a different definition of market cap here (to imply something about liquid assets)? Nvidia’s market cap is $4.3 trillion, meaning $100 billion is only ~3%, not 35%.
No, I'm just bad at math. Fixed!
which calls into question other numbers in the piece. Just say'n.
Totally fair for that to be called into question given this error, which I regret.
I get the same results as you.
If you're going to compare a contract with market cap, you need to multiply the expected annual earnings from that contract by a reasonable P/E multiple of ~15-20.
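To illustrate the point with made-up numbers (the contract size, margin, and multiple below are hypothetical, not figures from the thread):

```python
# Hypothetical illustration: a contract's fair comparison point against market
# cap is (annual earnings from the contract) x (P/E multiple), not the raw
# contract value. All inputs are invented for the example.
annual_contract_revenue_b = 20   # hypothetical: $20B/yr of revenue from the deal
net_margin = 0.5                 # hypothetical net margin on that revenue
pe_multiple = 17.5               # midpoint of the ~15-20 range above

annual_earnings_b = annual_contract_revenue_b * net_margin
implied_market_cap_b = annual_earnings_b * pe_multiple
print(f"Implied market-cap contribution: ~${implied_market_cap_b:.0f}B")
```

Under these assumptions the deal "explains" ~$175B of market cap, i.e. ~4% of a $4.3T valuation, which is the quantity to compare, rather than the headline contract size.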
> This likely will be a part of 45-60GW of total compute across Meta, Microsoft, Amazon/AWS/Anthropic, OpenAI/Oracle, Google/DeepMind, and xAI
It's worth clarifying whether this includes non-AI compute or not. Given your forecast for 2027 (90GW in AI compute), my understanding is that this is 45-60GW in AI compute for mid-2026. This seems too aggressive to me.
Josh's report with EPRI (https://www.epri.com/research/products/000000003002033669), which I agree with, suggests closer to 10GW of worldwide AI power currently (cf. page 19), 20-30 GW in 2026, and around 40GW in 2027 (cf. Figure 9, doubling to go from US to world).
Aside from the 4 methods in the EPRI report, I know of the following predictions of AI power demand:
- 5 sources cited in https://www.offgridai.us/ (including Situational Awareness)
- RAND https://www.rand.org/pubs/research_reports/RRA3572-1.html which is also particularly aggressive, 327 GW by 2030 (and starts at 11GW in 2024)
- Anthropic's summary page 7 here https://www-cdn.anthropic.com/0dc382a2086f6a054eeb17e8a531bd9625b8e6e5.pdf
- SemiAnalysis here https://files.nitrd.gov/90-fr-9088/SemiAnalysis-AI-RFI-2025.pdf
Nobody is talking about *water* here, either. Or, rather, the need to keep these behemoths *cool* when they constantly run *hot*…
https://www.technocracy.news/
Some of the newer AI data centers are already using nuclear power to make them viable.
The discussion about where tens to hundreds of GW of new generation will come from on a short time scale feels a lot like an "Underpants Gnome" problem.
There is no possible way to build this kind of generation in this country within the next few years. Physically impossible. The lead time for GW-sized transformers alone is tens of months, to say nothing of all the other equipment and labor to put it all together.
That's not physical impossibility, it's social impossibility conditional on the bulk of regulatory/red tape delays remaining in place (and no upsets due to new technology or similar). Which does seem reasonable to expect, but still
Thanks for the input. My comment wasn't very clear or comprehensive. It was a mistake for me to use the word "impossible". I was merely using these transformers to illustrate a larger point: building out the infrastructure is the main physical constraint, and that constraint is not addressed in the essay, hence the reference to "underpants gnomes", who also failed to address the main constraint in their plan. Thanks for the engagement!
AGI will make these concerns trivial
So we're back to needing AI 2027 which is an acknowledged overly-bullish estimate on likely capabilities progress?
Can AGI conjure 24-month lead time transformers overnight?
If they wish
One of Oracle's 5 centers is in my town...
If the whole thing does unwind and the Fed responds like the following, are there any other risks you see?
https://x.com/esyudkowsky/status/1971044125790421255?s=46
Wow. I totally agree.