🧵 Thread (25 tweets)

i've been going on and on about this, but basically, AI flips electricity on its head. previously, nearly half the cost of electricity to the consumer was allocated to distributing it where (and when) you want it. AI data centers want upwards of hundreds of megawatts of power in one place, most of the time (and for training they don't care about latency).

The most constrained thing isn't even the energy itself -- there's a surplus during the solar peak -- it's often the sheer availability & the interconnection. It can now take multiple years to get a grid interconnect to draw power of this magnitude, if the location can even handle it. The capital cost of the power infrastructure is just a tiny fraction (like 3%!) of the capex of the compute, and the depreciation of the compute alone exceeds the cost of even premium power.

Hence, AI hyperscalers and those that aspire to be in their class are traveling to where the power is, building where power has been (and there's legacy transmission to support it, like old nuclear plants), and getting into the business of actually building powerplants and reactors. Utilities, transmission and distribution companies, interconnection queues -- all are used to reacting much slower, over many years, unlike the top technology companies now vying to compete at the highest levels of AI performance.

Since ~99% of energy technologies previously withered and died while waiting for utilities to consider them bankable, this represents an extremely fertile, attractive new state of play if you're bringing a new energy technology to market (🙋🏻‍♂️). Whereas before you'd have to brave what was once called the "green valley of death", now you have teracap companies like microsoft bidding on:
- Conventional, large nuclear fission
- Small modular nuclear reactors
- Engineered Geothermal
- Fusion!
- Stirling Engines 😬
- Who knows what else
- Maybe you, Anon!

Even Tesla is standing up gas generator arrays to burn fossil fuels for their AI power supply. Shit is getting real
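
napkin math on those power-vs-compute claims -- every number below is my own illustrative assumption, not a sourced figure, just to show the shape of the economics:

```python
# Sketch of the power-vs-compute economics above. All figures are
# illustrative assumptions: ~$40,000/kW of compute capex, ~$1,200/kW
# for the supporting power infrastructure, "premium" power at
# $0.10/kWh, and a ~3-year accelerated GPU depreciation schedule.

compute_capex_per_kw = 40_000   # $/kW of IT load (assumption)
power_capex_per_kw = 1_200      # $/kW of power infrastructure (assumption)
premium_power_price = 0.10      # $/kWh (assumption)
utilization = 0.90              # fraction of hours at full load (assumption)
depreciation_years = 3          # GPU depreciation schedule (assumption)

power_share = power_capex_per_kw / compute_capex_per_kw
annual_depreciation = compute_capex_per_kw / depreciation_years
annual_power_cost = premium_power_price * utilization * 8760  # $/kW-yr

print(f"power infra share of capex: {power_share:.1%}")              # ~3%
print(f"compute depreciation:       ${annual_depreciation:,.0f}/kW-yr")
print(f"premium power cost:         ${annual_power_cost:,.0f}/kW-yr")

# depreciation (~$13,000/kW-yr) dwarfs even premium power (~$790/kW-yr),
# so hyperscalers optimize for speed-to-power, not cheap power.
```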

the challenge is getting enough power in a single place. but we're not so far away from having distributed inference, and @NousResearch is making progress in dramatically (800x!) reducing required training bandwidth, so your vision may reveal itself. 4090s in ~TinyBoxes may become the cogen people have dreamed of https://t.co/r4zrV5gUz3
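
rough sketch of what an ~800x bandwidth cut buys you -- the model size, precision, and step rate below are my own illustrative assumptions, not @NousResearch's actual setup:

```python
# Why an ~800x reduction in training bandwidth changes what hardware
# can participate. All numbers are illustrative assumptions.

params = 70e9            # 70B-parameter model (assumption)
bytes_per_grad = 2       # bf16 gradients (assumption)
steps_per_sec = 0.1      # one optimizer step every 10 s (assumption)
reduction = 800          # claimed bandwidth reduction factor

naive_sync = params * bytes_per_grad * steps_per_sec  # bytes/s per peer
reduced_sync = naive_sync / reduction

print(f"naive gradient sync: {naive_sync / 1e9:.1f} GB/s")          # ~14 GB/s
print(f"after 800x cut:      {reduced_sync * 8 / 1e9:.2f} Gbit/s")  # ~0.14

# ~14 GB/s needs datacenter interconnect; ~140 Mbit/s fits on home
# fiber -- which is what makes the "4090s as cogen" picture plausible.
```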

good document on nukes by @creon https://t.co/oCq4DodltH (agree with most of it, though I think our work at @lightcellenergy will change the underlying storage assumptions)

FYI, the Federal Energy Regulatory Commission is holding an open session on the co-location of large loads at generating facilities, such as AI DATA CENTER POWERPLANTS 🙋🏻‍♂️⚡️🤖, on November 1st, 7am - 12pm PDT. open to the public.

there are many projects where utilities want to put loads, such as data centers, close to powerplants. The US electrical grid is among the most constrained resources, and the time for an upgrade is like 7-10 years -- much slower than the 1-2 year AI investment cycle.

financially, I predict the outcome will be that co-located loads like AI data centers have to pay somewhere between 0-100% of the financial cost of T&D infrastructure, despite imposing almost none of the burden, because (imo) of the collection of legacy rents. But what co-located loads will be able to avoid is the cost of time. Merely the financial carrying cost of depreciating GPUs makes waiting unacceptable -- at $40,000/kW and 5%, the interest alone would be $2,000/kW per year, which would more than pay for a combined cycle plant. At ~75% depreciation over 3 years, GPU depreciation ($10,000/kW per year) pays for a new-build nuclear plant at $10,000/kW (already imo outrageous, but a learning curve has to restart somewhere) within one year! https://t.co/UaeLqyg5cu
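
working through the tweet's arithmetic (the combined cycle capex figure is my assumption; the rest are the tweet's own numbers):

```python
# GPU carrying cost vs. powerplant capex, per kW.

compute_capex = 40_000              # $/kW (from the tweet)
interest_rate = 0.05                # (from the tweet)
dep_fraction, dep_years = 0.75, 3   # ~75% over 3 years (from the tweet)

ccgt_capex = 1_200                  # $/kW combined cycle gas (my assumption)
nuclear_capex = 10_000              # $/kW new-build nuclear (from the tweet)

annual_interest = compute_capex * interest_rate                  # $2,000/kW-yr
annual_depreciation = compute_capex * dep_fraction / dep_years   # $10,000/kW-yr

print(f"interest alone: ${annual_interest:,.0f}/kW-yr "
      f"= {annual_interest / ccgt_capex:.1f}x a CCGT plant's capex per year")
print(f"depreciation:   ${annual_depreciation:,.0f}/kW-yr "
      f"= {annual_depreciation / nuclear_capex:.1f}x a nuclear plant's capex per year")

# the cost of waiting for the grid, not the cost of power, dominates:
# one year of GPU carry pays for the entire plant.
```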

