Are nukes the solution to the data center problem?
Or are data centers the solution to the nuclear reactor infeasibility problem?
🤖 Data Center Watch 👾
“America’s Data Centers Could Go Dark,” the subject line of the email read.
If only, I mused. I’m less worried about data centers going dark than about everything else going dark because of data centers. But whatever. That’s not what the PR person (or AI bot?) who sent the email was trying to say. They were there to ask, rhetorically: “Can Microreactors Save the Day?” They offered to connect me with James Walker, CEO of a firm called NANO Nuclear Energy, who would try to sell me on his KRONOS MMR™, described as a “compact, carbon free” way to power data centers.
There is a lot of hysteria around data centers these days. Folks like me are worried about how much energy and water they use, and the effect that might have on the grid, the climate, scarce water supplies, and other utility customers. Others are panicking over the possibility that the U.S. might fall behind in the AI race — though I have no idea what winning the race would entail or look like.
And, in our capitalistic system, where there is fear, there are myriad solutions, most of which entail building or making or consuming more of something rather than just, well, you know, turning off the damned data centers. The Trump administration would solve the problem by subsidizing more coal-burning, while the petroleum industry is offering up its surplus natural gas. Tech firms are buying up all the power from new solar arrays and geothermal facilities, long before they’re even built.
Perhaps the most hype, and the loftiest promises of salvation, however, involve nuclear power and a new generation of reactors that are smaller, portable, require less up-front capital, and supposedly not weighed down with all of the baggage of the old-school conventional reactors, which not only cost a lot to build, but also tend to evoke visions of Chernobyl, Three Mile Island, or Fukushima.
Yet for all the buzz — which may be loudest in the Western U.S. — it’s far from certain that this so-called nuclear renaissance will ever come to fruition. The latest generation of reactors may go by slick, newfangled names, but they are still expensive, require dangerous and damaging mining to extract uranium for fuel, produce waste, are potentially dangerous — and are still largely unproven.

Several years ago I visited Experimental Breeder Reactor I, located west of Idaho Falls. It has been defunct since 1963 and is now a museum, and a sort of time capsule taking one back to heady times when atomic energy promised to help feed the exploding, electricity-hungry population of the post-war Western U.S. and its growing number of electric gadgets (remember electric can openers?).
The retro-futuristic facility is decked out with control panels and knobs and valves and other apparatus that possess the characteristic sleek chunkiness of mid-century high-tech design. A temperature gauge for the “rod farm” goes up to 500 degrees centigrade, and if you look closely you’ll see a red button labeled “SCRAM” that, if pushed, would have plunged the control rods into the reactor, thereby “poisoning” the reaction and shutting it down. If you have to push it, you’d best scram on out of there.
I couldn’t help but get caught up in the marvels of the technology. On a cold December day in 1951, scientists here had blasted a neutron into a uranium-235 atom and shattered it, releasing energy and yet more neutrons that split other uranium atoms, setting off the same kind of frenetically energetic chain reaction that, uncontrolled, annihilated Hiroshima and Nagasaki several years earlier. Mass is converted into energy. Only this time the energy was harnessed not to blow up cities, but to create steam that turned a turbine that generated electricity that illuminated a string of lightbulbs and then powered the entire facility — all without burning fossil fuels or building dams.
This particular reactor was known as a “breeder” because its fuel reproduces itself, in a way. During the reaction, loose neutrons are “captured” by uranium-238 atoms, turning them into plutonium-239, which is fissile, meaning it can be used as fuel for future reactions.

At first glance it seems like the answer to the world’s energy problems, and two years after EBR-I lit up, Dwight D. Eisenhower delivered his 1953 “Atoms for Peace” speech. Nuclear energy would help redeem the world from the terrible scourge of atomic weapons, the president said; it would be used to “serve the needs rather than the fears of the world — to make the deserts flourish, to warm the cold, to feed the hungry, to alleviate the misery of the world.”*
Now, with Arizona utilities teaming up to develop and build new reactors; with Wyoming’s, Idaho’s, and Utah’s governors collaborating on their nuclear-powered “Energy Superabundance” effort; and with Oklo looking to build a modern version of EBR-I not far from the original, it’s beginning to feel like 1953 all over again. Only now the nuclear reaction promises to serve the needs of cyberspace rather than the real world — to make AI do your homework, to cool the server banks, to feed the Instagram feeds, to send out those TikToks at twice the speed.
Seven decades later, Eisenhower’s hopes have yet to be fulfilled.
It turns out a lot of people aren’t comfortable with the idea of nuclear reactions taking place down the road, regardless of how many safety backstops are in place to avoid a catastrophic meltdown a la Chernobyl. Nuke plants cost a lot of money and take forever to build. They need water for steam generation and for cooling, which can be a problem in water-constrained places and even in water-abundant ones: The Diablo Canyon nuke plant sucks up about 2.5 billion gallons of ocean water each day to cool its reactors, before spitting it — 20 degrees warmer — back into the Pacific. That kills an estimated 5,000 adult fish each year, along with an additional 1.5 billion fish eggs and fry, and messes up water temperatures and the marine ecosystem. And while nukes are good at producing baseload power (meaning steady, 24/7 generation), they aren’t very flexible, meaning they can’t be ramped up or down to accommodate fluctuating demand or variable power sources like wind and solar.
And then there’s the waste. The nuclear reaction itself may seem almost miraculous in its power, simplicity, and even purity.
But the steps required to create the reaction, along with the aftermath, are hardly magical. Fueling a single reactor requires extracting hundreds of thousands of tons of ore from the earth, milling the ore to produce yellowcake (triuranium octoxide), converting the yellowcake to uranium hexafluoride gas, enriching it to concentrate the uranium-235, and fabricating the fuel pellets and rods.
Each step generates ample volumes of toxic waste products. Mining leaves behind lightly radioactive waste rock; milling produces mill tailings containing radium, thorium, radon, lead, arsenic, and other nasty stuff; and enrichment and fabrication both produce liquid and solid waste. It has been about 40 years since the Cold War uranium boom busted, and yet the abandoned mines and mills are still contaminating areas and still being cleaned up — if you can ever truly clean up this sort of pollution.
Yet the reaction itself generates the most dangerous form of leftovers, containing radioactive fission products such as iodine, strontium, and cesium, along with transuranic elements including plutonium. This “spent nuclear fuel,” or radioactive waste, is removed from the reactor during refueling and, for now, is typically stored on site. Efforts to create a national repository for these nasty leftovers have failed, usually because the sites aren’t deemed safe enough to contain the waste for a couple hundred thousand years, or because locals don’t want it in their back yard. If it were to fall into the wrong hands, it could be used in a “dirty bomb,” a conventional explosive that scatters radioactive material around an area.
Plus, breeder reactors, especially, produce plutonium, which can then be used in nuclear warheads (India used U.S.-supported breeder technology to acquire nuclear weapons). That’s one of the reasons folks soured on the technology and the U.S. ended its federal plutonium breeder reactor development program in the 1980s. The other reasons were high costs and sodium coolant leaks (and resulting fires). After EBR-I was shut down in 1963 because it was outdated, the Idaho National Laboratory built EBR-II nearby. It was shut down and decommissioned in 1994.
Nevertheless, Oklo — one of the rising new-nuke stars — is touting its use of technology similar to EBR-II’s, i.e. a liquid-metal-cooled, metal-fueled fast reactor, as a selling point for the reactor it is currently developing at the INL.
The envisioned new fleet of reactors goes by many names: SMRs, or small modular reactors, as well as advanced, fast, micro-, and nano-reactors. Most of them can be fabricated in a factory, then trucked to or assembled on-site. Some are small enough to fit in a truck. They can be used alone to power a microgrid or a data center, or clustered to create a utility-scale operation that feeds the grid.
Their main selling point is that they require less up-front capital than a conventional reactor: You can build and install one of these things for a fraction of the cost and in a fraction of the time (once the reactors are actually licensed, developed, and produced on a commercial scale, which is still not the case).
A decade ago, companies like NuScale were also promoting them as ways to power the grid in a time of increasing constraints on carbon. Now that the feds are not only declaring climate change a “hoax,” but also forbidding agencies from even uttering the term, that no longer carries as much weight. Instead, almost every new proposal now is marketed as a “solution” to the data center “problem.” Google, Switch, Amazon, OpenAI, and Meta are all looking to power their facilities with nukes, if and when they are finally up and running.
The new technology is not monolithic: Some designs are cooled in different ways or use different types of fuel, but they all work on the same principle as old-school conventional reactors. As such, they also require the same fuel-production process, also have potential safety issues, and also create hazardous waste.
In fact, a 2022 Stanford study found that small modular reactors could create more, and equally hazardous, waste than conventional reactors per unit of power generated. The authors wrote: “Results reveal that water-, molten salt–, and sodium-cooled SMR designs will increase the volume of nuclear waste in need of management and disposal by factors of 2 to 30 [compared to an 1,100 MW pressurized water reactor].”
The cost question isn’t all that clear-cut, either. The smaller reactors may be cheaper to build, but because they don’t take advantage of economies of scale, they are more expensive per unit of electricity generated than conventional reactors, and can still be cost-prohibitive.
In 2015, for example, Oregon-based NuScale proposed installing 12 of its 50-MW small modular reactors at the Idaho National Laboratory to provide 600 MW of capacity to the Utah Associated Municipal Power Systems, or UAMPS (which also includes a handful of non-Utah utilities). In 2018 — after receiving at least $288 million in federal subsidies — NuScale upped the planned capacity to 720 MW, saying it would lower operating costs.
But what started out as a $3 billion project in 2015 kept growing more expensive, so that even after the planned capacity was ramped down to 421 MW, the projected price tag had ballooned to $9.3 billion by 2023 (still about one-third of the cost of the new Vogtle plant in Georgia, but with a fraction of the generating capacity). UAMPS’s member utilities, realizing there were plenty of more cost-effective ways to keep their grids running, canceled the project later that year.
It kind of makes you wonder: Is this new wave of nuclear reactors solving the data center energy demand problem? Or are data centers’ energy-gobbling habits solving the nuclear reactors’ cost and feasibility problems?
I suspect it’s a little bit of both, with the balance swinging toward the latter. In that case, nuclear reactors are not alone: The Trump administration is using data center demand as the prime justification for propping up the dying coal industry.
Before the Big Data Center Buildup, utilities really had no need for expensive, waste-producing reactors — they could more cheaply and safely build solar and wind installations with battery storage systems for backup. If needed, they could supplement them with geothermal or natural gas-fired peaker plants.
But if data centers end up demanding as much power as projected (like 22,000 additional megawatts in Nevada alone), utilities will need to pull out all the stops and add generating capacity of all sorts as quickly as possible, or they’ll tell the data centers to generate their own power. Either scenario would likely make small nukes more attractive, even if they do cost too much, and even if it means that data centers end up being radioactive waste repositories, too.
Another plausible scenario is that the tech firms figure out ways to make their data centers more efficient; that it’s more cost-effective (and therefore profitable) to develop less energy- and water-intensive data-processing hardware than to spend billions on an experimental reactor that may not come online for years.
What a novel concept: To use less, rather than always hungering for more and more and more.