The Daily Grind is brought to you by:
Damn Gravity: Book for Builders and Entrepreneurs
Good morning!
Welcome to The Daily Grind for Thursday, August 14.
Today’s headline: The rising tensions between tech companies and energy providers.
Plus, a preview of Isabelle’s new book on nuclear electricity, Rad Future.
Let’s get into it!
Today, the New York Times published a feature on how AI is reshaping the U.S. power grid. It takes a closer look at the growing tensions between technology companies and the American utilities that power them.
Data centers are popping up across the landscape—with dozens more planned—and energy costs are already starting to rise for small businesses and individual consumers.
“Starting in June, the electricity bill for a typical household in Ohio increased at least $15 a month because of data centers, according to data from a major local utility and an independent monitor of the electric grid that stretches across 13 states and the District of Columbia.”
Large tech companies are spending billions to keep up with the booming demand for data centers to power AI. “The single biggest constraint,” said Amazon CEO Andy Jassy, “is power.”
Amazon and other tech giants are promising to pay their way for new energy infrastructure required to run their data centers. But the key question is, who pays for the upfront costs of new grid development?
Traditionally, utility companies would front these costs and would make money back on usage of that power in the future. But the speed at which new data centers are going up has changed the calculation. Utilities would have to shell out billions of dollars to build the needed infrastructure before data centers come online and start paying their bills.
Sometimes, data center projects are heavily delayed, or scrapped altogether.
Microsoft, for example, announced plans in October to build three data center campuses that would require power from the Ohio utility. “The Columbus region’s skilled work force, strong infrastructure and strategic location make it ideal for this project,” the company said then.
But six months later — before regulators ruled against the tech industry — Microsoft changed its data center strategy and said it was putting the Ohio projects on ice. For the foreseeable future, those sites would remain farmland.
Tech companies and utilities are also struggling to estimate how much power they will actually need in the future. Several factors could impact future energy usage:
Demand for AI — If we have entered AI’s trough of disillusionment, and some scientists now worry that AI has peaked, will the projected energy needs really materialize?
Efficiency of AI — The gold rush within the AI gold rush is building more energy-efficient hardware and software to power AI. If new chipmakers like Positron succeed in cutting the energy usage of inference by a third, we could see energy needs plateau instead of spike.
Technology companies could simply choose to change their plans for data centers, leaving local utilities in a lurch. “Tech companies say they plan to keep building data centers, but where those sites will be is uncertain. That puts utilities at risk of building more than their area needs,” said the New York Times.
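The tug-of-war between the factors above comes down to simple arithmetic. A back-of-envelope sketch (all numbers are hypothetical, chosen only to illustrate the dynamic) shows how yearly efficiency gains could cancel out yearly demand growth:

```python
# Hypothetical back-of-envelope: does total energy use plateau if
# inference keeps getting more efficient while demand keeps growing?
baseline_energy = 100.0   # arbitrary units of energy per year
demand_growth = 1.4       # assume demand grows 40% per year (hypothetical)
efficiency_gain = 1 / 3   # assume energy per query drops by a third each year (hypothetical)

energy = baseline_energy
for year in range(1, 4):
    # Demand multiplies energy use; efficiency divides it.
    energy = energy * demand_growth * (1 - efficiency_gain)
    print(f"Year {year}: {energy:.1f} units")
```

With these made-up numbers, 40% more demand times a one-third efficiency cut nets out to a slight decline each year, which is exactly the plateau scenario. If demand grows faster than efficiency improves, the curve spikes instead.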
The need for energy is also forcing utility companies to re-open fossil fuel plants, which could accelerate the impacts of climate change in the near future.
AI’s strain on our energy infrastructure could have positive effects, too. One, it is forcing utility companies to make much-needed upgrades to our energy system, which will serve generations of Americans.
Two, AI data centers have re-ignited the demand for nuclear power, one of our cleanest and most abundant energy sources.
The balance is tricky: supply energy to the greatest technology boom since the Internet, but do it in a sustainable way.
If we do it right, and invest in the right kind of energy, we could all soon live in a greener world.
WSJ: Meet the ‘Clippers’ Cashing In on Your Social-Media Feeds
Fast Company: Layoffs 2025: Companies have announced over 800,000 job cuts so far. There are 3 big reasons why
Financial Times: DeepSeek R2's launch delay is due to training issues on Huawei Ascend chips, prompting a switch to Nvidia chips for training and Huawei's for inference
Isabelle Boemeke, founder of Isodope, has just released a new book about the magic of nuclear electricity (her preferred term for nuclear energy).
Boemeke is not a nuclear scientist, policymaker, or energy expert—and she doesn’t pretend to be. She calls herself a “translator,” using her unique wit and style to explain and advocate for nuclear energy.
I love Boemeke’s illustrations, which visually explain how nuclear power works and its benefits. Also, I love the book cover. This will be a book that lives face-out on my shelf.
When Andy Jassy called power the “single biggest constraint” on AI, he was referring to the bottleneck.
But the term “bottleneck” is used so frequently that it’s lost its meaning. So let’s revisit.
In systems engineering theory, such as lean manufacturing, the bottleneck is the single point in a system—such as an assembly line—where production is being slowed down.
An assembly line always has a bottleneck—and only one bottleneck. There can only be one biggest point of delay or weakness in a system. The job of the systems engineer is to identify and eliminate the bottleneck. When one bottleneck is solved, they move on to the next one.
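The idea above can be sketched in a few lines of code (stage names and throughputs are hypothetical): the bottleneck is simply the stage with the lowest throughput, the whole line can never run faster than it, and fixing it just reveals the next one.

```python
# Toy assembly line: each stage's throughput in units per hour (hypothetical numbers).
stages = {
    "stamping": 120,
    "welding": 45,    # slowest stage: this is the bottleneck
    "painting": 90,
    "assembly": 60,
}

# The line's overall throughput equals that of its slowest stage.
bottleneck = min(stages, key=stages.get)
line_throughput = stages[bottleneck]
print(bottleneck, line_throughput)  # welding 45

# Eliminate the bottleneck, and a new stage becomes the constraint.
stages["welding"] = 100
print(min(stages, key=stages.get))  # assembly
```

Note that speeding up painting or stamping would change nothing: only work on the bottleneck moves the whole system forward.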
We all have a bottleneck in our work too—one single point of weakness that is slowing us down the most.
What is THE bottleneck in your work right now?
Identifying and eliminating the bottleneck is the highest leverage work you can do. Spend some time today to think about your bottleneck and how to remove it.
That’s it for today! Thank you for reading. Talk to you tomorrow.
What did you think of today's newsletter?