Chris Kelly, a former Facebook executive, says the artificial intelligence industry needs to figure out how to use less electricity as companies build massive data centers across the country.
Chris Kelly, who was Facebook’s chief privacy officer and general counsel, told CNBC on Tuesday that making AI more efficient will be critical going forward. Human brains run on just 20 watts of power, he pointed out. AI companies are building facilities that need billions of watts.
“I think that finding efficiency is going to be one of the key things that the big AI players look to,” Kelly said. The companies that figure out how to cut data center costs will come out ahead, he believes.
The construction boom has raised questions about where the electricity will come from. The power grid is already under pressure. Nvidia and OpenAI announced plans in September for data centers needing at least 10 gigawatts of electricity. That’s enough to run roughly 8 million American homes for a year. It’s also about what New York City drew on its busiest summer days in 2024, according to New York Independent System Operator figures.
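For readers who want to check the homes comparison, here is a minimal back-of-envelope sketch. It assumes an average U.S. household uses roughly 10,800 kWh of electricity per year, in line with EIA averages; that per-home figure is an assumption for illustration, not a number from the Nvidia and OpenAI announcement.

```python
# Back-of-envelope check: how many average U.S. homes could 10 GW of
# continuous power supply for a year?
# Assumes ~10,800 kWh per household per year (roughly the EIA average);
# this is an illustrative assumption, not a figure from the announcement.

HOURS_PER_YEAR = 8_760
KWH_PER_HOME_PER_YEAR = 10_800  # assumed average annual household consumption

def homes_powered(gigawatts: float) -> float:
    """Convert continuous gigawatts into average-home equivalents for one year."""
    kwh_per_year = gigawatts * 1_000_000 * HOURS_PER_YEAR  # GW -> kW, then kWh/year
    return kwh_per_year / KWH_PER_HOME_PER_YEAR

print(f"{homes_powered(10):,.0f} homes")  # ~8.1 million, matching the 'roughly 8 million' figure
```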
Worries about expenses grew after DeepSeek released a free large language model in December 2024. The Chinese company said it cost less than $6 million to develop. That’s dramatically lower than what American competitors have spent.
Kelly expects more Chinese companies to become major players. President Donald Trump recently approved the sale of Nvidia’s H200 chips to China. Open-source models from China will give people access to basic computing power and AI tools, Kelly added.
Consumers face soaring bills
The rush to build these facilities is already hitting electricity bills. Data centers that haven’t even been constructed yet are pushing power prices higher, as Cryptopolitan previously reported. Regular customers could end up paying for expensive infrastructure that may never be needed if demand forecasts turn out to be wrong.
Consumers on the biggest electric grid in the country will pay $16.6 billion between 2025 and 2027 to guarantee future power supplies for data centers, according to a watchdog report released this month. That grid is PJM Interconnection, which provides electricity to over 65 million people in 13 states, including Virginia, home to the world’s largest data center hub. Northern Illinois and Ohio are growing markets too.
“A lot of us are very concerned that we are paying money today for a data center tomorrow,” said Abe Silverman. He was general counsel for New Jersey’s public utility board from 2019 to 2023. “That’s a little bit scary if you don’t really have faith in the load forecast.”
Data center boom may not be as big as power companies think
Home electricity costs have already gone up in states with major data center activity. Residential prices in September jumped 20% in Illinois, 12% in Ohio, and 9% in Virginia compared with the same month a year earlier, according to the federal Energy Information Administration. All three states rank among the top five data center markets nationwide.
Joe Bowring leads Monitoring Analytics, PJM’s independent market monitor. He explained that data center power costs show up directly on household bills. “When the wholesale power costs go up, people pay more, when it goes down people pay less,” he said.
PJM predicts data centers will need an extra 30 gigawatts by 2030. That’s enough electricity to power more than 24 million homes annually. But there’s uncertainty about whether that demand will actually materialize. Data center developers often explore multiple locations before choosing one, said Cathy Kunkel, a consultant at the Institute for Energy Economics and Financial Analysis. That means the forecasts likely count some projects more than once.
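Applying the same assumed per-household average used in the earlier sketch gives a quick consistency check on PJM’s forecast; the ~10,800 kWh figure is again an illustrative assumption based on EIA averages, not a number from PJM.

```python
# Same back-of-envelope conversion applied to PJM's 30 GW forecast,
# assuming ~10,800 kWh per household per year (EIA-style average).
kwh_per_year = 30 * 1_000_000 * 8_760   # 30 GW running year-round, in kWh
print(f"{kwh_per_year / 10_800:,.0f} homes")  # ~24.3 million, consistent with 'more than 24 million'
```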