
Yes.

A single data center can use far more than the energy needed to power 5,000 American homes.

According to the U.S. Energy Information Administration, the average U.S. household uses about 10,700 kilowatt-hours of electricity per year. At that rate, 5,000 homes consume roughly 53.5 million kilowatt-hours annually.
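For readers who want to reproduce that figure, here is a minimal sketch of the arithmetic, using only the EIA household average cited above (the variable names are illustrative):

```python
# Back-of-the-envelope check of the 5,000-home threshold,
# using the EIA's average U.S. household figure cited above.
AVG_HOUSEHOLD_KWH_PER_YEAR = 10_700  # average U.S. household, per EIA
HOMES = 5_000

threshold_kwh = AVG_HOUSEHOLD_KWH_PER_YEAR * HOMES
print(f"{HOMES:,} homes use roughly {threshold_kwh / 1e6:.1f} million kWh per year")
# prints: 5,000 homes use roughly 53.5 million kWh per year
```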

Large data centers can far exceed that threshold. Pew Research Center reported that a typical artificial intelligence-focused “hyperscale” data center consumes enough energy to power 100,000 households.
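Putting Pew's 100,000-household estimate alongside the 5,000-home threshold with the same EIA average gives a rough sense of scale. The sketch below is illustrative only and simply multiplies the two published figures:

```python
# Rough scale comparison, assuming Pew's 100,000-household estimate for a
# hyperscale AI data center and the EIA household average used above.
AVG_HOUSEHOLD_KWH_PER_YEAR = 10_700
hyperscale_kwh = 100_000 * AVG_HOUSEHOLD_KWH_PER_YEAR  # about 1.07 billion kWh/year
threshold_kwh = 5_000 * AVG_HOUSEHOLD_KWH_PER_YEAR     # about 53.5 million kWh/year

print(f"Hyperscale estimate: {hyperscale_kwh / 1e9:.2f} billion kWh per year,")
print(f"about {hyperscale_kwh / threshold_kwh:.0f} times the 5,000-home threshold.")
```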

According to Data Center Map, a leading industry registry, the U.S. has over 4,300 data centers, including nine in Maine. Those centers consumed 183 billion kilowatt-hours of electricity in 2024, according to Pew.

Some communities have opposed data centers — especially those for AI — over concerns about electricity costs, grid strain, water use and limited local economic benefits. Lewiston officials recently unanimously rejected a proposed facility following public opposition.

This fact brief is responsive to conversations such as this one.

The Colorado Sun partners with Gigafact to produce fact briefs — bite-sized fact checks of trending claims. Read our methodology to learn how we check claims.

Sources

How much electricity does an American home use?, U.S. Energy Information Administration, accessed December 2025.
What we know about energy use at U.S. data centers amid the AI boom, Pew Research Center, Oct. 24, 2025.
USA Data Centers, Data Center Map, accessed December 2025.
Lewiston City Council shoots down data center proposal, Maine Public, Dec. 17, 2025.
