Traditional Data Centers vs. AI Super Clusters
Most people hear "data center" and think of the server rooms that power email, websites, and cloud storage. Traditional data centers have existed for decades and typically consume 5-30 megawatts (MW) of power. They run standard processors (CPUs), generate moderate heat, and use predictable amounts of water and electricity.[1]
An AI hyper-scale data center is a fundamentally different kind of facility. These campuses are purpose-built to run artificial intelligence workloads - training and operating the large language models behind tools like ChatGPT, image generators, and autonomous systems. Instead of CPUs, they are packed with thousands of graphics processing units (GPUs) and specialized AI accelerators that consume far more power per chip and produce far more heat.[2]
What Does "Super Cluster" Mean?
A "super cluster" refers to a massive aggregation of AI computing capacity - tens of thousands to hundreds of thousands of specialized GPUs connected by high-speed, low-latency networking and functioning as one giant supercomputer, often spread across multiple buildings on a single campus. A single super cluster might require 50-100+ megawatts, and a full campus can draw hundreds of megawatts or even gigawatts of power. These facilities sit at the very top of the data center scale.[3]
The largest AI data center projects currently being built in the United States illustrate the staggering scale: OpenAI's Stargate project ($500 billion committed, nearly 7 GW of planned capacity across multiple sites); Elon Musk's xAI Colossus in Memphis (2 GW, 555,000 GPUs, the world's largest single-site AI training installation); and Meta's Hyperion in Louisiana (over 2 GW, 2,250 acres, 4 million square feet). These campuses span hundreds to thousands of acres and cost tens of billions of dollars to build.[4]
Scale Comparison: Numbers That Matter
The numbers associated with these facilities are staggering when compared to a community like Pekin (population ~32,000):
- Power: A large AI data center campus can consume 300-1,000+ MW of electricity continuously. The entire City of Pekin uses an estimated 30-50 MW. One facility could draw 10-17 times more power than the whole city.[5]
- Water: Hyper-scale data centers use evaporative cooling that consumes 1-5 million gallons of water per day. Google's facility in The Dalles, Oregon consumed 434 million gallons in 2024 - one-third of the city's total water supply. All Northern Virginia data centers consumed nearly 2 billion gallons in 2023.[6]
- Land: These campuses typically require 100-2,000+ acres, permanently converting agricultural or natural land to industrial use. Meta's Hyperion campus in Louisiana occupies 2,250 acres - nearly three times the size of Manhattan's Central Park (843 acres).
- Backup power: Large facilities maintain dozens to hundreds of diesel generators for emergency backup. In Virginia alone, nearly 9,000 diesel generators have been permitted at data centers; the permit for one Amazon facility contemplates burning up to 10 million gallons of diesel fuel annually.[7]
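The power comparison above can be checked with quick back-of-envelope arithmetic. As a rough sketch, assume a hypothetical mid-range 500 MW campus (the cited range is 300-1,000+ MW) against Pekin's estimated 30-50 MW citywide draw - the 500 MW figure is an illustrative assumption, not a disclosed number:

```python
# Rough check of the campus-vs-city power comparison.
# 500 MW is an illustrative midpoint, not a figure from the proposal.
campus_mw = 500                      # hypothetical mid-range AI campus draw (MW)
city_mw_low, city_mw_high = 30, 50   # Pekin's estimated citywide draw (MW)

ratio_high = campus_mw / city_mw_low   # ~16.7x against the low city estimate
ratio_low = campus_mw / city_mw_high   # 10.0x against the high city estimate

print(f"One campus could draw {ratio_low:.0f}-{ratio_high:.0f} times "
      f"the power of the entire city")
# → One campus could draw 10-17 times the power of the entire city
```

Even the conservative end of these assumptions puts a single campus at an order of magnitude above the whole city's demand.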
The U.S. Data Center Boom
The demand for AI computing power is driving an unprecedented expansion of data center construction across the United States. According to the Lawrence Berkeley National Laboratory, U.S. data center electricity usage grew from 58 TWh in 2014 to 176 TWh in 2023 (4.4% of total U.S. electricity) and is projected to reach 325-580 TWh by 2028 (6.7-12% of total U.S. electricity). The International Energy Agency projects data centers will drive almost half of U.S. electricity demand growth through 2030. By then, U.S. data centers are expected to consume more electricity than all energy-intensive manufacturing combined - aluminum, steel, cement, and chemicals.[8]
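As a sanity check, the Berkeley Lab TWh totals and percentage shares quoted above can be cross-checked against each other: dividing each data center consumption figure by its stated share of total U.S. electricity should yield roughly the same implied national total either way (a quick sketch, not part of the report itself):

```python
# Back out the total U.S. electricity consumption implied by the
# Berkeley Lab figures quoted above (TWh = terawatt-hours).
dc_2023_twh, dc_2023_share = 176, 0.044          # 2023: 176 TWh = 4.4%
total_2023 = dc_2023_twh / dc_2023_share         # implied 2023 U.S. total

dc_2028_low, share_low = 325, 0.067              # 2028 low projection
dc_2028_high, share_high = 580, 0.12             # 2028 high projection
total_2028_low = dc_2028_low / share_low         # ~4,851 TWh
total_2028_high = dc_2028_high / share_high      # ~4,833 TWh

print(f"Implied 2023 U.S. total: {total_2023:,.0f} TWh")
print(f"Implied 2028 U.S. total: {total_2028_low:,.0f} and "
      f"{total_2028_high:,.0f} TWh")
# → Implied 2023 U.S. total: 4,000 TWh
# → Implied 2028 U.S. total: 4,851 and 4,833 TWh
```

The two 2028 projections imply nearly identical national totals (~4,850 TWh), so the report's low and high scenarios are internally consistent with each other.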
This gold rush has sent developers scouting for sites in small and mid-sized communities where land is cheap, regulations are lighter, and local governments may be persuaded by promises of tax revenue and jobs. As of mid-2025, $64 billion worth of U.S. data center projects had been blocked or delayed by community opposition across 28 states. Communities like Pekin may be attractive to developers precisely because they lack the regulatory infrastructure to scrutinize these projects adequately.[9]
Why This Matters for Pekin
WHP LLC (Western Hospitality Partners) has proposed building an AI data center "super cluster" on the Lutticken Property - recently annexed Groveland Township farmland. While specific capacity figures have not been publicly disclosed, the use of the term "super cluster" indicates a facility at the very large end of the spectrum.
A facility of this type would place enormous demands on Pekin's water supply, electrical grid, and natural environment. It would generate constant industrial noise, require massive diesel backup systems, and permanently destroy productive farmland. The benefits to the community - primarily tax revenue and a relatively small number of permanent jobs - are modest, while profits flow to outside investors and the costs are borne disproportionately by local residents.
The following pages explore each of these impacts in detail:
- Water Consumption - How data centers drain municipal water supplies
- Power Grid Strain - The electricity demands and their impact on ratepayers
- Noise & Air Pollution - Industrial noise and diesel generator emissions
- Environmental Destruction - Farmland loss, habitat damage, and ecological harm