An Inflection Point for Data Centers
Kodak’s first camera was a box with no viewfinder. It was pre-loaded with 100 exposures, the only pictures the box would ever take. After its initial release in 1888, George Eastman cultivated the camera into a ubiquitous product and Kodak into an iconic company. A century later, Kodak employed 140,000 people and reached a peak market valuation of $60 billion in today’s dollars.
Many of us already know what came next. The invention of the digital camera in the 1970s (notably, by Kodak itself) forced the obsolescence of celluloid film. This new technology, one that would make cameras exponentially more ubiquitous in people’s lives, led the company not to a higher plane of success but to its undoing. Fifteen years after reaching its peak valuation, Kodak filed for bankruptcy. The company that invented the digital camera failed during an era when the number of photos taken worldwide grew from 55 billion to over a trillion annually.
How did this happen? For decades, Kodak incrementally built its core business model around the sale of a commodity: film. Rapid technological change undermined that business model, and the company was unable to make the strategic shifts necessary to survive.
In his book “Only the Paranoid Survive,” former Intel CEO Andrew Grove writes about the concept of a strategic inflection point, a “time in the life of a business when its fundamentals are about to change.” In the late 20th century, Kodak faced a strategic inflection point imposed by emerging technology. That moment could have resulted in a much stronger and more prosperous future for Kodak. The same can be said now of the data center industry.
The Homogeneous Past
Conventional data center white space (source: Rainford).
Until recently, data centers were designed as essentially homogeneous environments. Whether supporting web services, databases, or general enterprise computing, much of the industry operated on a simple principle: provide reliable power, cooling, and connectivity in standardized configurations. These configurations provided large, dedicated rooms to house IT equipment, commonly called “white space”. For 30 years, generic white space has been the celluloid film of the data center industry.
This standardization made sense in a world where computing workloads were relatively uniform. Facility designs followed predictable patterns: hyperscale facilities near fiber routes for cloud providers, mid-tier facilities near metropolitan areas for colocation services, and small computer rooms for on-premises enterprise needs. While rack densities and redundancy requirements might vary, the fundamental approach to data center design remained relatively consistent.
AI as a Catalyst for Change
The exponential growth of artificial intelligence, particularly generative AI, is forcing a radical reimagining of data center design. Unlike traditional computing, AI workloads—especially training large language models—create fundamentally different demands on infrastructure.
The most obvious change is in power density. For decades, a data center with quality white space would support server racks that consumed 5-10kW each, roughly the combined draw of about six household hair dryers. To regulate the equipment’s temperature, it was sufficient to cool the white space with conventional air conditioning units.
In contrast, most AI workloads today run on GPUs that, when packed into the same 24”-wide rack, consume up to 60kW of power, a nearly tenfold increase over traditional power density. It’s not practical to blow enough air across these chips in the dense configurations needed for optimal performance, so the industry has taken the very bold step of normalizing what were once fairly niche liquid cooling technologies. The most common approach is direct-to-chip cooling, in which chilled water is delivered directly to a cold plate atop each GPU, imposing on the white space a sophisticated fluid network that is at once unfamiliar to conventional facility operators and essential to each chip’s health and wellbeing.
One of NVIDIA’s Grace Blackwell nodes with 8 GB200 GPUs mounted inside. The tubes supply chilled water from a central manifold to a cold plate on top of each chip. These tubes are in essence a new network of capillaries that make the facility’s cooling system orders of magnitude more complex (source: Zak Kostura).
The failure of this liquid cooling system, even for a fraction of a second, can burn out a GPU, a piece of equipment that may carry an MSRP upwards of $30,000 and a five-year lifespan.
While 60kW per rack is now common for intensive AI workloads, the latest NVIDIA chips, part of the Grace Blackwell architecture and product line, can be configured into individual racks that consume 132kW of power. And the company has presented a vision for a successor, the Vera Rubin Ultra, that could consume 600kW of power in a single rack, another order-of-magnitude increase. Their target release date for that system is now less than 24 months away.
At 600kW, it’s no longer practical to compare a rack’s energy needs to household hair dryers. A rack drawing that much power is roughly akin to the electrical load of 300,000 square feet of commercial office space (typically sized for an average of about 2 watts per square foot). It’s not uncommon to find that much space in a 10-story building. To extract 600kW of heat, that building would require about 170 tons of refrigeration.
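For readers who want to check that arithmetic, here is a minimal back-of-envelope sketch in Python. It assumes the 2 watts per square foot office load cited above and the standard conversion of one ton of refrigeration to roughly 3.517 kW; it is an illustration of the comparison, not a facility design calculation.

```python
# Back-of-envelope check of the 600kW rack comparisons above.
# Assumptions: ~2 W/sq ft office electrical load, 1 ton of refrigeration ≈ 3.517 kW.

rack_power_kw = 600

# Equivalent commercial office area at ~2 W/sq ft
office_load_w_per_sqft = 2
equivalent_office_sqft = rack_power_kw * 1000 / office_load_w_per_sqft
print(f"Equivalent office area: {equivalent_office_sqft:,.0f} sq ft")  # 300,000 sq ft

# Refrigeration needed to reject 600kW of heat
kw_per_ton_refrigeration = 3.517
cooling_tons = rack_power_kw / kw_per_ton_refrigeration
print(f"Cooling required: {cooling_tons:.0f} tons of refrigeration")   # ≈ 170 tons
```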
Jensen Huang introducing Vera Rubin Ultra, a standard-sized server rack, in March 2025.
That one rack would consume roughly the same amount of power as the 270,000-square-foot Stadium Tower in Anaheim does on a normal day.
What does this mean for data center design? It means that building-scale power and cooling will increasingly be required at each rack within the data center. We can no longer draw a line between “the building” and “the equipment stored in the building”, the paradigm that prompted the conventional concept of “white space”. Instead, the designers of the data center facility must understand what is actually going on in the data halls and deliver a purpose-built, fully integrated solution.
The comprehensive integration of power, cooling and network into the racks of AI servers upends the conventional paradigm of “white space” (source: NVIDIA).
Strategic Inflection Point
For data center owners and operators, the ground is shifting. To be sure, the need for conventional white space isn’t going away. But the business and operational models built around it won’t position a company to capture this exploding new market. Instead, companies need to revisit both the nature and the timing of decisions that drive facility design and delivery.
The nature of their decisions must be more nuanced. Because one tenant’s use case may be different from the next tenant’s, the entire facility needs to be adaptable. If it is not, the owner risks having either stranded power and cooling infrastructure or a tenant who is dissatisfied with the facility’s performance.
The timing of decisions needs to shift from speculative to collaborative. Whereas developers could once speculate about a tenant’s needs well enough to advance facility design and construction, it is now prudent to wait until those assumptions can be confirmed directly with the tenant. This puts far more emphasis on the company’s ability to deliver infrastructure quickly.
Kodak may be gone, but cameras aren’t. Were Eastman alive today, he might be shocked to learn that I have so many cameras in my home that I can’t even account for all of them, built into devices I bought for reasons having nothing to do with capturing images. Technological change catalyzes businesses, redefines our physical world and challenges our concept of foundational products.
Most will agree that AI is a very powerful catalyst. We all see its impact manifest on the screens we look at every day. Behind those screens, AI is also having a massive impact on the infrastructure used to deliver those services. For those who make a business out of delivering that infrastructure, change has come.