
Seemingly overnight, the data center industry’s adoption of liquid cooling has turned into a widespread certainty. Liquid cooling was initially perceived as a solution recommended only for the most demanding high-performance computing (HPC) environments, but the breakneck pace of chip development and skyrocketing demand for generative AI have forced the industry at large to reassess it: instead of just a “nice-to-have,” it has quickly become an inevitable “must-have.”

What’s less certain, however, is how data center operators should implement liquid cooling. Is it easier to retrofit legacy infrastructure to accommodate your liquid cooling solution, or should newer technologies primarily be implemented in newer sites? Currently, when it comes to capturing next-gen workloads, it seems as though popular consensus leans towards the latter: “for the moment,” writes Accelevation President Sam Prudhomme, “it remains more appealing to build something from scratch.”

But how long will that appeal hold? And, if that appeal is mostly due to the common myth that retrofitting is prohibitively expensive when compared to “starting from scratch,” I think it’s time to ask: just how true is that, really? Are we potentially ignoring the long-term benefits of retrofits in favor of the short-term ease of greenfield builds?

In this moment of explosive growth for our industry, it’s critical to take a step back from how retrofitting is commonly perceived and reconsider what it could truly deliver.

The True Cost of Greenfield

Typically, when asked why existing infrastructure can’t be retrofitted, the answer can be summed up in a single word: “cost.” To many, retrofitting presents a series of financial hurdles that are seemingly too risky to overcome. However, it’s worthwhile to explore the various meanings of the word “cost,” and question whether a “simpler” greenfield approach may incur other costs long, long after the first shovels strike the earth.

The Cost of Power

Data centers currently consume 3.5% of America’s electricity, a share expected to rise beyond 9% by 2035, largely due to the processing demands of generative AI. Each new data center constructed adds to the strain on our available power.

This has rapidly become a mainstream concern. In Virginia, home of Loudoun County’s “Data Center Alley,” residents’ electricity bills are predicted to increase by up to 70% within the next five years as utilities work to meet growing demand from data centers. At a time when data centers are already labeled “energy vampires” around the globe, the community pushback provoked by rising power costs will inevitably delay construction efforts, revive demands for further regulation, and damage our industry’s reputation.

Rather than having more and more data centers consume more and more power, greater emphasis should be placed on optimizing the facilities we already have. Adopting the latest advancements in cooling, such as shifting from air cooling to two-phase, direct-to-chip liquid cooling, can cut cooling’s energy demands by up to 50%, significantly reducing operational costs and freeing more power for compute.
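To put that 50% figure in context, here is a rough back-of-the-envelope sketch. The IT load, cooling overhead, and other-overhead values below are illustrative assumptions for a generic air-cooled facility, not measurements from any site discussed here:

```python
# Back-of-the-envelope sketch: what halving cooling energy means for total
# facility power. All figures are illustrative assumptions, not measured data.

it_load_mw = 1.0         # assumed IT (compute) load, in megawatts
cooling_overhead = 0.40  # assumed cooling power as a fraction of IT load (air-cooled)
other_overhead = 0.10    # assumed power distribution losses, lighting, etc.

def facility_power(it_mw, cooling_frac, other_frac):
    """Return total facility power (MW) and PUE for a given cooling overhead."""
    total_mw = it_mw * (1 + cooling_frac + other_frac)
    return total_mw, total_mw / it_mw

before_mw, before_pue = facility_power(it_load_mw, cooling_overhead, other_overhead)
# Apply the article's upper-bound claim: cooling energy cut by 50%.
after_mw, after_pue = facility_power(it_load_mw, cooling_overhead * 0.5, other_overhead)

print(f"Air-cooled:    {before_mw:.2f} MW total, PUE {before_pue:.2f}")
print(f"Liquid-cooled: {after_mw:.2f} MW total, PUE {after_pue:.2f}")
print(f"Freed capacity: {before_mw - after_mw:.2f} MW within the same power envelope")
```

Under these assumptions, a 1 MW IT load drops from roughly 1.5 MW of total draw to 1.3 MW, and that reclaimed 0.2 MW is headroom that can go back into compute rather than cooling.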

The Cost of Time

Newer data centers are mostly constructed to accommodate newer chips. However, while NVIDIA rolls out its latest GPUs on a yearly cadence, data center operators take three to five years on average to build a facility, and securing enough power for these facilities can take up to seven years.

This data illustrates a simple truth: The rate of new data centers being constructed will always be outstripped by the frantic rate of newer, hungrier chips entering the market. Greenfields will always participate in a race they can never win. 

Valuable time spent acquiring permits for new data centers, developing new sites, and connecting new utilities all adds up. Fortunately, it’s substantially quicker to future-proof what already exists. Retrofitting is approximately 30% faster than going greenfield, often taking less than a year. Furthermore, legacy infrastructure can be upgraded with two-phase, direct-to-chip cooling to support the next several years’ worth of chip development, avoiding construction projects that risk immediate obsolescence upon completion.

The Cost of Sustainability

Pursuing a greener industry has seemingly been deprioritized in favor of pursuing AI. This shift in priority will inevitably lead to diminishing returns, as less and less land becomes available for greenfield projects and greater and greater strain is placed on what’s left.

We can’t ignore the environmental cost of new construction. Concrete, for example, represents up to 80% of a new data center’s embodied carbon emissions, and concrete itself accounts for 11% of global carbon emissions, more than the entirety of the EU’s. Under a greenfield-centric approach, emissions will inevitably worsen, accelerating global warming and, in turn, the need for more power devoted to cooling.

Instead of abandoning older data centers to become “digital ghost towns,” or demolishing them, which “contributes significantly to carbon emissions,” legacy infrastructure can serve as a hotbed for innovation. A sustainability-oriented approach to retrofitting can also pay dividends in marketing value. During the Paris Olympics, Equinix’s PA10 data center made headlines by capturing waste heat to warm the Olympic pools. Retrofits can likewise be highlighted as part of a company’s publicized ESG efforts, which have recently transformed from a “voluntary effort to a mandatory requirement.”

Eventually, we’ll hit a point where our environment won’t support data center construction at its current scale. Retrofitting what we’ve already got will soon become a necessity—and we had better be ready for it.

Conclusion

Of course, money remains a critical factor for companies weighing a brownfield vs. greenfield approach. Luckily, given our industry’s dynamic growth, retrofitting may become less of a hard sell. Recent data from Omdia notes that 74% of enterprises raised their IT budgets last year, and many have moved generative AI solutions into production-ready environments. These trends likely indicate that more cash will be devoted to data center investments.

In fact, retrofitting has been identified as a growing market opportunity. A recent report from Research and Markets stated that the “need for better cooling systems” will help retrofitting “become a persuasive idea for operators” over the next decade, adding that retrofitting “can also help provide powerful new layers of security, intelligence, and automation.”

Existing data centers have played a pivotal role in reaching our industry’s present success. It’s time to upgrade their cooling infrastructure so they can better support our exciting future.