Power Play – What’s driving the power appetite in real estate?

Power capacity availability has quickly become the biggest driver in the commercial real estate market. The mantra ‘location, location, location’ has traditionally driven property value. In today’s new frontier of artificial intelligence (AI) development, those desirable locations also need access to much higher densities of reliable and, ideally, renewable power. This holds true both in densely developed urban markets and in areas that traditionally support larger data centers.

As context, in Q3 2024 AI companies leased 3.9 million square feet in San Francisco and 2.6 million square feet in Silicon Valley, according to CBRE data, signaling a meaningful recovery in the commercial real estate market. These leases bring with them well-paying, in-office jobs and precious tax revenue to cities whose coffers are struggling following the decline in commercial office property taxes.

Real estate developers are identifying properties with access to more power, such as outdated or decommissioned data centers, manufacturing buildings, or sites adjacent to utility distribution facilities. The promise is that these developments will be uniquely situated to support growing companies that specialize in AI software development. These sites, however, tend not to be in the urban areas where software developers like to lease their office space.

Addressing the power needs of AI software companies is crucial to supporting the current wave of ground-breaking technology development. Although most post-rollout AI application processing occurs in high-density data centers, these companies still need server rooms in their office spaces to support the development of new and improved applications. The preference for short-distance (not-in-the-cloud) processing and hands-on access to the equipment means that many technology companies want small but mighty server rooms close to the desks in their office spaces. The challenge is that these small server spaces require significant power. The power needs stem from advanced graphics processing units (GPUs) and parallel processing platforms such as CUDA, Nvidia’s proprietary parallel computing platform, which can run many computations simultaneously. Together, the hardware and software support the development of large language models, advanced biotech research, and other deep learning work.

What’s the demand?

The whopping load of a single rack of GPUs is currently close to 150 kW (think 150 toasters) in approximately 6 square feet of space, and the trend is moving toward 300–400 kW per rack (you can do the toaster math). Even when only a small local server room is needed, these spaces require about 4,000 watts per square foot of power for the equipment. Cooling this incredible density is done by delivering liquid coolant directly to the chip through cold plates attached to the CPUs and GPUs. Currently, about 80% of server room cooling is handled by this direct liquid cooling through coolant distribution units, with the remaining 20% handled by more traditional computer room air conditioning around and through the racks. Even small server rooms (less than 1,000 square feet) for AI development may require new utility services dedicated to these spaces.
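The toaster math above can be sketched in a few lines. This is a back-of-envelope illustration only, assuming a typical toaster draws roughly 1 kW; the 150 kW rack load, 6 square foot rack footprint, and 4,000 W per square foot room figure come from the paragraph above.

```python
# Back-of-envelope power-density arithmetic for an AI server rack.
TOASTER_W = 1_000      # assumed: a typical toaster draws roughly 1 kW
RACK_KW = 150          # current high-density AI rack load (from the text)
RACK_SQFT = 6          # approximate footprint of a single rack

# How many toasters worth of power one rack draws.
toasters = RACK_KW * 1_000 / TOASTER_W

# Power density at the rack footprint itself.
rack_density = RACK_KW * 1_000 / RACK_SQFT   # W per sq ft

print(f"{toasters:.0f} toasters per rack")           # 150 toasters per rack
print(f"{rack_density:,.0f} W/sq ft at the rack")    # 25,000 W/sq ft at the rack

# The ~4,000 W/sq ft room-level figure is lower because the load is spread
# over the whole server room (aisles, cooling gear), not just the rack.
ROOM_DENSITY = 4_000   # W per sq ft (from the text)
room_sqft_per_rack = RACK_KW * 1_000 / ROOM_DENSITY
print(f"~{room_sqft_per_rack:.0f} sq ft of room area per 150 kW rack")  # ~38
```

The gap between the 25,000 W per square foot at the rack and the 4,000 W per square foot room figure is simply the supporting floor area each rack needs around it.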

Increased Power and Sustainability

The growth of AI adoption comes with significant concerns about increased greenhouse gas emissions and climate change. Renewable and other carbon-free energy sources will need to grow faster than ever before just to keep pace. Controversially, nuclear energy is being pushed and funded by large AI-forward businesses including Microsoft, Amazon, Google, and OpenAI. Microsoft has made the news for reaching a deal to restart operations at the Three Mile Island nuclear plant; Amazon is buying a nuclear-powered data center in Pennsylvania; and Google has inked a deal with Kairos Power to use small nuclear reactors. Renewable sources such as solar, wind, and wave have been less favored due to limitations in capacity and scaling.

In urban areas such as San Francisco and San Jose, where these AI development companies want to locate, there are both benefits and obstacles. The advantages are access to talent, access to venture capital, and a grid that is relatively green and moving toward carbon-free. The challenges are the limitations of the grid and the time it takes to install the infrastructure to deliver the desired power, even for relatively small increases.

How to plan for the future

Interfacing with utility companies can give pause to even the most courageous: increasing or adding service to an existing site can take years of patience and a considerable capital investment. It requires a utility system impact study or a Large Load Analysis by the utility company to determine the infrastructure needed to provide the capacity. The analysis gives the developer an idea of how much the service will cost to install, as well as the monthly special facilities fees charged when a customer falls outside normal parameters. These studies are the design path to obtaining the service capacity AI software companies need, but they take 6–12 months or more to complete, and they often show that it will take multiple years and tens of millions of dollars to serve the site. These hurdles will slow both the application of AI and the recovery of downtown San Francisco and San Jose, which have relied on technology businesses as an important economic engine.

Some like it hot

The heat produced by AI data servers may be put to beneficial use. Amazon demonstrated this at its four-block Denny Triangle campus in Seattle, where it uses waste heat from an existing Westin data center to provide more than 75% of the heating required for the campus, at about four times the energy efficiency of a comparable HVAC system.

Overcoming obstacles

Ideally, owners, developers, and city leaders would work collaboratively with PG&E and the other power utilities on a strategic plan to provide sites with additional power for technology companies that need on-site GPUs and AI-style parallel processing. A smart power infrastructure master plan that focuses electrical utility upgrades on specific areas would reduce time and expense, provide more clarity on the path forward, and allow AI-based companies focused on complex problem solving to grow. Until this happens, those sites fortunate enough to have abundant power today are where it’s at.