
In the Age of Intelligence, Power Is the New Oil


Every AI model starts with a dataset and ends with a power bill. As large models and cloud services multiply, demand for compute has exploded, putting a spotlight on electricity and infrastructure. Bain & Company reports that AI’s compute needs are growing at twice the pace of Moore’s Law, driving global compute demand toward 200 gigawatts by 2030. Data centers already consume enormous amounts of electricity: about 415 TWh, or 1.5% of the global total, in 2024. In practice, the AI boom is as much about watts, cooling and location as it is about algorithms. The hottest competition today is often over where to build servers, not just how to train models.
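
To put those figures in perspective, here is a rough back-of-envelope sketch in Python. The doubling times are illustrative assumptions (Moore’s Law taken as roughly 24 months, AI demand at twice that pace); the 415 TWh figure comes from the text above:

```python
# Back-of-envelope only: the doubling times below are illustrative assumptions.
HOURS_PER_YEAR = 8760

# 415 TWh/year (2024, from the article) expressed as average continuous draw:
avg_power_gw = 415e12 / HOURS_PER_YEAR / 1e9   # Wh -> W -> GW
print(f"2024 data centers: ~{avg_power_gw:.0f} GW average continuous draw")

# Growth multiple over 2024-2030 under each assumed doubling time:
years = 6
moores_law_pace = 2 ** (years / 2)   # assumed doubling every ~2 years
ai_pace = 2 ** (years / 1)           # twice that pace: doubling every year
print(f"Moore's Law pace: ~{moores_law_pace:.0f}x by 2030")
print(f"AI pace:          ~{ai_pace:.0f}x by 2030")
```

At roughly 47 GW of average draw today, the projected 200 GW by 2030 would be about a fourfold jump in six years.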


The geopolitics of compute

Compute infrastructure has become a strategic asset. Data centers were once back-room tech, but they now underpin everything digital. The World Economic Forum notes that data centers have evolved into strategic assets, the digital age’s equivalent of power plants or ports. A majority of the world’s data centers are in the U.S., reflecting American dominance of cloud infrastructure. In turn, governments worldwide court data-center investment, offering tax breaks and fast-track permits to attract hyperscale facilities. Emerging economies see data-center hubs as growth engines, and observers describe a data-center ‘gold rush’ across Asia.


At the same time, major powers are competing for control of the chains that feed these centers: chips, cables and clouds. The U.S. and China have entered a new tech rivalry, with U.S. export controls on high-end AI chips cutting off supplies to competitors and alliances like the ‘Chip 4’ pact (US, Japan, Taiwan, South Korea) forming to coordinate semiconductor strategy. China is responding with its own ‘Digital Silk Road’ and massive investment in domestic chips and data halls. The result is a more fragmented tech landscape, as friendly nations group around common standards and self-reliance becomes a policy goal.


Money is flooding this space. McKinsey projects that roughly $7 trillion in data-center CapEx will be needed by 2030 to meet AI demand. In practical terms, nations with abundant cheap power have an advantage. For example, Virginia’s data-center region offers power rates 28% below the U.S. average, while states with weaker grids or higher prices, like much of New England, struggle to compete. As Bain and others note, meeting AI’s power needs will likely require new power plants and grid upgrades, making energy and compute as important to national strategy as oil and gas once were.
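
As a rough illustration of what that price gap means at hyperscale, the sketch below prices a hypothetical 100 MW facility. The ~$0.08/kWh average industrial rate is an assumed placeholder, not a quoted figure; only the 28% discount comes from the text:

```python
# Hypothetical 100 MW facility; the average rate is an assumption.
LOAD_MW = 100
HOURS_PER_YEAR = 8760
AVG_RATE = 0.08                      # USD/kWh, assumed for illustration
VA_RATE = AVG_RATE * (1 - 0.28)      # 28% below average (from the article)

annual_kwh = LOAD_MW * 1000 * HOURS_PER_YEAR   # 876 million kWh
cost_avg = annual_kwh * AVG_RATE
cost_va = annual_kwh * VA_RATE
print(f"At average rates: ${cost_avg / 1e6:.0f}M/year")
print(f"In Virginia:      ${cost_va / 1e6:.0f}M/year")
print(f"Annual savings:   ${(cost_avg - cost_va) / 1e6:.0f}M/year")
```

Even under these placeholder numbers, a single facility’s power bill swings by roughly $20 million a year, which is why siting decisions increasingly start with the electricity rate.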


Environmental & infrastructure impact

The surge in compute has big environmental implications. Data-center electricity use (currently 1.5% of the world total) is set to soar, with projections of roughly 945 TWh by 2030, about Japan’s annual consumption. Cooling those servers is equally demanding. Many major data hubs sit in hot or dry regions, which strains local resources. New analysis warns that over half of the top 100 data-center regions already face high climate risk, and 52% sit in areas of water stress. Expanding data-center clusters in Asia and the Middle East, for example, have already led regulators to worry about depleted reservoirs.
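
The two endpoints in that projection imply a steep compound growth rate, which a few lines make explicit (assuming smooth exponential growth between 2024 and 2030):

```python
# Implied annual growth rate behind the 415 TWh -> 945 TWh projection.
twh_2024, twh_2030, years = 415, 945, 6
cagr = (twh_2030 / twh_2024) ** (1 / years) - 1
print(f"Implied growth: ~{cagr:.1%} per year")   # roughly 14.7% per year
```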


Water usage is particularly notable. TechRadar found that AI-related cooling consumes on the order of 100 billion liters of water per year worldwide, and a single 100 MW AI-capable center might use 4.2 million liters daily. In Malaysia’s Johor region, authorities have even denied most water requests from data centers on sustainability grounds. While the per-prompt energy of an AI query is negligible, the concentrated impact of new data centers on specific towns and ecosystems is anything but.
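
Those two figures can be sanity-checked against each other. The full-load assumption below is mine, not the source’s, so treat the result as a rough estimate:

```python
# Water intensity implied by the article: a 100 MW AI center using
# ~4.2 million liters/day. Assumes the center runs at full load 24/7.
LOAD_MW = 100
LITERS_PER_DAY = 4.2e6

kwh_per_day = LOAD_MW * 1000 * 24                    # 2.4 million kWh/day
print(f"~{LITERS_PER_DAY / kwh_per_day:.2f} liters per kWh")   # ~1.75

# The ~100 billion liters/year global figure equals about 65 such
# facilities running year-round.
print(f"~{100e9 / (LITERS_PER_DAY * 365):.0f} equivalent 100 MW sites")
```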


The industry is responding with ‘green compute’ strategies. Hyperscalers sign long-term renewable power purchase agreements or build on-site solar/wind to lock in low-carbon power. Cooling designs are also evolving: many new U.S. centers use closed-loop liquid cooling, which can cut water use by up to 70%. Some cloud giants even vary their AI workloads regionally to avoid peak grid stress. Still, balancing massive AI compute with sustainability remains a key challenge as demand grows.
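
Applied to the hypothetical 100 MW example above, that ‘up to 70%’ figure translates into millions of liters per day (illustrative arithmetic only):

```python
# Best-case closed-loop savings on the 4.2M liters/day baseline above.
baseline = 4.2e6                        # liters/day, from the article
closed_loop = baseline * (1 - 0.70)     # 'up to 70%' reduction applied
print(f"Closed loop: ~{closed_loop / 1e6:.2f}M liters/day "
      f"(saves ~{(baseline - closed_loop) / 1e6:.1f}M liters/day)")
```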


Ideal compute locations

Northern Virginia (USA). Loudoun County is nicknamed ‘Data Center Alley’, with the world’s largest data center cluster (30+ million sq. ft. operational). It offers ultra-low latency, robust fiber networks and a workforce experienced with hyperscale providers. Critically, Virginia has plentiful, relatively inexpensive power. However, the boom creates local strains. In 2023 Loudoun data centers used nearly 2 billion gallons of water. Officials have therefore imposed regulations, such as mandatory closed-loop cooling, and are upgrading grids to cope. Land and power are getting scarcer, and new transparency and tax measures are being debated as government scrutiny rises.
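
For scale, converting Loudoun’s 2023 water figure into a daily rate and comparing it with the hypothetical 100 MW facility from earlier is straightforward (rough, illustrative arithmetic; the county hosts far more than 100 MW of capacity):

```python
# Loudoun County, 2023: ~2 billion gallons of data-center water use.
GALLONS_TO_LITERS = 3.785

annual_liters = 2e9 * GALLONS_TO_LITERS        # ~7.6 billion liters
daily_liters = annual_liters / 365
print(f"~{daily_liters / 1e6:.0f}M liters/day county-wide")      # ~21M
print(f"Equivalent to ~{daily_liters / 4.2e6:.0f} hypothetical "
      f"100 MW sites at 4.2M liters/day each")                   # ~5
```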


Northern Norway. Arctic Europe is attracting AI builders. Narvik stands out for its abundant renewable power, cool ambient climate and stable infrastructure. OpenAI’s new ‘Stargate Norway’ facility explicitly chose Narvik for its 1.5 GW of hydropower and natural cooling. The project will run entirely on renewables, with high-efficiency liquid cooling to minimize energy and heat waste. The trade-offs: Norway’s sites are remote from large population centers, raising latency and logistics concerns. But for workloads where green, cheap power is king, the location shines, especially for serving Europe.


Southeast Asia (Johor, Malaysia). Emerging markets are also vying for AI infrastructure. In Johor, YTL’s Green Data Center Park will offer up to 500 MW of capacity, powered in part by an on-site 100 MW solar farm. Its proximity to Singapore and direct fiber links make it attractive to Asian firms seeking low latency to major markets, and large land parcels keep real-estate costs down. The flip side is climate: hot, humid air means higher cooling loads, and regional water supplies are constrained; Malaysia’s regulators have approved only 18% of water-use requests for data centers due to supply concerns. Investors here must weigh cheap land and growing demand against environmental limits and evolving energy policy.
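
A hedged sketch suggests why ‘powered in part’ is the operative phrase. The solar capacity factor below is my assumption, not YTL’s published figure, and the park is assumed fully loaded around the clock:

```python
# Share of the park's energy an on-site 100 MW solar farm could supply.
SOLAR_MW, PARK_MW, HOURS = 100, 500, 8760
SOLAR_CF = 0.17                  # assumed capacity factor near the equator

solar_gwh = SOLAR_MW * SOLAR_CF * HOURS / 1000   # ~149 GWh/year
park_gwh = PARK_MW * HOURS / 1000                # 4,380 GWh/year
print(f"Solar covers ~{solar_gwh / park_gwh:.0%} of a fully loaded park")
```

Under those assumptions the solar farm covers only a few percent of the load, so grid power, and the policies governing it, still dominate the economics.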


The Balance of Global Influence

For AI entrepreneurs and investors, power is no longer a background concern; it’s as critical as code itself. Startups with heavy compute workloads must now weigh not just architecture but geography: regions with surplus renewables, stable grids and supportive policy can halve operating costs and emissions. For investors and policymakers, the race to build next-generation infrastructure is the new frontier. Compute, along with the grids, real estate and cooling needed to deliver it, is emerging as a strategic resource in its own right. The environmental and ethical trade-offs will be unavoidable as surging demand for data and electricity collides with fragile energy transitions. In short, the next decade’s tech winners may not just be model-makers, but those who own the grid, the land, the cooling systems and the compute. Whoever controls the capital flows and the power behind the intelligence will define not only the pace of innovation, but the balance of global influence itself.
