Podcast

Ep 113 – AI’s power problem (Roy Illsley)

This week’s guest

Roy Illsley

Chief Analyst, IT Operations, Omdia

The energy demands of artificial intelligence are skyrocketing. AI workloads are doubling every few months, pushing data centers’ energy consumption beyond the capabilities of our aging grid infrastructure. As these facilities approach gigawatt-level power needs—equivalent to powering entire cities—the energy question has become the critical bottleneck for AI’s future growth.

In this episode, I’m talking with Roy Illsley, Chief Analyst at Omdia, who specializes in cloud and data center technologies. We explore how the world’s biggest data center owners—Amazon, Google, and Microsoft—are responding to these challenges, from renewable energy investments to nuclear power, and what it all means for the industry. Listen now to hear:

  • Why hyperscalers are looking to nuclear energy solutions to meet their massive power requirements [03:23];
  • How Europe’s data sovereignty requirements are complicating AI development in power-constrained countries [05:40];
  • What opportunity telcos have with AI inferencing at the edge, and why they risk repeating past mistakes [09:20];
  • The crucial software capability gap that could determine who wins in the edge AI market [12:14].

Links and resources

Wanna talk AI and public cloud? Telco execs, set up a meeting with our team to learn how to tap the immense business value it can bring.


Guest bio

Roy Illsley has over thirty-eight years of IT experience, working for a variety of consultancy and end-user companies across the defence, utilities, automotive, retail, and Fast-Moving Consumer Goods (FMCG) industries. He works in the Cloud and Data Centre practice and is recognized as Omdia’s expert on Cloud, Virtualisation, and Management. He also has experience with Data Centre technologies, Quantum Computing, and IT Service Management, and covers IT Strategy and Policy. His most recent reports include a series of five on edge computing and two on sovereign cloud. In addition to these reports, he has produced white papers and carried out primary research for leading global organizations. He has delivered keynote speeches and webinars, and has presented at trade events all over the world on a wide range of topics.


Follow DR

The Telco in 20 podcast is ranked in the top 5% of all podcasts globally by Listen Notes! 🎉 We’ve also won a 2024 MarComm Award and a Hermes Award, and are recognized as a TeckNexus Top 12 Telco and Tech Podcast, a Forrester Top 100 Channel Podcast, and a Feedspot Top 10 Telecom Podcast.

If you enjoy the podcast, would you leave us a review? It takes just seconds in your app, and it really makes a difference in helping us convince hard-to-get guests. And I love reading your feedback and reviews!

Podcast credits

  • Executive Producer and Host: Danielle Rios, TelcoDR
  • Senior Producer: Lindsay Grubb, TillCo Media
  • Senior Editor/Brand Manager: Alisa Jenkins, Springboard Marketing
  • Audio Editor: Andrew Condell
  • Supervising Producer: Amanda Avery
  • Associate Producer: Kriselda Dionisio
  • Music: Dyami Wilson

Most popular podcasts

  1. Ep 108 – ☕ Wake up and smell the BSS with Ray Le Maistre from TelecomTV ☕
  2. Ep 107 – 🧣Wrapping your head around Responsible AI🧣 (Ferry Grijpink)
  3. Ep 105 – NVIDIA’s vision for AI and the RAN (Chris Penrose)
  4. Ep 100 – The SPECIAL 100th episode of Telco in 20

Follow now

Get my FREE newsletter, delivered every two weeks, with curated content to help telco execs across the globe move to the public cloud.

Wanna talk public cloud?

Set up a meeting with our team to learn how to tap the immense business value it can bring.

More episodes from Telco in 20

Frequently Asked Questions

1. Why is energy becoming such a critical challenge for AI development?

AI workloads are doubling every three to four months, creating exponential power demands that aging grid infrastructure can’t support. Data centers already consume 1-2% of global electricity, and facilities are approaching gigawatt-level needs—equivalent to powering entire cities. The challenge is compounded by grids that weren’t designed for this scale. As Roy Illsley notes, most western grid infrastructure is 40-50 years old and can’t effectively transport renewable energy from generation sites to where data centers operate. Power has become “the new gold,” determining even where AI infrastructure can be built.
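To make that growth rate concrete, here is an illustrative back-of-the-envelope sketch (not from the episode; it simply compounds the three-to-four-month doubling figure quoted above, using 3.5 months as a midpoint):

```python
# Illustrative only: compound growth implied by a fixed doubling period.
# The 3.5-month doubling period is an assumed midpoint of the
# "three to four months" figure cited above.
def growth_factor(doubling_months: float, horizon_months: float) -> float:
    """Return the total growth multiple over a horizon, given a doubling period."""
    return 2 ** (horizon_months / doubling_months)

one_year = growth_factor(3.5, 12)   # roughly 10.8x in a single year
two_years = growth_factor(3.5, 24)  # roughly 116x over two years
print(f"{one_year:.1f}x per year, {two_years:.0f}x over two years")
```

At that pace, demand grows by an order of magnitude per year, which is why grid capacity, not chips, becomes the binding constraint.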

2. How are Amazon, Google, and Microsoft addressing massive AI power requirements?

The hyperscalers are turning to nuclear energy after years of renewable investments. Microsoft is exploring reopening Three Mile Island, while Amazon and Google are investing in small nuclear reactors. Nuclear offers consistent baseload power that renewable sources can’t reliably provide. They’re also building data centers near power sources rather than near communication hubs—old industrial Midwest sites are now prime locations simply because they have available power infrastructure, marking a fundamental shift in data center location strategy.

3. What opportunity do telcos have with AI inferencing at the edge?

Telcos have significant advantages for edge AI: country-specific presence to meet data sovereignty requirements, last-mile customer connections, and existing infrastructure. As Roy Illsley from Omdia explains, this represents a massive opportunity similar to cloud computing in 2010, which telcos missed. By serving AI inferencing at the edge and meeting sovereignty requirements, telcos could breathe new life into their business models using assets they already own. However, they face competition from containerized data centers that can be quickly deployed anywhere there is power. Success requires telcos to execute differently than they did with cloud computing.

4. Why did telcos fail to capitalize on the cloud opportunity, and are they repeating this mistake with edge AI?

Telcos approached cloud as an infrastructure play when it was really about making infrastructure programmable through software. As Danielle Rios explains in the episode, hyperscalers succeeded by hiding complexity behind software layers and letting customers build with simple API calls, while telcos kept selling raw capacity. This meant hyperscalers evolved at software speed while telcos moved at hardware speed. Now with edge AI, telcos are again focusing on physical assets—real estate and networks—while missing that AI workloads need sophisticated orchestration software. Without this software capability, telcos risk becoming the “dumb pipes” of AI computing.

5. How does Danielle Rios at TelcoDR help telcos avoid repeating past cloud computing mistakes?

DR emphasizes that telcos need to be brutally honest about their ability to execute on edge AI opportunities. She highlights that making money from AI inferencing hinges on having sophisticated software capabilities to orchestrate where models run, how they scale, and how to balance costs versus latency—not just physical infrastructure. In her podcast takeaway, she references Jim Abolt’s insight from Episode 1: “A compelling vision without the ability to execute is just a hallucination.” DR invites telco executives to discuss realistic revenue growth strategies rather than chasing pipe dreams.

6. How are Europe’s data sovereignty requirements affecting AI development amid power constraints?

European countries are moving toward stricter data sovereignty rather than opening borders, particularly because everyone wants “sovereign AI” to develop and own IP domestically. Companies like Oracle are investing $5 billion over five years in the UK for this reason. However, power constraints create challenges—Germany’s old, fragmented grid can’t support the AI data centers it wants, forcing German AI work to Nordic countries with better power availability. France is an exception with 90% nuclear power generation, giving it a significant advantage in meeting AI energy demands while maintaining sovereignty.