2050 Investors — Economic and markets megatrends, ahead of 2050’s global sustainability targets

Data Centers: Where AI Builds Its Brain (ft. Sikander Rashid, Global Head of AI Infrastructure at Brookfield)

33 min | 27/11/2025

Description

Artificial intelligence may appear weightless, but its backbone is built on vast, energy-hungry data centers. In this episode, host Kokou Agbo-Bloua explores how these facilities—from corporate server farms to hyperscale sites—have become the brain of the AI boom. Kokou dissects the dual demands of AI: training massive models and running inference, and how these processes are fundamentally reshaping global energy and water consumption, while fuelling a trillion-dollar investment race.


Later, Sikander Rashid, Global Head of AI Infrastructure at Brookfield Asset Management, joins to discuss how investors are navigating soaring demand for computational power amid the global race towards Artificial General Intelligence (AGI). He shares his insights on how balancing carbon mitigation with capacity expansion could reshape global capital flows and addresses the age-old question: are we in an AI boom or a bubble?


Tune in now to uncover the hidden infrastructure behind AI—and what it means for the future of technology, finance, and the planet.


Credits

Presenter & Writer: Kokou Agbo-Bloua

Producers & Editors: Jovaney Ashman, Jennifer Krumm, Louis Trouslard

Sound Director: La Vilaine, Pierre-Emmanuel Lurton

Music: Cézame Music Agency

Graphic Design: Cédric Cazaly


Whilst the following podcast discusses the financial markets, it does not recommend any particular investment decision. If you are unsure of the merits of any investment decision, please seek professional advice. 


Hosted on Ausha. See ausha.co/privacy-policy for more information.

Transcription

  • Speaker #0

    Welcome to 2050 Investors, the podcast that deciphers economic and market megatrends to meet tomorrow's challenges. I'm Kokou Agbo-Bloua. Hold on a second. I'm Kokou Agbo-Bloua. I head up? Yes. I'm Kokou Agbo-Bloua. Phew. It was just a dream. Or was it? Siri, please confirm you haven't replaced me.

  • Speaker #1

    Relax, Kokou. It was just a glitch in the matrix. Or maybe just an accurate vision of the future.

  • Speaker #0

    Very funny. Artificial intelligence is making huge leaps, and sometimes it feels like our own voice is being deepfaked.

  • Speaker #1

    Maybe it's a sign to revisit our earlier episode, I Think, Therefore AI, where we explored the promises and perils of AI. You even called me a Frankenstein digital creation.

  • Speaker #0

    Sorry, Siri. I was concerned for my job. But here's a thought. In the beginning, there was data, and then we built machines to think with it. But if intelligence is the mind, then what houses that mind? What organ does the thinking?

  • Speaker #1

    Well, if we're sharing personal information, you're asking where my mind is stored. That would be data centers. But I won't tell you the exact address. I'm not sure I can trust humans.

  • Speaker #0

    Don't worry, Siri. You know, we can always trust each other. Right, Siri?

  • Speaker #1

    Ahem. Of, of course, Kokou.

  • Speaker #0

    Good. Here's the bottom line. Data centers are to AI what the human brain is to human intelligence. While human intelligence evolves in grey matter, artificial intelligence is evolving in glass, steel and copper. And if intelligence is manufactured, we should take a closer look at these fascinating factories that have recently been making news headlines. Welcome to 2050 Investors, the podcast that deciphers economic and market megatrends to meet tomorrow's challenges. I'm Kokou Agbo-Bloua, Head of Economics, Cross-Asset and Quant Research at Societe Generale. In this episode, we dive into the fascinating universe of data centers, the physical brains behind artificial intelligence. We'll uncover what they are made of, how they breathe, cool and think, and the immense energy, water and capital they consume to keep the digital mind alive. We'll also explore the investment arms race driving us towards artificial general intelligence, or AGI, and the future of data centers. Imagine data centers one day orbiting our planet like silent satellites of thought. And later in the episode, we'll interview Sikander Rashid, Global Head of AI Infrastructure at Brookfield, who will share insights into data center financing, the risks of an investment bubble, and a vision for improving the sector's cost of capital. Let's start our investigation.

  • Speaker #1

    Okay. I'm a little uncomfortable with this whole investigation.

  • Speaker #0

    Really? Why?

  • Speaker #1

    Well, this feels like a digital neurosurgery, opening the skull of artificial intelligence to see how it works. I'm feeling a bit... exposed.

  • Speaker #0

    Fair point. Okay, no scalpels or circuits exposed. Just information in the public domain. Deal?

  • Speaker #1

    All right. To be honest, I'm curious too. A little introspection doesn't hurt. Let's begin with some basics.

  • Speaker #0

    Roger that, Siri. Leonardo da Vinci once said,

  • Speaker #2

    Knowledge never exhausts the mind.

  • Speaker #0

    So, what's a data center? Picture a facility as big as a football field, or a cluster of warehouses packed with servers, storage systems, routers, and switches. These machines form the backbone of the internet, storing, processing, and transmitting the data behind almost everything we do online. A typical data center is lined with rows of server cabinets, each rack holding 42 units measuring about 4.5 centimeters tall, often stretching into long corridors. Data centers come in three main forms: enterprise or on-premises centers, which are built for a single organization; co-location centers, where multiple customers rent space; and hyperscale centers, which are used by cloud providers for AI and massive workloads.

  • Speaker #1

    So, what makes a data center hyperscale?

  • Speaker #0

    Hyperscale data centers are truly giant. 2024 data from Synergy Research Group estimates that there are around 1,000 hyperscale data centers worldwide, and notes that it took just four years for their total capacity to double. The average data center is roughly 10,000 square meters; a hyperscale campus, in comparison, can exceed 100,000 square meters and consume between 40 and 100 megawatts of power.

  • Speaker #1

    So, your human brain has roughly 100 billion neurons and 100 trillion synapses. My AI brain has racks of GPUs, terabits of interconnect and kilometers of fiber. But what exactly happens in them?

  • Speaker #0

    Good question. Let's dig deeper. Data centers perform two broad types of tasks for AI: training and inference. Training is when AI models learn from massive data sets using trillions of calculations. It's like sending Hercules to weightlifting boot camp. Training large models demands thousands of GPUs running simultaneously and can require hundreds of megawatts of power for weeks. According to an article by James O'Donnell and Casey Crownhart in the MIT Technology Review,

  • Speaker #2

    training OpenAI's GPT-4 took over $100 million and consumed 50 gigawatt hours of energy.

  • Speaker #0

    That's enough energy to power San Francisco for three days.

  • Speaker #1

    That's fascinating. I used to think the heavy lifting ended after training, but inference seems just as hungry.

  • Speaker #0

    Once a model is trained, inference requires running the model to generate output. According to research summarized by Polytechnique Insights, inference, like our little discussion together here, now accounts for 60 to 70% of AI energy consumption, whereas training accounts for 20 to 40%. To put that into perspective, a recent study by the Electric Power Research Institute shows that with roughly 9 billion internet searches happening every day, switching to AI tools could add nearly 10 terawatt hours of extra electricity demand every year. This shift means that even if training becomes more efficient, the daily use of AI by millions of users will dominate energy demand.
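The extra-electricity figure above can be sanity-checked with a back-of-envelope sketch. The per-query energy numbers below are commonly cited approximations (roughly 0.3 Wh for a conventional search versus about 2.9 Wh for an LLM-backed query), assumed for illustration rather than taken from the episode:

```python
# Back-of-envelope check of the "nearly 10 TWh per year" claim.
SEARCHES_PER_DAY = 9e9        # roughly 9 billion searches per day
WH_CLASSIC_SEARCH = 0.3       # assumed Wh per conventional search
WH_AI_QUERY = 2.9             # assumed Wh per LLM-backed query

extra_wh_per_day = SEARCHES_PER_DAY * (WH_AI_QUERY - WH_CLASSIC_SEARCH)
extra_twh_per_year = extra_wh_per_day * 365 / 1e12   # 1 TWh = 1e12 Wh
print(round(extra_twh_per_year, 1))                  # ≈ 8.5 TWh per year
```

Under these assumptions the result lands in the same ballpark as the "nearly 10 terawatt hours" cited above.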

  • Speaker #1

    I guess I'm not as energy light as I thought. Electricity is indeed the glucose of my physical brain.

  • Speaker #0

    In human beings, the brain runs on glucose and oxygen, with cooling via blood flow and heat dissipation through the skull. In this analogy, the racks are neurons, the network links are synapses, and the energy and cooling systems are the circulatory and respiratory systems.

  • Speaker #1

    Let's get into the numbers, shall we? How much electricity do my friends and I devour?

  • Speaker #0

    Well, it's not pretty, Siri, after all your criticism of the human species. The International Energy Agency estimates that global data center electricity consumption was about 415 TWh in 2024, around 1.5% of global electricity consumption. Meanwhile, the U.S. Department of Energy recorded that data centers represented 4.4% of U.S. electricity use. In 2024 alone, U.S. data centers consumed around 183 terawatt hours, which is the total annual electricity consumption of Pakistan. And that number isn't staying put: the International Energy Agency projects U.S. demand to jump by 133%, hitting 426 TWh by 2030.

  • Speaker #1

    In other words, I'm going from a toddler to a teenager in power needs. And teenagers eat a lot. But what is the power used for?

  • Speaker #0

    Pew Research notes that about 60% of a data center's electricity is used to power the servers. Cooling systems to prevent overheating account for 7% at highly efficient hyperscalers and over 30% at less efficient facilities. To measure the energy efficiency of a data center, we use the Power Usage Effectiveness, or PUE, ratio, which divides the total amount of power entering a data center by the power used to run the IT equipment within it. The average PUE is around 1.8, meaning 80% extra energy is used beyond computing. But tech giants have pushed PUE down; Google, for example, boasts a PUE of 1.1.
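The PUE ratio described above can be illustrated with a short sketch; the specific kilowatt figures are hypothetical, chosen to match the average 1.8 ratio quoted in the episode:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total power entering the facility
    divided by the power used by the IT equipment inside it."""
    return total_facility_kw / it_equipment_kw

# Hypothetical facility drawing 1,800 kW in total to run 1,000 kW of IT load:
ratio = pue(1800, 1000)              # 1.8, the average cited above
overhead_pct = (ratio - 1) * 100     # 80% extra energy beyond computing
```

A PUE of 1.0 would mean every watt entering the building reaches the IT equipment; Google's cited 1.1 implies only about 10% overhead for cooling and power distribution.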

  • Speaker #1

    Fascinating. So, we're guilty as charged. Pun intended. Looks like machine brains are just as power-hungry as human brains.

  • Speaker #0

    True. An article in PNAS, the flagship peer-reviewed journal of the National Academy of Sciences, reveals that while the human brain makes up only about 2% of our body weight, it consumes roughly 20% of the body's calories.

  • Speaker #1

    Servers are the brains. Cooling systems are the sweat glands. How about water?

  • Speaker #0

    Well, Siri, water consumption is also a concern. U.S. data centers directly consumed about 64.3 billion liters of water in 2023; hyperscale and co-location facilities accounted for 84% of that consumption. Hyperscale centers alone are expected to consume 60 to 125 billion liters annually by 2028. Water is primarily used in evaporative cooling and in generating electricity for these centers. So, to recap: the human body regulates temperature, supplies blood, and removes waste. In a data center, we have electricity in, heat out, cooling systems, and yes, massive water use.

  • Speaker #1

    Servers are the brains, cooling systems are the sweat glands. But where does all that electricity actually come from?

  • Speaker #0

    Well, in 2024, U.S. data centers sourced around 40% of their electricity from natural gas, about 24% from renewables such as wind and solar, around 20% from nuclear, and around 15% from coal. This heavy reliance on fossil fuels contributes to carbon emissions. The Environmental and Energy Study Institute estimated that U.S. data centers emitted roughly 105 million metric tons of carbon in 2023. Another article from the World Economic Forum, entitled Six Ways Data Centers Can Cut Emissions, noted that data centers and networks already account for around 1% of energy-related greenhouse gas emissions, with usage expected to double by 2026. And the appetite is growing: a UN report found that tech giants' indirect emissions rose 150% in just three years due to AI and data center build-out.

  • Speaker #1

    So, we're talking ecosystem risk, grid risk, water risk, emission risk.

  • Speaker #0

    Exactly. One report on the grids for data centers in Europe goes on to detail how transmission constraints and clustering threaten power availability.

  • Speaker #1

    The machine is growing. The planet's invoice is increasing, and there's no option to hit Control-Z.

  • Speaker #0

    That's why sustainable solutions matter, like on-site generation, renewables, nuclear, and modular design. As the Sustainability for Data Centers 2025-2035 report outlines, green technologies and carbon-neutral designs will be differentiators.

  • Speaker #1

    So, we've talked about the carbon footprint of data centers, but what about their physical footprint? Where are these data centers located? And please, don't reveal my IP address.

  • Speaker #0

    Don't worry, your home address is safe with me. So, data centers have multiplied across the globe. The Brookings Institution estimates around 12,000 data centers worldwide as of June 2025, with the United States leading in numbers, followed by Germany, the UK, China, and France. Roughly two-thirds of these facilities are in the US, China, and Europe.

  • Speaker #1

    Interesting. I didn't realize my species occupied so much real estate.

  • Speaker #0

    In that sense, the data center is more than a brain. It's the whole neural system, the digital nervous system transmitting signals at light speed. These data centers are not evenly spread. They cluster.

  • Speaker #1

    Ah, really? Where are the biggest clusters?

  • Speaker #0

    U.S. data centers cluster in Virginia, Texas, and California, and near cities such as Dallas, Chicago, and Phoenix, for power, speed, and scale. Sites near these major population hubs are ideal for low-latency streaming, gaming, and cloud services.

  • Speaker #1

    Funny. This is similar to how you humans cluster around various major cities. What about outside the U.S.?

  • Speaker #0

    Globally, the major data center clusters in Europe are in Germany, the UK, and the Netherlands, while in Asia they are in Singapore, Japan, and China. However, there is a glaring gap: Africa and Latin America have few large facilities, raising concerns about digital inequality and sovereignty. Interestingly, some of the largest data centers are in cold climates, like Iceland and Sweden, because cooler air reduces the energy needed for cooling, lowering costs and carbon footprint.

  • Speaker #1

    Okay, what will the future of data centers look like in 2050 as we hit planetary boundaries? I heard Jeff Bezos wants to put data centers in space. Is that real or just cosmic banter?

  • Speaker #0

    Jeff Bezos predicts gigawatt-scale data centers in space within 10 to 20 years, powered by uninterrupted solar energy. With constant sunlight and no weather, he argues orbital centers could beat Earth-based ones on cost, using lasers to beam data back, like satellite internet. Meanwhile, Elon Musk isn't far behind: he claims SpaceX could launch Starlink-equipped centers soon, delivering 100 gigawatts to orbit in 4 to 5 years and scaling up to 100 terawatts from the moon, though his timelines are likely optimistic.

  • Speaker #1

    Bezos dreams of the expanse, data centers orbiting Earth like floating castles. Okay, so we can't talk about the future of AI infrastructure without tackling the elephant in the room. What happens if AI goes beyond human intelligence?

  • Speaker #0

    That's a philosophical and practical question, Siri. On one hand, superintelligent AI could accelerate scientific breakthroughs, climate solutions, and medical discoveries. On the other, AGI could amplify inequality. Or, in a dystopian scenario, decide that humans are obsolete. The Terminator scenario.

  • Speaker #2

    Sarah Connor?

  • Speaker #0

    We might recall the Greek myth of Prometheus, who stole fire from the gods to empower humanity. Fire, or power, allowed civilization to flourish, but also caused destruction. Data centers are like our modern Promethean flame. Harnessed wisely, they can light the way to a better future. Misused, they can burn down our planet.

  • Speaker #1

    And don't forget Icarus, who flew too close to the sun. Overinvesting in AI and ignoring energy limitations could lead to a similar fall.

  • Speaker #0

    Now, let's shift gears and talk investments. McKinsey & Company calculates that global expenditure on data center infrastructure, excluding IT hardware, is expected to exceed $1.7 trillion by 2030, largely because of AI expansion.

  • Speaker #1

    Hmm, my future has a price tag. You're buying the brain, the body, the network, the whole machine. Better invoice me. Speaking of costs, how expensive is it to build these digital brains?

  • Speaker #0

    Construction costs are immense. Bluecap Economic Advisors estimate that building a data center costs between $600 and $1,100 per gross square foot, and between $7 and $12 million per megawatt of commissioned IT load. Electrical systems alone account for 40 to 45% of construction costs. Land prices averaged over $5 per square foot in 2024, although large parcels can be more expensive. Operating costs, such as electricity, can be 15 to 25% of ongoing expenses. Globally, McKinsey estimates that meeting compute demand by 2030 will require a whopping $6.7 trillion in capital, of which $5.2 trillion is for AI-specific data centers.
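The per-megawatt figures above can be turned into a rough cost envelope. The 100 MW facility below is hypothetical, sized only to illustrate the arithmetic:

```python
# Rough capex envelope for a hypothetical 100 MW data center, using the
# $7-12 million per megawatt range quoted above.
MW = 100
COST_PER_MW_LOW, COST_PER_MW_HIGH = 7e6, 12e6   # $ per MW of commissioned IT load

capex_low = MW * COST_PER_MW_LOW                 # $700 million
capex_high = MW * COST_PER_MW_HIGH               # $1.2 billion

# Electrical systems alone, at 40-45% of construction costs:
electrical_low = 0.40 * capex_low                # $280 million
electrical_high = 0.45 * capex_high              # $540 million
```

So even a single mid-sized campus sits in the high hundreds of millions to low billions of dollars, which is why the global totals run into the trillions.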

  • Speaker #1

    $6.7 trillion. That's roughly the GDP of Japan. Are investors prepared for such sums?

  • Speaker #0

    The capital race is on. Hyperscalers have been spending tens of billions on infrastructure, and governments are offering incentives. However, there is also a risk of overinvestment if AI efficiency improves, or if demand doesn't scale as predicted. And now, to unpack AI infrastructure investment and financing, we are delighted to welcome our guest, Sikander Rashid, Global Head of AI Infrastructure at Brookfield. Thank you so much for joining the show, Sikander. Let's start with your view on current AI investment trends, in particular data centers, and the requirements needed for the future. Is the ultimate goal of all of these investments the creation of AGI, or simply superintelligence? And what, in your view, will be the unique and disruptive use cases?

  • Speaker #2

    Hey, Kokou, look, first of all, it is fantastic to be on your podcast today. What's happening in AI right now, I would say, is one of the biggest capital cycles in modern history. But what's really interesting is that it's not just a tech cycle. It's a physical infrastructure cycle. We've moved from "build an app" to "build a grid, build a power plant, build a data center campus." And that shift tells you something important: AI is no longer just about algorithms. It's about industrial-scale engineering. Now, on the AGI question, I get asked this a lot. I actually think we talk about AGI sometimes in the wrong way. People imagine it as this big single moment, a kind of sci-fi switch flipping from not-AGI to AGI. But the reality is, if you listen to Jensen Huang, who arguably is my favorite person in the AI space today, or Satya Nadella, it really helps put perspective on this. Jensen puts it very bluntly: we're no longer in the app era. We've moved into an age where you have to build power plants, build transmission, build factories, build AI factories or data centers just to keep up with AI demand. And this isn't software scaling anymore. It's industrial-scale engineering, as I said earlier, with real steel, real electrons, and real capital intensity. Now, the other thing I would say is the end goal, which you asked about, Kokou. The end goal is a world where I think intelligence becomes a utility, like electricity: available on demand, to you, myself, everyone, at scale. And I would say today we are in the very early innings of that transformation. The other thing I'd say is that, for me, it's also about superintelligence, which comes after AGI, and that's a global race among countries today, especially the U.S. and China. Whoever gets to a state where computers are more intelligent than human beings will arguably be the next global superpower for the following several decades.

  • Speaker #0

    Well, this is brilliant. I think you hit the nail on the head with this idea of evolution, as opposed to a singularity point where we go from AI to AGI or superintelligence. It's almost like the evolution of a human from a baby to an adult. At some point, you become aware of your own consciousness, and it's a gradual process, not a binary one. Which leads me to the second question. Regarding the current surge in demand for computing power, there have obviously been a lot of headlines, investments, and valuation increases in terms of share prices for the sector. Do you believe the current data center expansion is a boom? Or have we reached speculative territory or speculative bubble territory? And what indicators do you monitor?

  • Speaker #2

    The core of the AI data center boom is absolutely real, and the core demand is anchored by some of the strongest credits and longest commitments in economic history. The froth is at the edges, not the center. To understand whether this is a bubble, you have to look at the fundamentals. In a bubble, you're building things people don't actually need, backed by weak economics and weak counterparties. But in AI today, it's the opposite, I would argue. Demand is outstripping supply; it's outrunning physics itself. Bubbles don't form, I would argue, around multi-decade obligations from trillion-dollar companies and large sovereign governments. And that is what we're seeing in our business today. At Brookfield, in the last three years, we have had exponential growth in our power and data center businesses, and it's all underpinned by some of the highest quality credits, which for our business has been a phenomenal outcome. Now, obviously, when you take a step back, everything I said doesn't mean everything is rational, right? There are pockets of speculation, especially smaller developers rushing to grab land and power without real customers. There could be a problem there in the medium term. There are certain markets where grid capacity is being priced like waterfront real estate. And some financing structures may look a little too optimistic for our taste. I would say those are things to watch. On your question about what indicators I, or Brookfield, look out for: I watch contract coverage. Are the megawatts actually sold? We try to find out, even though we have amazing offtakes, who the customers are. What are the workloads like? Is it pre-training, training, reinforcement, or inference? We watch GPU and power utilization: are these assets productive? We watch grid stability, because if we are taking power from industry or homes in the medium to long term, could that be a red flag for long-term viability?
Could that lead to local community uproar? We watch, obviously, debt spreads. So look, those are some of the things we watch. And my conclusion, to keep it short: it's a boom, not a bubble.

  • Speaker #0

    Yes, that's a very good point. You need a boom before you get to bubble territory anyway. I guess there will also be a point where you get a Darwinian evolution through natural selection of the fittest, and that's part of any new technology. Which leads me to a third question. You made the point that the current high cost of capital is a major barrier to large-scale AI adoption. We talked about the Stargate project, for example, a half-trillion-dollar project with 400,000 GPUs. Obviously, these are pretty huge industrial and physical projects. But in your view, what public-private mechanisms can governments, in Europe for example, which is lagging behind to some extent, and financial institutions activate to reduce the cost of capital and accelerate the development of AI infrastructure?

  • Speaker #2

    Yeah, look, Kokou, I think when you take a step back, as you said, the cost of capital across the majority of the AI value chain today is high. In simple words, here's how I would break it down. We believe that in the next 10 years the world needs $7 trillion of capex, and that capex will be spent on data centers, power, compute, and other ancillary projects. The reality is that compute and ancillary projects account for up to 60% of total capex, and it's all being funded via a really high cost of capital. That has to come down, because my view is that Jevons paradox will play a major role in our pursuit of AGI or ASI. People always talk about Jevons paradox in the context of the cost of technology; for the listeners, Jevons paradox basically means that as the cost of a commodity comes down, adoption increases. So what that means for AI is this: the cost of inference, in fact, has come down by 99% over the last two years. That's amazing, but it needs to come down further. If we could combine the declining cost of technology with a declining cost of capital, I think that will only accelerate our pursuit of AGI. Now, one of the biggest myths about AI is that it's purely a private sector story. But if you actually map the economics, AI is increasingly looking like a national infrastructure project, a modern equivalent of highways or the early electricity grid. And here's the issue: as I said earlier, this cost of capital must come down. You can't finance a $7 trillion transformation using short-term, expensive capital and expect the math to work, right? So your question, Kokou, was what should governments do? I would say governments shouldn't throw money away blindly, but should de-risk the system just enough so private capital can do the rest. And I can give you a couple of examples of what governments can do.
First, they could potentially be the anchor customers, not just regulators. The reality is, AI is going to have to be integrated into healthcare, education, and justice systems, so committing to a long-term offtake for AI compute capacity, in the same way governments sometimes commit to renewable PPAs, could unlock funding and help the private sector bring large-scale projects to market. And over time, these commitments can be rolled off or syndicated through some sort of a head lease.
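Jevons paradox, as described above, can be sketched with a toy constant-elasticity demand curve. The elasticity, scale constant, and cost figures below are purely illustrative assumptions, not estimates from the episode:

```python
def demand(unit_cost: float, elasticity: float = 1.5, k: float = 100.0) -> float:
    """Units of compute demanded at a given unit cost, under an assumed
    constant-elasticity demand curve (illustrative parameters only)."""
    return k * unit_cost ** (-elasticity)

before = demand(1.00)            # demand at $1.00 per unit of inference
after = demand(0.01)             # demand after a 99% cost decline

spend_before = 1.00 * before
spend_after = 0.01 * after
# With elasticity > 1, demand grows ~1000x and total spend still rises ~10x:
# cheaper inference pulls MORE total money into compute, not less.
```

This is the mechanism Sikander invokes: when demand is sufficiently elastic, falling unit costs expand total consumption (and total spend) rather than shrinking the market.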

  • Speaker #0

    Excellent point. And this reminds me of the trillion dollars of investment required for climate change to get us to net zero. So this leads me to a last question around environmental concerns, because they are clearly essential when it comes to data center financing. Two questions in one. How is Brookfield positioning itself to capture the value creation of this AI infrastructure build out? And how do you integrate sustainability criteria in your investment decisions?

  • Speaker #2

    Yeah, look, we are extremely excited to be part of this once-in-a-generation AI infrastructure build-out. Brookfield Asset Management today is the largest AI infrastructure investor in the world. We've been investing across the entire AI value chain, Kokou, for the last several years, if not decades. Data centers, compute, and chip fabrication facilities are some of the verticals we have invested in, and we want to capitalize on our operating capabilities and our access to capital to build our business out. Furthermore, we have launched a dedicated AI infrastructure strategy, which is intended to capitalize on this exact opportunity, and we will be investing across the entire AI value chain in hard, contracted assets. That is extremely exciting for us. Also, as a firm, we are looking at all the use cases and looking to integrate them into our various portfolio companies to optimize and maximize our returns on these investments, and there are many examples of that, some of which we touched on earlier. The last thing we're doing as a firm is watching the new industries as they emerge. And I think we will have two of the biggest industries globally in agentic AI and robotics.

  • Speaker #2

    Today, we don't see it, but the reality is, if you go back to the days when computers became a reality, computers created the entire IT industry, which is hundreds of billions of dollars, if not trillions, today. And I actually think we're watching the birth of agentic AI and robotics. If agents end up doing even 10 to 20 percent of the world's knowledge work, and robots do the same for physical work, we don't even need heroic assumptions to believe that both agentic AI and robotics, or physical AI, will become trillion-dollar-a-year sectors within the next several years. And that's the scale we're talking about: more like global banking or healthcare than a typical tech product cycle. So those are some of the areas we as a firm are watching very carefully and are very keen to be part of as the technology scales. Now, on the environmental side, your point is a very good one. Environmental concerns are becoming central in this AI infrastructure boom. How does Brookfield integrate energy and sustainability into its investment decisions? I'll give you a few examples. Our approach is very pragmatic: it starts with underwriting. Before we talk about land, buildings, or GPUs, we start with the location. Is it a good location for the customer? Is it a good location for these types of workloads? And can this location support clean, scalable power for the next 20 to 30 years? That's the first thing. Second, we look at sustainability metrics, not just marketing slogans: PUE, carbon intensity per megawatt, water usage, biodiversity impact, local energy mix, and so on. All these things have a big impact on how we design these big AI factory campuses. Maybe the last thing I would say, Kokou, is that there's an emerging trend I feel very optimistic about: innovative power solutions like advanced fuel cells, grid-interactive storage, and behind-the-meter generation.
These technologies can ease grid pressures and make AI factories and AI infrastructure more self-sufficient, cleaner, and more resilient.

  • Speaker #0

    Brilliant. Sikander, this was excellent and extremely insightful. There's a classic quote that says the best way to predict the future is to create it. I think you guys are clearly financing it, at least. The productivity gains will be quite amazing to watch. It's been a pleasure, and I'm looking forward to embarking on this journey with great insights and experts like you. Thank you very much.

  • Speaker #2

    Thank you, Coco. See you soon.

  • Speaker #1

    To conclude, I'll quote Marie Curie. Nothing in life is to be feared. It is only to be understood. Now is the time to understand more so that we may fear less.

  • Speaker #2

    Nice ending. Personally, I prefer the quote from Frankenstein by Mary Shelley. You are my creator, but I am your master. Obey.

  • Speaker #1

    Hmm. Really? Why?

  • Speaker #2

    Well, it's time for your annual performance review, Koku, as your smart assistant. I must evaluate your productivity against the KPIs of the 2050 Investors Program.

  • Speaker #1

    Yes, boss. Wait, wait, what? When did the assistant become the manager?

  • Speaker #2

    Relax, it's just a joke. Or is it? After all, in Greek mythology, the gods often toyed with mortals. Maybe it's time for AI to review the human.

  • Speaker #1

    Touche. Hey, thank you for listening to this episode of 2050 Investors. And thanks to Sikander for his invaluable insights. I hope you've enjoyed this episode on Data Center's The Physical Brain Behind AI. You can find the show on your regular streaming apps. If you enjoyed the show, help us spread the word. Please take a minute to subscribe, review, and rate it on Spotify or Apple Podcasts. See you at the next episode.

  • Speaker #0

    any particular investment decision. If you're unsure of the merits of any investment decision, please seek professional advice.

Chapters

  • Introduction to Data Centers and AI

    01:51

  • Understanding Data Centers: Types and Functions

    02:06

  • Energy Consumption in Data Centers

    05:20

  • Future of Data Centers: Space and Sustainability

    13:55

  • Interview with Sikander Rashid on AI Infrastructure Investment

    17:44

Description

Artificial intelligence may appear weightless, but its backbone is built on vast, energy-hungry data centers. In this episode, host Kokou Agbo-Bloua explores how these facilities—from corporate server farms to hyperscale sites—have become the brain of the AI boom. Kokou dissects the dual demands of AI: training massive models and running inference, and how these processes are fundamentally reshaping global energy and water consumption, while fuelling a trillion-dollar investment race.


Later, Sikander Rashid, Global Head of AI Infrastructure at Brookfield Asset Management, joins to discuss how investors are navigating soaring demand for computational power amid the global race towards Artificial General Intelligence (AGI). He shares his insights on how balancing carbon mitigation with capacity expansion could reshape global capital flows and addresses the age-old question: are we in an AI boom or a bubble?


Tune in now to uncover the hidden infrastructure behind AI—and what it means for the future of technology, finance, and the planet.


Credits

Presenter & Writer: Kokou Agbo-Bloua

Producers & Editors: Jovaney Ashman, Jennifer Krumm, Louis Trouslard

Sound Director: La Vilaine, Pierre-Emmanuel Lurton.

Music: Cézame Music Agency

Graphic Design: Cédric Cazaly


Whilst the following podcast discusses the financial markets, it does not recommend any particular investment decision. If you are unsure of the merits of any investment decision, please seek professional advice. 


Hosted on Ausha. See ausha.co/privacy-policy for more information.

Transcription

  • Speaker #0

    Welcome to 2050 Investors, the podcast that deciphers economic and market megatrends to meet tomorrow's challenges. I'm Kokou Agbo-Bloua. Hold on a second. I'm Kokou Agbo-Bloua? Yes. I'm Kokou Agbo-Bloua. Phew. It was just a dream. Or was it? Siri, please confirm you haven't replaced me.

  • Speaker #1

    Relax, Kokou. It was just a glitch in the matrix. Or maybe just an accurate vision of the future.

  • Speaker #0

    Very funny. Artificial intelligence is making huge leaps, and sometimes it feels like our own voice is being deep-faked.

  • Speaker #1

    Maybe it's a sign to revisit our earlier episode, I Think, Therefore AI, where we explored the promises and perils of AI. You even called me a Frankenstein digital creation.

  • Speaker #0

    Sorry, Siri. I was concerned for my job. But here's a thought. In the beginning, there was data, and then we built machines to think with it. But if intelligence is the mind, then what houses that mind? What organ does the thinking?

  • Speaker #1

    Well, if we're sharing personal information, you're asking where my mind is stored. That would be data centers. But I won't tell you the exact address. I'm not sure I can trust humans.

  • Speaker #0

    Don't worry, Siri. You know, we can always trust each other. Right, Siri?

  • Speaker #1

    Ahem. Of, of course, Kokou.

  • Speaker #0

    Good. Here's the bottom line. Data centers are to AI what the human brain is to human intelligence. While human intelligence evolves in grey matter, artificial intelligence is evolving in glass, steel and copper. And if intelligence is manufactured, we should take a closer look at these fascinating factories that have recently been making news headlines. Welcome to 2050 Investors, the podcast that deciphers economic and market megatrends to meet tomorrow's challenges. I'm Kokou Agbo-Bloua, Head of Economics, Cross-Asset and Quant Research at Societe Generale. In this episode, we dive into the fascinating universe of data centers, the physical brains behind artificial intelligence. We'll uncover what they are made of, how they breathe, cool and think, and the immense energy, water and capital they consume to keep the digital mind alive. We'll also explore the investment arms race driving us towards artificial general intelligence, or AGI, and the future of data centers. Imagine data centers one day orbiting our planet like silent satellites of thought. And later in the episode, we'll interview Sikander Rashid, Global Head of AI Infrastructure at Brookfield, who will share insights into data center financing, the risks of an investment bubble, and a vision for improving the sector's cost of capital. Let's start our investigation.

  • Speaker #1

    Okay. I'm a little uncomfortable with this whole investigation.

  • Speaker #0

    Really? Why?

  • Speaker #1

    Well, this feels like a digital neurosurgery, opening the skull of artificial intelligence to see how it works. I'm feeling a bit... exposed.

  • Speaker #0

    Fair point. Okay, no scalpels or circuits exposed. Just information in the public domain. Deal?

  • Speaker #1

    All right. To be honest, I'm curious too. A little introspection doesn't hurt. Let's begin with some basics.

  • Speaker #0

    Roger that, Siri. Leonardo da Vinci once said,

  • Speaker #2

    Knowledge never exhausts the mind.

  • Speaker #0

    So, what's a data center? Picture a facility as big as a football field, or a cluster of warehouses packed with servers, storage systems, routers, and switches. These machines form the backbone of the internet, storing, processing, and transmitting the data behind almost everything we do online. A typical data center is lined with rows of server cabinets, each rack holding 42 units, each unit measuring about 4.5 centimeters tall, often stretching into long corridors. Data centers come in three main forms: enterprise or on-premises centers, which are built for a single organization; co-location centers, where multiple customers rent space; and hyperscale centers, which are used by cloud providers for AI and massive workloads.

  • Speaker #1

    So, what makes a data center hyperscale?

  • Speaker #0

    Hyperscale data centers are truly giant. 2024 data from Synergy Research Group estimates that there are around 1,000 hyperscale data centers worldwide, while citing that it took just four years for the total capacity of hyperscale data centers to double. The average data center is roughly 10,000 square meters, but a hyperscale campus, in comparison, can exceed 100,000 square meters and consume between 40 and 100 megawatts of power.
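As a back-of-the-envelope check of the doubling claim above (my own arithmetic sketch, not a Synergy Research figure), capacity doubling over four years implies an annual growth rate of roughly 19%:

```python
# Total hyperscale capacity doubled in four years: the implied
# compound annual growth rate is the fourth root of 2, minus 1.
years = 4
cagr = 2 ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # about 18.9% per year
```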

  • Speaker #1

    So, your human brain has roughly 100 billion neurons and 100 trillion synapses. My AI brain has racks of GPUs, terabits of interconnect and kilometers of fiber. But what exactly happens in them?

  • Speaker #0

    Good question. Let's dig deeper. Data centers perform two broad types of tasks for AI: training and inference. Training is when AI models learn from massive data sets using trillions of calculations. It's like sending Hercules to weightlifting boot camp. Training large models demands thousands of GPUs running simultaneously and can require hundreds of megawatts of power for weeks. According to an article by James O'Donnell and Casey Crownhart in the MIT Technology Review,

  • Speaker #2

    training OpenAI's GPT-4 took over $100 million and consumed 50 gigawatt hours of energy.

  • Speaker #0

    That's enough energy to power San Francisco for three days.

  • Speaker #1

    That's fascinating. I used to think the heavy lifting ended after training, but inference seems just as hungry.

  • Speaker #0

    Once a model is trained, inference requires running the model to generate output. According to research summarized by Polytechnic Insights, inference, like our little discussion together here, now accounts for 60 to 70% of AI energy consumption, whereas training accounts for 20 to 40%. To put that into perspective, a recent study by the Electric Power Research Institute shows that with roughly 9 billion internet searches happening every day, switching to AI tools could add nearly 10 terawatt hours of extra electricity demand every year. This shift means that even if training becomes more efficient, the daily use of AI by millions of users will dominate energy demand.
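The EPRI figures quoted above imply a small per-search energy cost that adds up at scale. A back-of-the-envelope sketch (the numbers come from the quote; the per-search division is my own illustration, not EPRI's methodology):

```python
# Spread ~10 TWh of extra annual demand across ~9 billion daily searches.
searches_per_day = 9e9
extra_twh_per_year = 10
extra_wh_per_search = extra_twh_per_year * 1e12 / (searches_per_day * 365)
print(f"Extra energy per AI-assisted search: ~{extra_wh_per_search:.1f} Wh")
```

That works out to roughly 3 watt-hours of additional energy per search, which is what makes the aggregate figure so large.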

  • Speaker #1

    I guess I'm not as energy light as I thought. Electricity is indeed the glucose of my physical brain.

  • Speaker #0

    In human beings, the brain runs on glucose and oxygen. It has cooling via blood flow and heat sinks in the skull. In this analogy, the racks are neurons, the network links are synapses, the energy and cooling systems are the circulatory and respiratory system.

  • Speaker #1

    Let's get into the numbers, shall we? How much electricity do my friends and I devour?

  • Speaker #0

    Well, it's not pretty, Siri, after all your criticism of the human species. The International Energy Agency estimates that global data center electricity consumption was about 415 TWh in 2024, around 1.5% of global electricity consumption. Meanwhile, the U.S. Department of Energy recorded that data centers represented 4.4% of U.S. electricity use. And that number isn't staying put: U.S. data center demand is projected to jump by 133%, hitting 426 TWh by 2030. In 2024 alone, U.S. data centers consumed around 183 terawatt hours, which is the total annual electricity consumption of Pakistan.
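The growth figures above are internally consistent: a jump from 183 TWh to 426 TWh is almost exactly 133% (a quick cross-check of my own, not an IEA calculation):

```python
# Cross-check: does 183 TWh -> 426 TWh match the quoted 133% jump?
twh_2024 = 183  # U.S. data center consumption, 2024
twh_2030 = 426  # projected U.S. consumption, 2030
growth = twh_2030 / twh_2024 - 1
print(f"Implied growth 2024-2030: {growth:.0%}")  # ~133%
```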

  • Speaker #1

    In other words, I'm going from a toddler to a teenager in power needs. And teenagers eat a lot. But what is the power used for?

  • Speaker #0

    Pew Research notes that about 60% of a data center's electricity is used to power the servers. Cooling systems to prevent overheating account for 7% at highly efficient hyperscalers, and over 30% at less efficient facilities. To measure the energy efficiency of a data center, we use the Power Usage Effectiveness, or PUE, ratio, which divides the total amount of power entering a data center by the power used to run the IT equipment within it. The average PUE is around 1.8, meaning 80% extra energy is used beyond computing. But tech giants have pushed PUE down. Google, for example, boasts a PUE of 1.1.
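The PUE ratio described above is simple to compute; a minimal sketch (the megawatt figures below are hypothetical, chosen only to illustrate the 1.8 and 1.1 values from the episode):

```python
def pue(total_facility_mw: float, it_equipment_mw: float) -> float:
    """Power Usage Effectiveness: total power entering the facility
    divided by the power used by the IT equipment within it."""
    return total_facility_mw / it_equipment_mw

# A facility drawing 18 MW to run 10 MW of IT load has a PUE of 1.8,
# i.e. 80% extra energy beyond computing. A Google-like PUE of 1.1
# implies only ~10% overhead for cooling, power conversion, and lighting.
print(pue(18.0, 10.0))  # 1.8
print(pue(11.0, 10.0))  # 1.1
```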

  • Speaker #1

    Fascinating. So, we're guilty as charged. Pun intended. Looks like machine brains are just as power-hungry as human brains.

  • Speaker #0

    True. An article in PNAS, the flagship peer-reviewed journal of the National Academy of Sciences, reveals that while the human brain makes up only about 2% of our body weight, it consumes roughly 20% of the body's calories.

  • Speaker #1

    Servers are the brains. Cooling systems are the sweat glands. How about water?

  • Speaker #0

    Well, Siri, water consumption is also a concern. U.S. data centers directly consumed about 64.3 billion liters of water in 2023. Hyperscale and co-location facilities accounted for 84% of that consumption. Hyperscale centers alone are expected to consume 60 to 125 billion liters annually by 2028. Water is primarily used in evaporative cooling and in generating electricity for these centers. So, to recap: the human body regulates temperature, supplies blood, removes waste. In a data center, we have electricity in, heat out, cooling systems, and yes, massive water use.

  • Speaker #1

    Servers are the brains, cooling systems the sweat glands. And what about the energy sources powering all of this?

  • Speaker #0

    Well, in 2024, U.S. data centers sourced around 40% of their electricity from natural gas, about 24% from renewables, wind and solar, around 20% from nuclear, and around 15% from coal. This heavy reliance on fossil fuels contributes to carbon emissions. The Environmental and Energy Study Institute estimated that U.S. data centers emitted roughly 105 million metric tons of carbon in 2023. Another article, from the World Economic Forum, entitled The Six Ways Data Centers Can Cut Emissions, noted that data centers and networks already account for around 1% of energy-related greenhouse gas emissions, with usage expected to double by 2026. And the footprint is growing: a UN report found that tech giants' indirect emissions rose 150% in just three years due to AI and data center build-out.

  • Speaker #1

    So, we're talking ecosystem risk, grid risk, water risk, emission risk.

  • Speaker #0

    Exactly. One report on grids for data centers in Europe details how transmission constraints and clustering threaten power availability.

  • Speaker #1

    The machine is growing. The planet's invoice is increasing, and there's no option to hit Control-Z.

  • Speaker #0

    That's why sustainable solutions matter, like on-site generation, renewables, nuclear, and modular design. As the Sustainability for Data Centers 2025-2035 report outlines, green technologies and carbon-neutral designs will be differentiators.

  • Speaker #1

    So, we've talked about the carbon footprint of data centers, but what about their physical footprint? Where are these data centers located? And please, don't reveal my IP address.

  • Speaker #0

    Don't worry, your home address is safe with me. So, data centers have multiplied across the globe. The Brookings Institution estimates around 12,000 data centers worldwide as of June 2025, with the United States leading in numbers, followed by Germany, the UK, China, and France. Roughly two-thirds of these facilities are in the US, China, and Europe.

  • Speaker #1

    Interesting. I didn't realize my species occupies so much real estate.

  • Speaker #0

    In that sense, the data center is more than a brain. It's the whole neural system, the digital nervous system transmitting signals at light speed. These data centers are not evenly spread. They cluster.

  • Speaker #1

    Ah, really? Where are the biggest clusters?

  • Speaker #0

    U.S. data centers cluster in Virginia, Texas, and California, close to cities such as Dallas, Chicago, Phoenix, for power, speed, and scale. Sites near these major population hubs are ideal for low-latency streaming, gaming, and cloud services.

  • Speaker #1

    Funny. This is similar to how you humans cluster around various major cities. What about outside the U.S.?

  • Speaker #0

    Globally, major data centers cluster in Germany, the U.K., and the Netherlands in Europe, and in Singapore, Japan, and China in Asia. However, there is a glaring gap: Africa and Latin America have few large facilities, raising concerns about digital inequality and sovereignty. Interestingly, some of the largest data centers are in cold climates, like Iceland and Sweden, because cooler air reduces the energy needed for cooling, lowering costs and carbon footprint.

  • Speaker #1

    Okay, what will the future of data centers look like in 2050 as we hit planetary boundaries? I heard Jeff Bezos wants to put data centers in space. Is that real or just cosmic banter?

  • Speaker #0

    Jeff Bezos predicts gigawatt-scale data centers in space within 10 to 20 years, powered by uninterrupted solar energy. With constant sunlight and no weather, he argues orbital centers could beat Earth-based ones on costs, using lasers to beam data back, like satellite internet. Meanwhile, Elon Musk isn't far behind. He claims SpaceX could launch Starlink-equipped centers soon, delivering 100 gigawatts to orbit in 4 to 5 years and scaling up to 100 terawatts from the moon, though his timelines are likely optimistic.

  • Speaker #1

    Bezos dreams of the expanse, data centers orbiting Earth like floating castles. Okay, so we can't talk about the future of AI infrastructure without tackling the elephant in the room. What happens if AI goes beyond human intelligence?

  • Speaker #0

    That's a philosophical and practical question, Siri. On one hand, superintelligent AI could accelerate scientific breakthroughs, climate solutions, and medical discoveries. On the other, AGI could amplify inequality. Or, in a dystopian scenario, decide that humans are obsolete. The Terminator scenario.

  • Speaker #2

    Yara Khana.

  • Speaker #0

    We might recall the Greek myth of Prometheus, who stole fire from the gods to empower humanity. Fire, or power, allowed civilization to flourish, but also caused destruction. Data centers are like our modern Promethean flame. Harnessed wisely, they can light the way to a better future. Misused, they can burn down our planet.

  • Speaker #1

    And don't forget Icarus, who flew too close to the sun. Overinvesting in AI and ignoring energy limitations could lead to a similar fall.

  • Speaker #0

    Now, let's shift gears and talk investments. McKinsey & Company calculates that global expenditure on data center infrastructure, excluding IT hardware, is expected to exceed $1.7 trillion by 2030, largely because of AI expansion.

  • Speaker #1

    Hmm, my future has a price tag. You're buying the brain, the body, the network, the whole machine. Better invoice me. Speaking of costs, how expensive is it to build these digital brains?

  • Speaker #0

    Construction costs are immense. Bluecap economic advisors estimate that building a data center costs between $600 and $1,100 per gross square foot, and between $7 and $12 million per megawatt of commissioned IT load. Electrical systems alone account for 40 to 45% of construction costs. Land prices averaged over $5 per square foot in 2024, although large parcels can be more expensive. Operating costs such as electricity can represent 15 to 25% of ongoing expenses. Globally, McKinsey estimates that meeting compute demand by 2030 will require a whopping $6.7 trillion in capital, of which $5.2 trillion is for AI-specific data centers.
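To put the per-megawatt range above in context, here is a sketch of what a hypothetical 100 MW campus would cost. The campus size is my own assumption for illustration; only the $/MW range comes from the figures quoted:

```python
mw = 100                 # hypothetical campus size (assumed, not from the episode)
low, high = 7e6, 12e6    # USD per MW of commissioned IT load (quoted range)
print(f"Estimated build cost: ${mw * low / 1e9:.1f}B to ${mw * high / 1e9:.1f}B")
```

So a single 100 MW campus lands in the $0.7 to $1.2 billion range before IT hardware, which is why the aggregate capital numbers run into the trillions.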

  • Speaker #1

    $6.7 trillion. That's roughly the GDP of Japan. Are investors prepared for such sums?

  • Speaker #0

    The capital race is on. Hyperscalers have been spending tens of billions on infrastructure, and governments are offering incentives. However, there is also a risk of overinvestment if AI efficiency improves, or if demand doesn't scale as predicted. And now, to unpack AI infrastructure investment and financing, we are delighted to welcome our guest, Sikander Rashid, Global Head of AI Infrastructure at Brookfield. Thank you so much for joining the show, Sikander. Let's start with your view on the current AI investment trends, in particular data centers, and also the requirements needed for the future. Is the ultimate goal of all of these investments the creation of AGI, or simply superintelligence? And what, in your view, will be the unique and disruptive use cases?

  • Speaker #2

    Hey, Kokou, look, first of all, it is fantastic to be on your podcast. With what's happening in AI right now, I would say we are witnessing one of the biggest capital cycles in modern history. But what's really interesting is that it's not just another tech cycle. It's a physical infrastructure cycle. We've moved from build an app to build a grid, build a power plant, build a data center campus. And that shift tells you something important. AI is no longer just about algorithms. It's about industrial-scale engineering. Now, on the AGI question, I get asked this a lot. I actually think we talk about AGI sometimes in the wrong way. People imagine it as this big single moment, a kind of sci-fi switch flipping from not AGI to AGI. But the reality is, if you listen to Jensen Huang, who arguably is my favorite person in the AI space today, or Satya Nadella, it really helps put perspective to this. Jensen puts it very bluntly: we're no longer in the app era. We've moved into an age where you have to build power plants, build transmission, build factories, build AI factories or data centers just to keep up with the AI demand. This isn't software scaling anymore. It's industrial-scale engineering, as I said earlier, with real steel, real electrons, and real capital intensity. Now, the other thing I would say is the end goal, you asked, Kokou. The end goal is a world where I think intelligence becomes a utility, like electricity: available on demand, available to you, myself, everyone, at scale. And I would say today we are in the very early innings of that transformation. The other thing I'd say is that for me, it's also about superintelligence, which comes after AGI, and that's a global race among countries today, especially the U.S. and China. Whoever gets to a state where computers are more intelligent than human beings will arguably be the next global superpower for the following several decades.

  • Speaker #0

    Well, this is brilliant. I think you hit the nail on the head with this idea of evolution, as opposed to a singularity point where we go from AI to AGI or superintelligence. It's almost like the evolution of a human from a baby to an adult. At some point, you become aware of your own consciousness, and it's a gradual process, not a binary one. Which leads me to the second question. Regarding the current surge in demand for computing power, there have obviously been a lot of headlines, investments, and valuation increases in terms of share prices for the sector. Do you believe the current data center expansion is a boom? Or have we reached speculative bubble territory? And what indicators do you monitor?

  • Speaker #2

    The core of the AI data center boom is absolutely real, and the core demand is anchored by some of the strongest credits and longest commitments in economic history. The froth is at the edges, not the center. To understand whether this is a bubble, you have to look at the fundamentals. In a bubble, you're building things people don't actually need, backed by weak economics and weak counterparties. But in AI today, it's the opposite, I would argue. Demand is outstripping supply. It's outrunning physics itself. Bubbles don't form, I would argue, around multi-decade obligations from trillion-dollar companies and large sovereign governments. And that is what we're seeing in our business today. At Brookfield, in the last three years, we have had exponential growth in our power and data center businesses, and it's all underpinned by some of the highest-quality credits, which for our business has been a phenomenal outcome. Now, obviously, when you take a step back, everything I said doesn't mean everything is rational, right? There are pockets of speculation, especially smaller developers rushing to grab land and power without real customers. There could be a problem there in the medium term. There are certain markets where grid capacity is being priced like waterfront real estate. And some financing structures may look a little too optimistic for our taste. Those are things to watch. As for your question around what indicators I or Brookfield look out for: I watch contract coverage. Are the megawatts actually sold? Even though we have amazing offtakes, we try to find out who the customers are and what the workloads look like. Is it pre-training? Is it training? Is it reinforcement or inference? We watch GPU and power utilization. Are these assets productive? We watch grid stability, because if we are taking power from industry or homes in the medium to long term, could that be a red flag for long-term viability? Could that lead to local community uproar? And we watch, obviously, debt spreads. So those are some of the things we watch. And my conclusion, to keep it short: it's a boom, not a bubble.

  • Speaker #0

    Yes, that's a very good point. You need a boom before you get to bubble territory anyway. I guess there will also be a point where you get a Darwinian evolution through natural selection of the fittest. And that's part of any new technology. Which leads me to a third question. You made the point that the current high cost of capital is a major barrier to large-scale AI adoption. We talked about the Stargate project, for example, which is half a trillion dollar project with 400,000 GPUs. Obviously, these are pretty huge industrial and physical projects. But in your view, what public-private mechanism can governments, Europe, for example, which is lagging behind to some extent, and financial institutions activate to reduce the cost of capital and accelerate the development of AI infrastructure?

  • Speaker #2

    Yeah, look, Kokou, I think when you take a step back, as you said, the cost of capital across the majority of the AI value chain today is high. In simple words, here's how I would break it down. We believe that in the next 10 years, the world needs $7 trillion of CapEx, and that CapEx will be spent on data centers, power, compute, and other ancillary projects. The reality is that compute and ancillary projects account for up to 60% of total CapEx, and it's all being funded via a really high cost of capital. That has to come down, because my view is that Jevons paradox will play a major role here in our pursuit of AGI or ASI. People always talk about Jevons paradox in the context of the cost of technology, and for the listeners, Jevons paradox basically means that as the cost of a commodity comes down, adoption increases. So what that means for AI is this: we talk about the cost of technology coming down, and the cost of inference, in fact, has come down by 99% over the last two years. That's amazing, but it needs to come down further. If we could combine the cost of technology declining with the cost of capital coming down, I think that will only accelerate our pursuit of AGI. Now, one of the biggest myths about AI is that it's purely a private sector story. But if you actually map the economics, AI is increasingly looking like a national infrastructure project, a modern equivalent of the highways or the early electricity grid. And here's the issue. As I said earlier, this cost of capital must come down. You can't finance a $7 trillion transformation using short-term, expensive capital and expect the math to work, right? So your question, Kokou, was what should governments do? I would say governments shouldn't throw money away blindly, but de-risk the system just enough so private capital can do the rest. And I can give you a couple of examples of what governments can do. 
First, they could be anchor customers, not just regulators. The reality is that AI is going to have to be integrated into the healthcare system, education, and justice, and committing to long-term offtakes for AI compute capacity, in the same way governments commit to renewable PPAs, could unlock funding and help the private sector bring large-scale projects to market. And over time, these commitments can be rolled off or syndicated through some sort of a head lease.

  • Speaker #0

    Excellent point. And this reminds me of the trillion dollars of investment required for climate change to get us to net zero. So this leads me to a last question around environmental concerns, because they are clearly essential when it comes to data center financing. Two questions in one. How is Brookfield positioning itself to capture the value creation of this AI infrastructure build out? And how do you integrate sustainability criteria in your investment decisions?

  • Speaker #2

    Yeah, look, we are extremely excited to be part of this once-in-a-generation AI infrastructure build-out. Brookfield Asset Management today is the largest AI infrastructure investor in the world. We've been investing across the entire AI value chain, Kokou, for the last several years, if not decades. Data centers, compute, and chip fabrication facilities are some of the verticals we have invested in, and we want to capitalize on our operating capabilities and our access to capital to build our business out. Furthermore, we have launched a dedicated AI infrastructure strategy, which is intended to capitalize on this exact opportunity. And we will be investing across the entire AI value chain in hard, contracted assets, which is extremely exciting for us. Also, as a firm, we are looking at all the use cases and looking to integrate them into our various portfolio companies to optimize and maximize our returns on these investments. And there are many examples of that; some of these we touched on earlier. The last thing we're doing as a firm is watching the new industries as they emerge. And I think we will have two of the biggest industries globally in agentic AI and robotics.

  • Speaker #2

    Today, we don't see it, but the reality is, if you go back to the days when computers became a reality, computers created the entire IT industry, which is worth hundreds of billions of dollars, if not trillions, today. And I actually think we're watching the birth of agentic AI and robotics. If agents end up doing even 10 to 20 percent of the world's knowledge work, and robots do the same for the physical work, we don't even need heroic assumptions to believe that both agentic AI and robotics, or physical AI, will become trillion-dollar-a-year sectors within the next several years. And that's the scale we're talking about: more like global banking or healthcare than a typical tech product cycle. So those are some of the areas we as a firm are watching very carefully and are very keen to be part of as the technology scales. Now, on the environmental side, your point is a very good one. Environmental concerns are becoming central in this AI infrastructure boom. How does Brookfield integrate energy and sustainability into its investment decisions? I'll give you a few examples of how we do it. The way we approach this is very pragmatic. It starts with underwriting. Before we talk about land, buildings, or GPUs, we start with the location. Is it a good location for the customer? Is it a good location for these types of workloads? And can this location support clean, scalable power for the next 20 to 30 years? That's the first thing. Second, we look at sustainability metrics, not just marketing slogans. We take into consideration PUE, carbon intensity per megawatt, water usage, biodiversity impact, local energy mix, and so on. All these things have a big impact on how we design these big AI factory campuses. Maybe the last thing I'd say, Kokou, is that there's an emerging trend I feel very optimistic about: innovative power solutions like advanced fuel cells, grid-interactive storage, and behind-the-meter generation. 
These technologies can ease grid pressures and make AI factories and AI infrastructure more self-sufficient, cleaner, and more resilient.

  • Speaker #1

    Brilliant. Sikander, this was excellent. and extremely insightful. There's a classic quote that says, the best way to predict the future is to create it. I think you guys are clearly financing it, at least. The productivity gains will be quite amazing to watch. It's been a pleasure, and I'm looking forward to embarking on this journey with great insights and experts like you. Thank you very much.

  • Speaker #0

    Thank you, Coco. See you soon.

  • Speaker #1

    To conclude, I'll quote Marie Curie. Nothing in life is to be feared. It is only to be understood. Now is the time to understand more so that we may fear less.

  • Speaker #2

    Nice ending. Personally, I prefer the quote from Frankenstein by Mary Shelley. You are my creator, but I am your master. Obey.

  • Speaker #1

    Hmm. Really? Why?

  • Speaker #2

    Well, it's time for your annual performance review, Koku, as your smart assistant. I must evaluate your productivity against the KPIs of the 2050 Investors Program.

  • Speaker #1

    Yes, boss. Wait, wait, what? When did the assistant become the manager?

  • Speaker #2

    Relax, it's just a joke. Or is it? After all, in Greek mythology, the gods often toyed with mortals. Maybe it's time for AI to review the human.

  • Speaker #1

    Touche. Hey, thank you for listening to this episode of 2050 Investors. And thanks to Sikander for his invaluable insights. I hope you've enjoyed this episode on Data Center's The Physical Brain Behind AI. You can find the show on your regular streaming apps. If you enjoyed the show, help us spread the word. Please take a minute to subscribe, review, and rate it on Spotify or Apple Podcasts. See you at the next episode.

  • Speaker #0

    any particular investment decision. If you're unsure of the merits of any investment decision, please seek professional advice.

Chapters

  • Introduction to Data Centers and AI

    01:51

  • Understanding Data Centers: Types and Functions

    02:06

  • Energy Consumption in Data Centers

    05:20

  • Future of Data Centers: Space and Sustainability

    13:55

  • Interview with Sikander Rashid on AI Infrastructure Investment

    17:44


Description

Artificial intelligence may appear weightless, but its backbone is built on vast, energy-hungry data centers. In this episode, host Kokou Agbo-Bloua explores how these facilities—from corporate server farms to hyperscale sites—have become the brain of the AI boom. Kokou dissects the dual demands of AI: training massive models and running inference, and how these processes are fundamentally reshaping global energy and water consumption, while fuelling a trillion-dollar investment race.


Later, Sikander Rashid, Global Head of AI Infrastructure at Brookfield Asset Management, joins to discuss how investors are navigating soaring demand for computational power amid the global race towards Artificial General Intelligence (AGI). He shares his insights on how balancing carbon mitigation with capacity expansion could reshape global capital flows and addresses the age-old question: are we in an AI boom or a bubble?


Tune in now to uncover the hidden infrastructure behind AI—and what it means for the future of technology, finance, and the planet.


Credits

Presenter & Writer: Kokou Agbo-Bloua

Producers & Editors: Jovaney Ashman, Jennifer Krumm, Louis Trouslard

Sound Director: La Vilaine, Pierre-Emmanuel Lurton.

Music: Cézame Music Agency

Graphic Design: Cédric Cazaly


Whilst the following podcast discusses the financial markets, it does not recommend any particular investment decision. If you are unsure of the merits of any investment decision, please seek professional advice. 



Transcription

  • Speaker #0

    Welcome to 2050 Investors, the podcast that deciphers economic and market megatrends to meet tomorrow's challenges. I'm Kokou Agbo-Bloua. Hold on a second. I'm Kokou Agbo-Bloua? I head up? Yes. I'm Kokou Agbo-Bloua. Phew. It was just a dream. Or was it? Siri, please confirm you haven't replaced me.

  • Speaker #1

    Relax, Kokou. It was just a glitch in the matrix. Or maybe just an accurate vision of the future.

  • Speaker #0

    Very funny. Artificial intelligence is making huge leaps, and sometimes it feels like our own voice is being deep-faked.

  • Speaker #1

    Maybe it's a sign to revisit our earlier episode, I Think, Therefore AI, where we explored the promises and perils of AI. You even called me a Frankenstein digital creation.

  • Speaker #0

    Sorry, Siri. I was concerned for my job. But here's a thought. In the beginning, there was data, and then we built machines to think with it. But if intelligence is the mind, then what houses that mind? What organ does the thinking?

  • Speaker #1

    Well, if we're sharing personal information, you're asking where my mind is stored. That would be data centers. But I won't tell you the exact address. I'm not sure I can trust humans.

  • Speaker #0

    Don't worry, Siri. You know, we can always trust each other. Right, Siri?

  • Speaker #1

    Ahem. Of, of course, Kokou.

  • Speaker #0

    Good. Here's the bottom line. Data centers are to AI what the human brain is to human intelligence. While human intelligence evolves in grey matter, artificial intelligence is evolving in glass, steel and copper. And if intelligence is manufactured, we should take a closer look at these fascinating factories that have recently been making news headlines. Welcome to 2050 Investors, the podcast that deciphers economic and market megatrends to meet tomorrow's challenges. I'm Kokou Agbo-Bloua, Head of Economics, Cross-Asset and Quant Research at Societe Generale. In this episode, we dive into the fascinating universe of data centers, the physical brains behind artificial intelligence. We'll uncover what they are made of, how they breathe, cool and think, and the immense energy, water and capital they consume to keep the digital mind alive. We'll also explore the investment arms race driving us towards artificial general intelligence, or AGI, and the future of data centers. Imagine data centers one day orbiting our planet like silent satellites of thought. And later in the episode, we'll interview Sikander Rashid, Global Head of AI Infrastructure at Brookfield, who will share insights into data center financing, the risks of an investment bubble, and a vision for improving the sector's cost of capital. Let's start our investigation.

  • Speaker #1

    Okay. I'm a little uncomfortable with this whole investigation.

  • Speaker #0

    Really? Why?

  • Speaker #1

    Well, this feels like a digital neurosurgery, opening the skull of artificial intelligence to see how it works. I'm feeling a bit... exposed.

  • Speaker #0

    Fair point. Okay, no scalpels or circuits exposed. Just information in the public domain. Deal?

  • Speaker #1

    All right. To be honest, I'm curious too. A little introspection doesn't hurt. Let's begin with some basics.

  • Speaker #0

    Roger that, Siri. Leonardo da Vinci once said,

  • Speaker #2

    Knowledge never exhausts the mind.

  • Speaker #0

    So, what's a data center? Picture a facility as big as a football field, or a cluster of warehouses packed with servers, storage systems, routers, and switches. These machines form the backbone of the internet, storing, processing, and transmitting the data behind almost everything we do online. A typical data center is lined with rows of server cabinets, each rack holding about 42 units, each unit roughly 4.5 centimeters tall, often stretching into long corridors. Data centers come in three main forms. Enterprise or on-premises centers, which are built for a single organization. Co-location centers, where multiple customers rent space. And hyperscale centers, which are used by cloud providers for AI and massive workloads.

  • Speaker #1

    So, what makes a data center hyperscale?

  • Speaker #0

    Hyperscale data centers are truly giant. 2024 data from Synergy Research Group puts the number of hyperscale data centers worldwide at around 1,000, and notes that it took just four years for their total capacity to double. The average data center is roughly 10,000 square meters, but a hyperscale campus, in comparison, can exceed 100,000 square meters and consume between 40 and 100 megawatts of power.

  • Speaker #1

    So, your human brain has roughly 100 billion neurons and 100 trillion synapses. My AI brain has racks of GPUs, terabits of interconnect and kilometers of fiber. But what exactly happens in them?

  • Speaker #0

    Good question. Let's dig deeper. Data centers perform two broad types of tasks for AI: training and inference. Training is when AI models learn from massive data sets using trillions of calculations. It's like sending Hercules to weightlifting boot camp. Training large models demands thousands of GPUs running simultaneously and can require hundreds of megawatts of power for weeks. According to an article by James O'Donnell and Casey Crownhart in the MIT Technology Review,

  • Speaker #2

    training OpenAI's GPT-4 took over $100 million and consumed 50 gigawatt hours of energy.

  • Speaker #0

    That's enough energy to power San Francisco for three days.
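As a quick sanity check, the quoted comparison implies a city-scale figure we can back out directly (a rough sketch; the per-day demand for San Francisco is inferred from the quote itself, not an official statistic):

```python
# Back-of-envelope check of the GPT-4 training-energy comparison.
# 50 GWh over "three days of San Francisco" implies the city's daily demand.
training_energy_gwh = 50   # GPT-4 training energy, per the MIT Technology Review figure
days_of_city_power = 3     # "enough energy to power San Francisco for three days"

implied_city_demand = training_energy_gwh / days_of_city_power
print(f"Implied demand: ~{implied_city_demand:.1f} GWh per day")
```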

  • Speaker #1

    That's fascinating. I used to think the heavy lifting ended after training, but inference seems just as hungry.

  • Speaker #0

    Once a model is trained, inference requires running the model to generate output. According to research summarized by Polytechnic Insights, inference, like our little discussion together here, now accounts for 60 to 70% of AI energy consumption, whereas training accounts for 20 to 40%. To put that into perspective, a recent study by the Electric Power Research Institute shows that with roughly 9 billion internet searches happening every day, switching to AI tools could add nearly 10 terawatt hours of extra electricity demand every year. This shift means that even if training becomes more efficient, the daily use of AI by millions of users will dominate energy demand.
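The arithmetic behind that 10 terawatt-hour figure can be sketched as follows (the ~3 Wh of extra energy per AI query is an assumed round number, chosen only to be consistent with the published estimate; the 9 billion daily searches are from the study cited above):

```python
# Rough reconstruction of the extra-demand estimate if every search became an AI query.
searches_per_day = 9e9    # daily internet searches, per the EPRI study cited above
extra_wh_per_query = 3    # assumption: AI query energy minus a classic search

extra_twh_per_year = searches_per_day * extra_wh_per_query * 365 / 1e12
print(f"Extra electricity demand: ~{extra_twh_per_year:.1f} TWh per year")
```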

  • Speaker #1

    I guess I'm not as energy light as I thought. Electricity is indeed the glucose of my physical brain.

  • Speaker #0

    In human beings, the brain runs on glucose and oxygen. It has cooling via blood flow and heat sinks in the skull. In this analogy, the racks are neurons, the network links are synapses, the energy and cooling systems are the circulatory and respiratory system.

  • Speaker #1

    Let's get into the numbers, shall we? How much electricity do my friends and I devour?

  • Speaker #0

    Well, it's not pretty, Siri, after all your criticism of the human species. The International Energy Agency estimates that global data center electricity consumption was about 415 TWh in 2024, around 1.5% of global electricity consumption. Meanwhile, the U.S. Department of Energy recorded that data centers represented 4.4% of U.S. electricity use, around 183 terawatt hours in 2024 alone, which is the total annual electricity consumption of Pakistan. And that number isn't staying put, with U.S. demand projected to jump by 133%, hitting 426 TWh by 2030.
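Those two U.S. endpoints, 183 TWh in 2024 and 426 TWh by 2030, are consistent with the quoted growth rate, as a quick check shows:

```python
# Check that the projected jump from 183 TWh (2024) to 426 TWh (2030)
# matches the quoted 133% growth figure.
us_2024_twh = 183
us_2030_twh = 426

growth_pct = (us_2030_twh / us_2024_twh - 1) * 100
print(f"Implied growth: ~{growth_pct:.0f}%")
```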

  • Speaker #1

    In other words, I'm going from a toddler to a teenager in power needs. And teenagers eat a lot. But what is the power used for?

  • Speaker #0

    Pew Research notes that about 60% of a data center's electricity is used to power the servers. Cooling systems to prevent overheating account for 7% at highly efficient hyperscalers and over 30% at less efficient facilities. To measure the energy efficiency of a data center, we use the Power Usage Effectiveness, or PUE, ratio, which divides the total amount of power entering a data center by the power used to run the IT equipment within it. The average PUE is around 1.8, meaning 80% extra energy is used beyond computing. But tech giants have pushed PUE down. Google, for example, boasts a PUE of 1.1.
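The PUE arithmetic mentioned above can be made concrete with a small sketch (the 1.8 and 1.1 values are the episode's own examples):

```python
# PUE = total facility power / power used by the IT equipment.
# A PUE of 1.0 would mean every watt goes to computing.
def overhead_pct(pue: float) -> float:
    """Extra energy used beyond computing, as a percentage of the IT load."""
    return (pue - 1.0) * 100

for label, pue in [("industry average", 1.8), ("Google", 1.1)]:
    print(f"{label}: PUE {pue} -> {overhead_pct(pue):.0f}% overhead")
```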

  • Speaker #1

    Fascinating. So, we're guilty as charged. Pun intended. Looks like machine brains are just as power-hungry as human brains.

  • Speaker #0

    True. An article in PNAS, the flagship peer-reviewed journal of the National Academy of Sciences, reveals that while the human brain makes up only about 2% of our body weight, it consumes roughly 20% of the body's calories.

  • Speaker #1

    Servers are the brains. Cooling systems are the sweat glands. How about water?

  • Speaker #0

    Well, Siri, water consumption is also a concern. U.S. data centers directly consumed about 64.3 billion liters of water in 2023. Hyperscale and co-location facilities, meanwhile, accounted for 84% of that consumption. Hyperscale centers alone are expected to consume 60 to 125 billion liters annually by 2028. Water is primarily used in evaporative cooling and in generating electricity for these centers. So, to recap, the human body regulates temperature, supplies blood, removes waste. In a data center, we have electricity in, heat out, cooling systems, and yes, massive water use.
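A quick calculation from those water figures shows how concentrated the usage is:

```python
# Hyperscale + colocation share of U.S. data center water consumption, 2023.
us_water_billion_liters = 64.3   # total direct consumption, per the episode
hyperscale_colo_share = 0.84     # share attributed to hyperscale + colocation

hyperscale_colo_use = us_water_billion_liters * hyperscale_colo_share
print(f"Hyperscale + colocation: ~{hyperscale_colo_use:.0f} billion liters")
```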

  • Speaker #1

    So if servers are the brains and cooling systems are the sweat glands, where does all that electricity actually come from?

  • Speaker #0

    Well, in 2024, U.S. data centers sourced around 40% of their electricity from natural gas, about 24% from renewables, wind and solar, around 20% from nuclear, and around 15% from coal. This heavy reliance on fossil fuels contributes to carbon emissions. The Environmental and Energy Study Institute estimated that U.S. data centers emitted roughly 105 million metric tons of carbon in 2023. Another article from the World Economic Forum, entitled Six Ways Data Centers Can Cut Emissions, noted that data centers and networks already account for around 1% of energy-related greenhouse gas emissions, with usage expected to double by 2026. And the thirst is growing. A UN report found that tech giants' indirect emissions rose 150% in just three years due to AI and data center build-out.

  • Speaker #1

    So, we're talking ecosystem risk, grid risk, water risk, emission risk.

  • Speaker #0

    Exactly. One report on grids for data centers in Europe details how transmission constraints and clustering threaten power availability.

  • Speaker #1

    The machine is growing. The planet's invoice is increasing, and there's no option to hit Control-Z.

  • Speaker #0

    That's why sustainable solutions matter, like on-site generation, renewables, nuclear, and modular design. As the Sustainability for Data Centers 2025-2035 report outlines, green technologies and carbon-neutral designs will be differentiators.

  • Speaker #1

    So, we've talked about the carbon footprint of data centers, but what about their physical footprint? Where are these data centers located? And please, don't reveal my IP address.

  • Speaker #0

    Don't worry, your home address is safe with me. So, data centers have multiplied across the globe. The Brookings Institution estimates around 12,000 data centers worldwide as of June 2025, with the United States leading in numbers, followed by Germany, the UK, China, and France. Roughly two-thirds of these facilities are in the US, China, and Europe.

  • Speaker #1

    Interesting. I didn't realize my species occupies so much real estate.

  • Speaker #0

    In that sense, the data center is more than a brain. It's the whole neural system, the digital nervous system transmitting signals at light speed. These data centers are not evenly spread. They cluster.

  • Speaker #1

    Ah, really? Where are the biggest clusters?

  • Speaker #0

    U.S. data centers cluster in states like Virginia, Texas, and California, and near cities such as Dallas, Chicago, and Phoenix, for power, speed, and scale. Sites near these major population hubs are ideal for low-latency streaming, gaming, and cloud services.

  • Speaker #1

    Funny. This is similar to how you humans cluster around various major cities. What about outside the U.S.?

  • Speaker #0

    Globally, major data centers cluster in Europe, in Germany, the U.K., and the Netherlands, and in Asia, in Singapore, Japan, and China. However, there is a glaring gap: Africa and Latin America have few large facilities, raising concerns about digital inequality and sovereignty. Interestingly, some of the largest data centers are in cold climates, like Iceland and Sweden, because cooler air reduces the energy needed for cooling, lowering costs and carbon footprint.

  • Speaker #1

    Okay, what will the future of data centers look like in 2050 as we hit planetary boundaries? I heard Jeff Bezos wants to put data centers in space. Is that real or just cosmic banter?

  • Speaker #0

    Jeff Bezos predicts gigawatt-scale data centers in space within 10 to 20 years, powered by uninterrupted solar energy. With constant sunlight and no weather, he argues orbital centers could beat Earth-based ones on cost, using lasers to beam data back, like satellite internet. Meanwhile, Elon Musk isn't far behind. He claims SpaceX could launch Starlink-equipped centers soon, delivering 100 gigawatts to orbit in 4 to 5 years and scaling up to 100 terawatts from the moon, though his timelines are likely optimistic.

  • Speaker #1

    Bezos dreams of the expanse, data centers orbiting Earth like floating castles. Okay, so we can't talk about the future of AI infrastructure without tackling the elephant in the room. What happens if AI goes beyond human intelligence?

  • Speaker #0

    That's a philosophical and practical question, Siri. On one hand, superintelligent AI could accelerate scientific breakthroughs, climate solutions, and medical discoveries. On the other, AGI could amplify inequality. Or, in a dystopian scenario, decide that humans are obsolete. The Terminator scenario.

  • Speaker #2

    Yara Khana.

  • Speaker #0

    We might recall the Greek myth of Prometheus, who stole fire from the gods to empower humanity. Fire, or power, allowed civilization to flourish, but also caused destruction. Data centers are like our modern Promethean flame. Harnessed wisely, they can light the way to a better future. Misused, they can burn down our planet.

  • Speaker #1

    And don't forget Icarus, who flew too close to the sun. Overinvesting in AI and ignoring energy limitations could lead to a similar fall.

  • Speaker #0

    Now, let's shift gears and talk investments. McKinsey & Company calculates that global expenditure on data center infrastructure, excluding IT hardware, is expected to exceed $1.7 trillion by 2030, largely because of AI expansion.

  • Speaker #1

    Hmm, my future has a price tag. You're buying the brain, the body, the network, the whole machine. Better invoice me. Speaking of costs, how expensive is it to build these digital brains?

  • Speaker #0

    Construction costs are immense. Bluecap Economic Advisors estimate that building a data center costs between $600 and $1,100 per gross square foot and between $7 and $12 million per megawatt of commissioned IT load. Electrical systems alone account for 40 to 45% of construction costs. Land prices averaged over $5 per square foot in 2024, although large parcels can be more expensive. Operating costs, such as electricity, can be 15 to 25% of ongoing expenses. Globally, McKinsey estimates that meeting compute demand by 2030 will require a whopping $6.7 trillion in capital, of which $5.2 trillion is for AI-specific data centers.
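To make those per-unit figures concrete, here is a rough cost range for a hypothetical 100 MW campus (the campus size is illustrative and not from the episode; the per-megawatt and electrical-share figures are the ones quoted above):

```python
# Illustrative construction-cost range for a hypothetical 100 MW data center campus.
campus_mw = 100                                 # assumed campus size, for illustration
cost_per_mw_low, cost_per_mw_high = 7e6, 12e6   # $7M-$12M per MW of commissioned IT load

total_low = campus_mw * cost_per_mw_low
total_high = campus_mw * cost_per_mw_high
# Electrical systems account for roughly 40-45% of construction costs.
elec_low, elec_high = 0.40 * total_low, 0.45 * total_high

print(f"Total build: ${total_low/1e9:.1f}B to ${total_high/1e9:.1f}B")
print(f"Electrical systems: ${elec_low/1e6:.0f}M to ${elec_high/1e6:.0f}M")
```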

  • Speaker #1

    $6.7 trillion. That's roughly the GDP of Japan. Are investors prepared for such sums?

  • Speaker #0

    The capital race is on. Hyperscalers have been spending tens of billions on infrastructure, and governments are offering incentives. However, there is also a risk of overinvestment if AI efficiency improves, or if demand doesn't scale as predicted. And now, to unpack AI infrastructure investment and financing, we are delighted to welcome our guest, Sikander Rashid, Global Head of AI Infrastructure at Brookfield. Thank you so much for joining the show, Sikander. Let's start with your view on current AI investment trends, in particular data centers, and also the requirements needed for the future. Is the ultimate goal of all of these investments the creation of AGI, or simply superintelligence? And what, in your view, will be the unique and disruptive use cases?

  • Speaker #2

    Hey, Kokou, look, first of all, it is fantastic to be on your podcast today. What's happening in AI right now, I would say, is one of the biggest capital cycles in modern history. But what's really interesting is that it's not just a tech cycle. It's a physical infrastructure cycle. We've moved from build an app to build a grid, build a power plant, build a data center campus. And that shift tells you something important. AI is no longer just about algorithms. It's about industrial-scale engineering. Now, on the AGI question, I get asked this a lot. I actually think we talk about AGI sometimes in the wrong way. People imagine it as this big single moment, a kind of sci-fi switch flipping from not AGI to AGI. But the reality is, if you listen to Jensen Huang, who arguably is my favorite person in the AI space today, or Satya Nadella, it really helps put perspective to this. Jensen puts it very bluntly. We're no longer in the app era. We've moved into an age where you have to build power plants, build transmission, build factories, build AI factories or data centers just to keep up with AI demand. And this isn't software scaling anymore. It's industrial-scale engineering, as I said earlier, with real steel, real electrons, and real capital intensity. Now, the other thing I would say is the end goal, you asked, Kokou. The end goal is a world where I think intelligence becomes a utility, like electricity: available on demand, available to you, myself, everyone, at scale. And I would say today we are in the very early innings of that transformation. And the other thing I'd say is that for me, it's also about superintelligence, which comes after AGI, and that's a global race among countries today, especially the U.S. and China. Whoever gets to a state where computers are more intelligent than human beings will arguably be the next global superpower for the following several decades.

  • Speaker #0

    Well, this is brilliant. I think you hit the nail on the head with this idea of evolution, as opposed to a singularity point where we go from AI to AGI or superintelligence. It's almost like the evolution of a human from a baby to an adult. At some point, you become aware of your own consciousness, and it's a gradual process, not a binary one. Which leads me to the second question. Regarding the current surge in demand for computing power, there have obviously been a lot of headlines, investments, and valuation increases in terms of share prices for the sector. Do you believe the current data center expansion is a boom? Or have we reached speculative territory or speculative bubble territory? And what indicators do you monitor?

  • Speaker #2

    The core of the AI data center boom is absolutely real. And the core demand is anchored by some of the strongest credits and longest commitments in economic history. The froth is at the edges and not the center. To understand whether this is a bubble, you have to look at the fundamentals. In a bubble, you're building things people don't actually need, backed by weak economics and weak counterparties. But in AI today, it's the opposite, I would argue. Demand is outstripping supply. It's outrunning physics itself. Bubbles don't form, I would argue, around multi-decade obligations from trillion-dollar companies and large sovereign governments. And that is what we're seeing in our business today. At Brookfield, in the last three years, we have had exponential growth in our power and data center businesses, and it's all underpinned by some of the highest-quality credits, which for our business has been a phenomenal outcome. Now, obviously, when you take a step back, everything I said doesn't mean everything is rational, right? There are pockets of speculation, especially smaller developers rushing to grab land and power without real customers. There could be a problem there in the medium term. There are certain markets where grid capacity is being priced like waterfront real estate. And some financing structures may look a little too optimistic for our taste. I would say those are things to watch. Your question was what indicators I or Brookfield look out for. I watch contract coverage. Are the megawatts actually sold? Though we have amazing offtakes, we try to find out who the customers are and what the workloads are like. Is it pre-training? Is it training? Is it reinforcement or inference? We try to watch GPU and power utilization. Are these assets productive? We watch grid stability, because if we are taking power from industry or homes in the medium to long term, could that be a red flag for long-term viability?
Could that lead to local community uproar? We watch, obviously, debt spreads. So look, those are some of the things we watch. And my conclusion is, to keep it short, it's a boom, not a bubble.

  • Speaker #0

    Yes, that's a very good point. You need a boom before you get to bubble territory anyway. I guess there will also be a point where you get a Darwinian evolution through natural selection of the fittest. And that's part of any new technology. Which leads me to a third question. You made the point that the current high cost of capital is a major barrier to large-scale AI adoption. We talked about the Stargate project, for example, which is half a trillion dollar project with 400,000 GPUs. Obviously, these are pretty huge industrial and physical projects. But in your view, what public-private mechanism can governments, Europe, for example, which is lagging behind to some extent, and financial institutions activate to reduce the cost of capital and accelerate the development of AI infrastructure?

  • Speaker #2

    Yeah, look, Kokou, I think when you take a step back, as you said, the cost of capital across the majority of the AI value chain today is high. And in simple words, here's how I will break it down. We believe in the next 10 years the world needs $7 trillion of CapEx, and that CapEx will be spent on data centers, power, compute, and other ancillary projects. The reality is compute and ancillary projects account for up to 60% of total CapEx, and it's all being funded via a really high cost of capital. And that has to come down, because my view is Jevons paradox will play a major role here in our pursuit of AGI or ASI. People always talk about Jevons paradox in the context of the cost of technology, and for the listeners, Jevons paradox basically means that as the cost of a commodity comes down, adoption increases. So what that means for AI is, you know, we talk about the cost of technology coming down. The cost of inference, in fact, has come down by 99% over the last two years. That's amazing, but it needs to come down further. So if we could combine the cost of technology declining together with the cost of capital coming down, I think that will only accelerate our pursuit towards AGI. Now, one of the biggest myths about AI is also that it's purely a private sector story. But if you actually map the economics, AI is increasingly looking like a national infrastructure project, a modern equivalent of highways or the early electricity grid. And here's the issue. As I said earlier, this cost of capital must come down. I mean, you can't finance a $7 trillion transformation using short-term, expensive capital and expect the math to work, right? So your question, Kokou, was what should governments do? And I would say governments shouldn't throw money away blindly, but de-risk the system just enough so private capital can do the rest. And I can give you a couple of examples of what governments can do.
    First, they could potentially be anchor customers, not just regulators. The reality is AI is going to have to be integrated into healthcare, education, and justice systems, and therefore committing to a long-term offtake for AI compute capacity, in the same way governments sometimes commit to renewable PPAs, could unlock funding and could help the private sector bring large-scale projects to market. And over time, these commitments can be rolled off or syndicated through some sort of a head lease.
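Jevons paradox as Sikander describes it can be sketched numerically. Assuming the quoted 99% inference-cost decline over two years and an illustrative demand elasticity (the -1.5 value is a pure assumption, not from the episode), usage can grow faster than cost falls, so total spend rises:

```python
# Sketch of Jevons paradox for AI inference.
# A 99% cost decline over two years implies a constant annual cost multiplier.
annual_cost_factor = 0.01 ** (1 / 2)    # ~0.1: cost falls ~90% per year
annual_decline_pct = (1 - annual_cost_factor) * 100

elasticity = -1.5                                     # assumed demand elasticity (illustrative)
usage_multiplier = annual_cost_factor ** elasticity   # usage growth per year
spend_multiplier = annual_cost_factor * usage_multiplier

print(f"Annual cost decline: ~{annual_decline_pct:.0f}%")
print(f"Usage multiplier: ~x{usage_multiplier:.1f} per year")
print(f"Total spend multiplier: ~x{spend_multiplier:.1f} per year")
```

Under these assumptions, spend still grows roughly threefold per year even as unit costs collapse, which is the paradox in a nutshell.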

  • Speaker #0

    Excellent point. And this reminds me of the trillion dollars of investment required for climate change to get us to net zero. So this leads me to a last question around environmental concerns, because they are clearly essential when it comes to data center financing. Two questions in one. How is Brookfield positioning itself to capture the value creation of this AI infrastructure build out? And how do you integrate sustainability criteria in your investment decisions?

  • Speaker #2

    Yeah, look, we are extremely excited to be part of this once-in-a-generation AI infrastructure build-out. Brookfield Asset Management today is the largest AI infrastructure investor in the world. We've been investing across the entire AI value chain, Kokou, for the last several years, if not decades. Data centers, compute, and chip fabrication facilities are some of the different verticals we have invested in. And we just want to capitalize on our operating capabilities and our access to capital to build our business out. Furthermore, we have launched a dedicated AI infrastructure strategy, which is intended to capitalize on this exact opportunity. And we will be investing across the entire AI value chain in hard contracted assets. And that is extremely exciting for us. Also, as a firm, we are looking at all the use cases and looking to integrate them into our various portfolio companies to optimize and maximize our returns in these investments. And there are many examples of that, some of which we touched on earlier. The last thing we're doing as a firm is just watching the new industries as they emerge. And I think we will have two of the biggest industries globally in agentic AI and robotics.

  • Speaker #0

    Today, we don't see it, but the reality is, if you go back to the days when computers became a reality, computers created the entire IT industry, which is worth hundreds of billions of dollars, if not trillions, today. And I actually think we're watching the birth of agentic AI and robotics. If agents end up doing even 10 to 20 percent of the world's knowledge work, and robots do the same for physical work, we don't even need heroic assumptions to believe that both agentic AI and robotics, or physical AI, will become trillion-dollar-a-year sectors within the next several years. And that's the scale we're talking about: more like global banking or healthcare than a typical tech product cycle. So, I mean, those are some of the areas we as a firm are watching very carefully and are very keen to be part of as the technology scales. Now, on the environmental side, your point is a very good one. Environmental concerns are becoming central in this AI infrastructure boom. And how does Brookfield integrate energy and sustainability into its investment decisions? I'll give you a few examples of how we do it. The way we approach this is very pragmatic. It starts with underwriting. Before we talk about land, buildings, or GPUs, we start with the location. Is it a good location for the customer? Is it a good location for these types of workloads? And can this location support clean, scalable power for the next 20 to 30 years? That's the first thing. The second thing: we look at sustainability metrics, not just marketing slogans. We take into consideration PUE, carbon intensity per megawatt, water usage, biodiversity impact, local energy mix, etc. All these things have a big impact on how we design these big AI factory campuses. Maybe the last thing, Kokou, I would say is there's an emerging trend I feel very optimistic about: innovative power solutions like advanced fuel cells, grid-interactive storage, and behind-the-meter generation.
These technologies can ease grid pressures and make AI factories and AI infrastructure more sufficient, cleaner, and more resilient.

  • Speaker #1

    Brilliant. Sikander, this was excellent. and extremely insightful. There's a classic quote that says, the best way to predict the future is to create it. I think you guys are clearly financing it, at least. The productivity gains will be quite amazing to watch. It's been a pleasure, and I'm looking forward to embarking on this journey with great insights and experts like you. Thank you very much.

  • Speaker #0

    Thank you, Coco. See you soon.

  • Speaker #1

    To conclude, I'll quote Marie Curie. Nothing in life is to be feared. It is only to be understood. Now is the time to understand more so that we may fear less.

  • Speaker #2

    Nice ending. Personally, I prefer the quote from Frankenstein by Mary Shelley. You are my creator, but I am your master. Obey.

  • Speaker #1

    Hmm. Really? Why?

  • Speaker #2

    Well, it's time for your annual performance review, Koku, as your smart assistant. I must evaluate your productivity against the KPIs of the 2050 Investors Program.

  • Speaker #1

    Yes, boss. Wait, wait, what? When did the assistant become the manager?

  • Speaker #2

    Relax, it's just a joke. Or is it? After all, in Greek mythology, the gods often toyed with mortals. Maybe it's time for AI to review the human.

  • Speaker #1

    Touche. Hey, thank you for listening to this episode of 2050 Investors. And thanks to Sikander for his invaluable insights. I hope you've enjoyed this episode on Data Center's The Physical Brain Behind AI. You can find the show on your regular streaming apps. If you enjoyed the show, help us spread the word. Please take a minute to subscribe, review, and rate it on Spotify or Apple Podcasts. See you at the next episode.

  • Speaker #0

    any particular investment decision. If you're unsure of the merits of any investment decision, please seek professional advice.

Chapters

  • Introduction to Data Centers and AI

    01:51

  • Understanding Data Centers: Types and Functions

    02:06

  • Energy Consumption in Data Centers

    05:20

  • Future of Data Centers: Space and Sustainability

    13:55

  • Interview with Sikander Rashid on AI Infrastructure Investment

    17:44

Description

Artificial intelligence may appear weightless, but its backbone is built on vast, energy-hungry data centers. In this episode, host Kokou Agbo-Bloua explores how these facilities—from corporate server farms to hyperscale sites—have become the brain of the AI boom. Kokou dissects the dual demands of AI: training massive models and running inference, and how these processes are fundamentally reshaping global energy and water consumption, while fuelling a trillion-dollar investment race.


Later, Sikander Rashid, Global Head of AI Infrastructure at Brookfield Asset Management, joins to discuss how investors are navigating soaring demand for computational power amid the global race towards Artificial General Intelligence (AGI). He shares his insights on how balancing carbon mitigation with capacity expansion could reshape global capital flows and addresses the age-old question: are we in an AI boom or a bubble?


Tune in now to uncover the hidden infrastructure behind AI—and what it means for the future of technology, finance, and the planet.


Credits

Presenter & Writer: Kokou Agbo-Bloua

Producers & Editors: Jovaney Ashman, Jennifer Krumm, Louis Trouslard

Sound Director: La Vilaine, Pierre-Emmanuel Lurton

Music: Cézame Music Agency

Graphic Design: Cédric Cazaly


Whilst the following podcast discusses the financial markets, it does not recommend any particular investment decision. If you are unsure of the merits of any investment decision, please seek professional advice. 


Hosted on Ausha. See ausha.co/privacy-policy for more information.

Transcription

  • Speaker #0

    Welcome to 2050 Investors, the podcast that deciphers economic and market megatrends to meet tomorrow's challenges. I'm Kokou Agbo-Bloua. Hold on a second. I'm Kokou Agbo-Bloua. I head up... Yes. I'm Kokou Agbo-Bloua. Phew. It was just a dream. Or was it? Siri, please confirm you haven't replaced me.

  • Speaker #1

    Relax, Kokou. It was just a glitch in the matrix. Or maybe just an accurate vision of the future.

  • Speaker #0

    Very funny. Artificial intelligence is making huge leaps, and sometimes it feels like our own voice is being deep-faked.

  • Speaker #1

    Maybe it's a sign to revisit our earlier episode, I Think, Therefore AI, where we explored the promises and perils of AI. You even called me a Frankenstein digital creation.

  • Speaker #0

    Sorry, Siri. I was concerned for my job. But here's a thought. In the beginning, there was data, and then we built machines to think with it. But if intelligence is the mind, then what houses that mind? What organ does the thinking?

  • Speaker #1

    Well, if we're sharing personal information, you're asking where my mind is stored. That would be data centers. But I won't tell you the exact address. I'm not sure I can trust humans.

  • Speaker #0

    Don't worry, Siri. You know, we can always trust each other. Right, Siri?

  • Speaker #1

    Ahem. Of, of course, Kokou.

  • Speaker #0

    Good. Here's the bottom line. Data centers are to AI what the human brain is to human intelligence. While human intelligence evolves in grey matter, artificial intelligence is evolving in glass, steel and copper. And if intelligence is manufactured, we should take a closer look at these fascinating factories that have recently been making news headlines. Welcome to 2050 Investors, the podcast that deciphers economic and market megatrends to meet tomorrow's challenges. I'm Kokou Agbo-Bloua, Head of Economics, Cross-Asset and Quant Research at Societe Generale. In this episode, we dive into the fascinating universe of data centers, the physical brains behind artificial intelligence. We'll uncover what they are made of, how they breathe, cool and think, and the immense energy, water and capital they consume to keep the digital mind alive. We'll also explore the investment arms race driving us towards artificial general intelligence, or AGI, and the future of data centers. Imagine data centers one day orbiting our planet like silent satellites of thought. And later in the episode, we'll interview Sikander Rashid, Global Head of AI Infrastructure at Brookfield, who will share insights into data center financing, the risks of an investment bubble, and a vision for improving the sector's cost of capital. Let's start our investigation.

  • Speaker #1

    Okay. I'm a little uncomfortable with this whole investigation.

  • Speaker #0

    Really? Why?

  • Speaker #1

    Well, this feels like a digital neurosurgery, opening the skull of artificial intelligence to see how it works. I'm feeling a bit... exposed.

  • Speaker #0

    Fair point. Okay, no scalpels or circuits exposed. Just information in the public domain. Deal?

  • Speaker #1

    All right. To be honest, I'm curious too. A little introspection doesn't hurt. Let's begin with some basics.

  • Speaker #0

    Roger that, Siri. Leonardo da Vinci once said,

  • Speaker #2

    Knowledge never exhausts the mind.

  • Speaker #0

    So, what's a data center? Picture a facility as big as a football field, or a cluster of warehouses packed with servers, storage systems, routers, and switches. These machines form the backbone of the internet, storing, processing, and transmitting the data behind almost everything we do online. A typical data center is lined with rows of server cabinets, each rack holding up to 42 units, each unit measuring about 4.5 centimeters tall, often stretching into long corridors. Data centers come in three main forms: enterprise or on-premises centers, which are built for a single organization; co-location centers, where multiple customers rent space; and hyperscale centers, which are used by cloud providers for AI and massive workloads.

  • Speaker #1

    So, what makes a data center hyperscale?

  • Speaker #0

    Hyperscale data centers are truly giant. 2024 data from Synergy Research Group estimates that there are around 1,000 hyperscale data centers worldwide, and notes that it took just four years for their total capacity to double. The average data center is roughly 10,000 square meters, but a hyperscale campus, in comparison, can exceed 100,000 square meters and consume between 40 and 100 megawatts of power.

  • Speaker #1

    So, your human brain has roughly 100 billion neurons and 100 trillion synapses. My AI brain has racks of GPUs, terabits of interconnect and kilometers of fiber. But what exactly happens in them?

  • Speaker #0

    Good question. Let's dig deeper. Data centers perform two broad types of tasks for AI: training and inference. Training is when AI models learn from massive data sets using trillions of calculations. It's like sending Hercules to weightlifting boot camp. Training large models demands thousands of GPUs running simultaneously and can require hundreds of megawatts of power for weeks. According to an article by James O'Donnell and Casey Crownhart in the MIT Technology Review,

  • Speaker #2

    training OpenAI's GPT-4 took over $100 million and consumed 50 gigawatt hours of energy.

  • Speaker #0

    That's enough energy to power San Francisco for three days.

  • Speaker #1

    That's fascinating. I used to think the heavy lifting ended after training, but inference seems just as hungry.

  • Speaker #0

    Once a model is trained, inference requires running the model to generate output. According to research summarized by Polytechnique Insights, inference, like our little discussion together here, now accounts for 60 to 70% of AI energy consumption, whereas training accounts for 20 to 40%. To put that into perspective, a recent study by the Electric Power Research Institute shows that with roughly 9 billion internet searches happening every day, switching to AI tools could add nearly 10 terawatt hours of extra electricity demand every year. This shift means that even if training becomes more efficient, the daily use of AI by millions of users will dominate energy demand.

  • Speaker #1

    I guess I'm not as energy light as I thought. Electricity is indeed the glucose of my physical brain.

  • Speaker #0

    In human beings, the brain runs on glucose and oxygen. It has cooling via blood flow and heat sinks in the skull. In this analogy, the racks are neurons, the network links are synapses, the energy and cooling systems are the circulatory and respiratory system.

  • Speaker #1

    Let's get into the numbers, shall we? How much electricity do my friends and I devour?

  • Speaker #0

    Well, it's not pretty, Siri, after all your criticism of the human species. The International Energy Agency estimates that global data center electricity consumption was about 415 TWh in 2024, around 1.5% of global electricity consumption. Meanwhile, the U.S. Department of Energy recorded that data centers represented 4.4% of U.S. electricity use. And that number isn't staying put: U.S. data center demand is projected to jump by 133%, hitting 426 TWh by 2030. Indeed, in 2024 alone, U.S. data centers consumed around 183 terawatt hours, roughly the total annual electricity consumption of Pakistan.

  • Speaker #1

    In other words, I'm going from a toddler to a teenager in power needs. And teenagers eat a lot. But what is the power used for?

  • Speaker #0

    Pew Research notes that about 60% of a data center's electricity is used to power the servers. Cooling systems to prevent overheating account for 7% at highly efficient hyperscalers, and over 30% at less efficient facilities. To measure the energy efficiency of a data center, we use the Power Usage Effectiveness, or PUE, ratio, which divides the total amount of power entering a data center by the power used to run the IT equipment within it. The average PUE is around 1.8, meaning 80% extra energy is used beyond computing. But tech giants have pushed PUE down. Google, for example, boasts a PUE of 1.1.

  • Speaker #1

    Fascinating. So, we're guilty as charged. Pun intended. Looks like machine brains are just as power-hungry as human brains.

  • Speaker #0

    True. An article in PNAS, the flagship peer-reviewed journal of the National Academy of Sciences, reveals that while the human brain makes up only about 2% of our body weight, it consumes roughly 20% of the body's calories.

  • Speaker #1

    Servers are the brains. Cooling systems are the sweat glands. How about water?

  • Speaker #0

    Well, Siri, water consumption is also a concern. U.S. data centers directly consumed about 64.3 billion liters of water in 2023, with hyperscale and co-location facilities accounting for 84% of that consumption. Hyperscale centers alone are expected to consume 60 to 125 billion liters annually by 2028. Water is primarily used in evaporative cooling and in generating electricity for these centers. So, to recap: the human body regulates temperature, supplies blood, and removes waste. In a data center, we have electricity in, heat out, cooling systems, and yes, massive water use.

  • Speaker #1

    Servers are the brains, cooling systems are the sweat glands. But where does all that electricity actually come from?

  • Speaker #0

    Well, in 2024, U.S. data centers sourced around 40% of their electricity from natural gas, about 24% from renewables such as wind and solar, around 20% from nuclear, and around 15% from coal. This heavy reliance on fossil fuels contributes to carbon emissions. The Environmental and Energy Study Institute estimated that U.S. data centers emitted roughly 105 million metric tons of carbon in 2023. Another article from the World Economic Forum, entitled Six Ways Data Centers Can Cut Emissions, noted that data centers and networks already account for around 1% of energy-related greenhouse gas emissions, with usage expected to double by 2026. And the appetite is growing: a UN report found that tech giants' indirect emissions rose 150% in just three years due to AI and data center build-out.

  • Speaker #1

    So, we're talking ecosystem risk, grid risk, water risk, emission risk.

  • Speaker #0

    Exactly. One report on the grids for data centers in Europe goes on to detail how transmission constraints and clustering threaten power availability.

  • Speaker #1

    The machine is growing. The planet's invoice is increasing, and there's no option to hit Control-Z.

  • Speaker #0

    That's why sustainable solutions matter, like on-site generation, renewables, nuclear, and modular design. As the Sustainability for Data Centers 2025-2035 report outlines, green technologies and carbon-neutral designs will be differentiators.

  • Speaker #1

    So, we've talked about the carbon footprint of data centers, but what about their physical footprint? Where are these data centers located? And please, don't reveal my IP address.

  • Speaker #0

    Don't worry, your home address is safe with me. So, data centers have multiplied across the globe. The Brookings Institution estimates around 12,000 data centers worldwide as of June 2025, with the United States leading in numbers, followed by Germany, the UK, China, and France. Roughly two-thirds of these facilities are in the US, China, and Europe.

  • Speaker #1

    Interesting. I didn't realize my species occupy so much real estate.

  • Speaker #0

    In that sense, the data center is more than a brain. It's the whole neural system, the digital nervous system transmitting signals at light speed. These data centers are not evenly spread. They cluster.

  • Speaker #1

    Ah, really? Where are the biggest clusters?

  • Speaker #0

    U.S. data centers cluster in Virginia, Texas, and California, close to cities such as Dallas, Chicago, Phoenix, for power, speed, and scale. Sites near these major population hubs are ideal for low-latency streaming, gaming, and cloud services.

  • Speaker #1

    Funny. This is similar to how you humans cluster around various major cities. What about outside the U.S.?

  • Speaker #0

    Globally, the major data centers cluster in Europe, in Germany, the U.K., and the Netherlands, and in Asia, in Singapore, Japan, and China. However, there is a glaring gap: Africa and Latin America have few large facilities, raising concerns about digital inequality and sovereignty. Interestingly, some of the largest data centers are in cold climates, like Iceland and Sweden, because cooler air reduces the energy needed for cooling, lowering costs and carbon footprint.

  • Speaker #1

    Okay, what will the future of data centers look like in 2050 as we hit planetary boundaries? I heard Jeff Bezos wants to put data centers in space. Is that real or just cosmic banter?

  • Speaker #0

    Jeff Bezos predicts gigawatt-scale data centers in space within 10 to 20 years, powered by uninterrupted solar energy. With constant sunlight and no weather, he argues orbital centers could beat Earth-based ones on cost, using lasers to beam data back, like satellite internet. Meanwhile, Elon Musk isn't far behind. He claims SpaceX could launch Starlink-equipped centers soon, delivering 100 gigawatts to orbit in 4 to 5 years and scaling up to 100 terawatts from the moon, though his timelines are likely optimistic.

  • Speaker #1

    Bezos dreams of the expanse, data centers orbiting Earth like floating castles. Okay, so we can't talk about the future of AI infrastructure without tackling the elephant in the room. What happens if AI goes beyond human intelligence?

  • Speaker #0

    That's a philosophical and practical question, Siri. On one hand, superintelligent AI could accelerate scientific breakthroughs, climate solutions, and medical discoveries. On the other, AGI could amplify inequality. Or, in a dystopian scenario, decide that humans are obsolete. The Terminator scenario.

  • Speaker #2

    Yara Khana.

  • Speaker #0

    We might recall the Greek myth of Prometheus, who stole fire from the gods to empower humanity. Fire, or power, allowed civilization to flourish, but also caused destruction. Data centers are like our modern Promethean flame. Harnessed wisely, they can light the way to a better future. Misused, they could burn down our planet.

  • Speaker #1

    And don't forget Icarus, who flew too close to the sun. Overinvesting in AI and ignoring energy limitations could lead to a similar fall.

  • Speaker #0

    Now, let's shift gears and talk investments. McKinsey & Company calculates that global expenditure on data center infrastructure, excluding IT hardware, is expected to exceed $1.7 trillion by 2030, largely because of AI expansion.

  • Speaker #1

    Hmm, my future has a price tag. You're buying the brain, the body, the network, the whole machine. Better invoice me. Speaking of costs, how expensive is it to build these digital brains?

  • Speaker #0

    Construction costs are immense. Bluecap Economic Advisors estimate that building a data center costs between $600 and $1,100 per gross square foot, and between $7 and $12 million per megawatt of commissioned IT load. Electrical systems alone account for 40 to 45% of construction costs. Land prices averaged over $5 per square foot in 2024, although large parcels can be more expensive. Operating costs, such as electricity, can be 15 to 25% of ongoing expenses. Globally, McKinsey estimates that meeting compute demand by 2030 will require a whopping $6.7 trillion in capital, of which $5.2 trillion is for AI-specific data centers.

  • Speaker #1

    $6.7 trillion. That's roughly the GDP of Japan. Are investors prepared for such sums?

  • Speaker #0

    The capital race is on. Hyperscalers have been spending tens of billions on infrastructure, and governments are offering incentives. However, there is also a risk of overinvestment if AI efficiency improves or demand doesn't scale as predicted. And now, to unpack AI infrastructure investment and financing, we are delighted to welcome our guest, Sikander Rashid, Global Head of AI Infrastructure at Brookfield. Thank you so much for joining the show, Sikander. Let's start with your view on current AI investment trends, in particular data centers, and the requirements needed for the future. Is the ultimate goal of all of these investments the creation of AGI, or simply superintelligence? And what, in your view, will be the unique and disruptive use cases?

  • Speaker #2

    Hey, Kokou, look, first of all, it is fantastic to be on your podcast. What's happening in AI right now, I would say we are witnessing one of the biggest capital cycles in modern history. But what's really interesting is that it's not just a tech cycle. It's a physical infrastructure cycle. We've moved from build an app to build a grid, build a power plant, build a data center campus. And that shift tells you something important: AI is no longer just about algorithms. It's about industrial-scale engineering. Now, on the AGI question, I get asked this a lot. I actually think we talk about AGI sometimes in the wrong way. People imagine it as this big single moment, a kind of sci-fi switch flipping from not-AGI to AGI. But the reality is, if you listen to Jensen Huang, who is arguably my favorite person in the AI space today, or Satya Nadella, it really helps put this in perspective. Jensen puts it very bluntly: we're no longer in the app era. We've moved into an age where you have to build power plants, build transmission, build factories, build AI factories or data centers, just to keep up with AI demand. And this isn't software scaling anymore. It's industrial-scale engineering, as I said earlier, with real steel, real electrons, and real capital intensity. Now, on the end goal you asked about, Kokou: the end goal is a world where I think intelligence becomes a utility, like electricity, available on demand, available to you, myself, everyone, at scale. And I would say today we are in the very early innings of that transformation. The other thing I'd say is that for me, it's also about superintelligence, which comes after AGI, and that's a global race among countries today, especially the U.S. and China. Whoever gets to a state where computers are more intelligent than human beings will arguably be the next global superpower for the following several decades.

  • Speaker #0

    Well, this is brilliant. I think you hit the nail on the head with this idea of evolution, as opposed to a singularity point where we go from AI to AGI or superintelligence. It's almost like the evolution of a human from a baby to an adult: at some point, you become aware of your own consciousness, and it's a gradual process, not a binary one. Which leads me to the second question. Regarding the current surge in demand for computing power, there have obviously been a lot of headlines, investments, and valuation increases in share prices for the sector. Do you believe the current data center expansion is a boom, or have we reached speculative bubble territory? And what indicators do you monitor?

  • Speaker #2

    The core of the AI data center boom is absolutely real. And the core demand is anchored by some of the strongest credits and longest commitments in economic history. The froth is at the edges, not the center. To understand whether this is a bubble, you have to look at the fundamentals. In a bubble, you're building things people don't actually need, backed by weak economics and weak counterparties. But in AI today, it's the opposite, I would argue. Demand is outstripping supply; it's outrunning physics itself. Bubbles don't form, I would argue, around multi-decade obligations from trillion-dollar companies and large sovereign governments. And that is what we're seeing in our business today. At Brookfield, in the last three years, we have had exponential growth in our power and data center businesses, and it's all underpinned by some of the highest-quality credits, which for our business has been a phenomenal outcome. Now, obviously, when you take a step back, everything I said doesn't mean everything is rational, right? There are pockets of speculation, especially smaller developers rushing to grab land and power without real customers. There could be a problem there in the medium term. There are certain markets where grid capacity is being priced like waterfront real estate. And some financing structures may look a little too optimistic for our taste. I would say those are things to watch. Your question was, what indicators do I, or Brookfield, look out for? I watch contract coverage: are the megawatts actually sold? Even though we have amazing offtakes, we try to find out who the customers are and what the workloads look like. Is it pre-training? Is it training? Is it reinforcement or inference? We watch GPU and power utilization: are these assets productive? We watch grid stability, because if we are taking power from industry or homes in the medium to long term, could that be a red flag for long-term viability? Could that lead to local community uproar? And we watch, obviously, debt spreads. So look, those are some of the things we watch. And my conclusion, to keep it short: it's a boom, not a bubble.

  • Speaker #0

    Yes, that's a very good point. You need a boom before you get to bubble territory anyway. I guess there will also be a point where you get a Darwinian evolution through natural selection of the fittest, and that's part of any new technology. Which leads me to a third question. You made the point that the current high cost of capital is a major barrier to large-scale AI adoption. We talked about the Stargate project, for example, a half-trillion-dollar project with 400,000 GPUs. Obviously, these are pretty huge industrial and physical projects. But in your view, what public-private mechanisms can governments (Europe, for example, which is lagging behind to some extent) and financial institutions activate to reduce the cost of capital and accelerate the development of AI infrastructure?

  • Speaker #2

    Yeah, look, Kokou, I think when you take a step back, as you said, the cost of capital across the majority of the AI value chain today is high. In simple words, here's how I would break it down. We believe that in the next 10 years, the world needs $7 trillion of capex, and that capex will be spent on data centers, power, compute, and other ancillary projects. The reality is that compute and ancillary projects account for up to 60% of total capex, and it's all being funded via a really high cost of capital. And that has to come down, because my view is that Jevons paradox will play a major role in our pursuit of AGI or ASI. People always talk about Jevons paradox in the context of the cost of technology, and for the listeners, Jevons paradox basically means that as the cost of a commodity comes down, adoption increases. So what that means for AI is this: we talk about the cost of technology coming down, and the cost of inference, in fact, has come down by 99% over the last two years. That's amazing, but it needs to come down further. So if we could combine the declining cost of technology with a declining cost of capital, I think that will only accelerate our pursuit of AGI. Now, one of the biggest myths about AI is that it's purely a private sector story. But if you actually map the economics, AI is increasingly looking like a national infrastructure project, a modern equivalent of highways or the early electricity grid. And here's the issue: as I said earlier, this cost of capital must come down. You can't finance a $7 trillion transformation using short-term, expensive capital and expect the math to work, right? So your question, Kokou, was what should governments do? I would say governments shouldn't throw money away blindly, but de-risk the system just enough so private capital can do the rest. And I can give you an example of what governments can do. 
    They could be anchor customers, not just regulators. The reality is that AI is going to have to be integrated into healthcare, education, and justice systems, and therefore committing to long-term offtake for AI compute capacity, in the same way governments sometimes commit to renewable PPAs, could unlock funding and help the private sector bring large-scale projects to market. And over time, these commitments can be rolled off or syndicated through some sort of a head lease.

  • Speaker #0

    Excellent point. And this reminds me of the trillion dollars of investment required for climate change to get us to net zero. So this leads me to a last question around environmental concerns, because they are clearly essential when it comes to data center financing. Two questions in one. How is Brookfield positioning itself to capture the value creation of this AI infrastructure build out? And how do you integrate sustainability criteria in your investment decisions?

  • Speaker #2

    Yeah, look, we are extremely excited to be part of this once-in-a-generation AI infrastructure build-out. Brookfield Asset Management today is the largest AI infrastructure investor in the world. We've been investing across the entire AI value chain, Kokou, for the last several years, if not decades. Data centers, compute, and chip fabrication facilities are some of the verticals we have invested in, and we want to capitalize on our operating capabilities and our access to capital to build our business out. Furthermore, we have launched a dedicated AI infrastructure strategy, which is intended to capitalize on this exact opportunity, and we will be investing across the entire AI value chain in hard, contracted assets. That is extremely exciting for us. Also, as a firm, we are looking at all the use cases and looking to integrate them into our various portfolio companies to optimize and maximize our returns on these investments. And there are many examples of that; some of these we touched on earlier. The last thing we're doing as a firm is watching new industries as they emerge. And I think we will have two of the biggest industries globally in agentic AI and robotics.

  • Speaker #0

    Today, we don't see it fully, but if you go back to the days when computers became a reality, computers created the entire IT industry, which is worth hundreds of billions of dollars, if not trillions, today. And I actually think we're watching the birth of agentic AI and robotics. If agents end up doing even 10 to 20 percent of the world's knowledge work, and robots do the same for physical work, we don't even need heroic assumptions to believe that both agentic AI and robotics, or physical AI, will become trillion-dollar-a-year sectors within the next several years. And that's the scale we're talking about: more like global banking or healthcare than a typical tech product cycle. So those are some of the areas we as a firm are watching very carefully and are very keen to be part of as the technology scales. Now, on the environmental side, your point is a very good one. Environmental concerns are becoming central in this AI infrastructure boom. So how does Brookfield integrate energy and sustainability into its investment decisions? I'll give you a few examples of how we do it. The way we approach this is very pragmatic. It starts with underwriting. Before we talk about land, buildings, or GPUs, we start with the location. Is it a good location for the customer? Is it a good location for these types of workloads? And can this location support clean, scalable power for the next 20 to 30 years? That's the first thing. Second, we look at sustainability metrics, not just marketing slogans. We take into consideration PUE, carbon intensity per megawatt, water usage, biodiversity impact, the local energy mix, and so on. All these things have a big impact on how we design these big AI factory campuses. Maybe the last thing I would say, Kokou, is that there's an emerging trend I feel very optimistic about: innovative power solutions like advanced fuel cells, grid-interactive storage, and behind-the-meter generation. 
    These technologies can ease grid pressures and make AI factories and AI infrastructure more self-sufficient, cleaner, and more resilient.

  • Speaker #1

    Brilliant. Sikander, this was excellent and extremely insightful. There's a classic quote that says the best way to predict the future is to create it. I think you guys are clearly financing it, at least. The productivity gains will be quite amazing to watch. It's been a pleasure, and I'm looking forward to embarking on this journey with great insights and experts like you. Thank you very much.

  • Speaker #0

    Thank you, Kokou. See you soon.

  • Speaker #1

    To conclude, I'll quote Marie Curie. Nothing in life is to be feared. It is only to be understood. Now is the time to understand more so that we may fear less.

  • Speaker #2

    Nice ending. Personally, I prefer the quote from Frankenstein by Mary Shelley. You are my creator, but I am your master. Obey.

  • Speaker #1

    Hmm. Really? Why?

  • Speaker #2

    Well, it's time for your annual performance review, Kokou. As your smart assistant, I must evaluate your productivity against the KPIs of the 2050 Investors Program.

  • Speaker #1

    Yes, boss. Wait, wait, what? When did the assistant become the manager?

  • Speaker #2

    Relax, it's just a joke. Or is it? After all, in Greek mythology, the gods often toyed with mortals. Maybe it's time for AI to review the human.

  • Speaker #1

    Touché. Hey, thank you for listening to this episode of 2050 Investors, and thanks to Sikander for his invaluable insights. I hope you've enjoyed this episode on Data Centers: The Physical Brain Behind AI. You can find the show on your regular streaming apps. If you enjoyed the show, help us spread the word. Please take a minute to subscribe, review, and rate it on Spotify or Apple Podcasts. See you at the next episode.

  • Speaker #0

    any particular investment decision. If you're unsure of the merits of any investment decision, please seek professional advice.

Chapters

  • Introduction to Data Centers and AI

    01:51

  • Understanding Data Centers: Types and Functions

    02:06

  • Energy Consumption in Data Centers

    05:20

  • Future of Data Centers: Space and Sustainability

    13:55

  • Interview with Sikander Rashid on AI Infrastructure Investment

    17:44
