The Texas Energy and Power Newsletter
Energy Capital Podcast
Turning Waste Into Power: Crusoe's Cully Cavness on Revolutionizing Energy Use at Data Centers

Cully Cavness shares insights on Crusoe's innovative solution to rising data center demand, methane flaring, stranded renewables, the role of technology in the energy transition, and much more

In April, ERCOT's Regional Planning Group published projections showing that ERCOT's electricity demand may double by 2030 compared to 2021. Having initially anticipated a peak demand of about 90 gigawatts by the end of the decade, ERCOT now expects a staggering 152 gigawatts, up from approximately 75 gigawatts at the beginning of the decade. This would mark the fastest growth of any decade since the post-World War II era. There are many factors driving the surge, but one of the most significant is the rise of Bitcoin mining and AI data centers. The size, speed, and scale of AI data center growth that has emerged over the last year has been surprising, to say the least, even to close industry observers.

However, one company foresaw this trend and built its business model around addressing data center energy needs. That company is Crusoe and on today’s episode I had the pleasure of speaking with Cully Cavness, Crusoe’s co-founder, President, and Chief Operating Officer. He has a robust background in energy and, along with Chase Lochmiller, founded Crusoe in 2018.

What sets Crusoe apart is its innovative approach to siting and powering data centers. The company locates them near oil and gas drilling operations that vent and flare large amounts of methane, as well as near constrained renewable developments. In other words, Crusoe takes energy that would otherwise simply be wasted and converts it into useful electricity to power data centers. In the case of flaring and venting, this can reduce emissions by as much as 70%.

I talked about all of this and more with Cully, including how data centers could serve as flexible loads, the efficiency of AI chips, and reducing emissions from methane flaring and venting. It was a fascinating conversation and I hope you enjoy it.

I look forward to hearing your thoughts on this episode. Thank you for listening and for being a subscriber!


Timestamps

2:47 - About Cully and Crusoe

8:47 - Crusoe digital flare mitigation (DFM) strategy and how they are reducing emissions

14:54 - Crusoe’s digital renewables optimization (DRO), negative pricing, and how Crusoe is managing stranded wind power

20:46 - Expected and current loads and load growth from data centers

27:45 - Flexibility of data center demand and AI; training vs. inference functions in AI

37:36 - Opportunities for greater efficiencies in AI chips

40:34 - Trends in carbon accounting; location matching

42:52 - Tally’s Law and the Energy Transition

49:22 - Cully’s thoughts on needed policy/regulation changes for the energy transition


Show Notes

Crusoe Energy 

Crusoe Careers Page

Tally’s Law and the Energy Transition by Cully Cavness 

The Extraction State by Charles Blanchard

AI, Data Centers and Energy, Interview with Michael Terrell - Redefining Energy Podcast

AI is poised to drive 160% increase in data center power demand - Report from Goldman Sachs

Nuclear? Perhaps! - Interview with Jigar Shah on the Volts Podcast

Texas Advanced Nuclear Reactor Working Group at the Texas Public Utility Commission

The Energy Capital Podcast with Former PUC Commissioner Will McAdams

"The Name of the Game is Flexibility," a Conversation with ERCOT's Pablo Vegas


Transcript

Doug Lewin 

Cully Cavness, welcome to the Energy Capital Podcast.

Cully Cavness

Thank you so much for having me.

Doug Lewin

Really looking forward to this conversation. Crusoe is really a fascinating company. You guys are doing some really innovative, interesting, and different things. So why don't we start with you, Cully? Tell us a little bit about your background and about Crusoe. Explain to the audience a little bit who you guys are as a company, if you would.

Cully Cavness

Great. I'm excited to be here and share a little bit about what we're doing at Crusoe, where we came from, where we're going. In terms of my personal background, I grew up in Denver, Colorado. I went to Middlebury College in Vermont, where I studied geology and economics, thinking I was gonna go into oil and gas. But at Middlebury, anybody who's familiar with the school will know that the climate conversation was a huge theme and a huge focus in that student body. And it made a big impact on me.

And so I actually, right after I graduated from college, I was awarded a Thomas Watson Fellowship, which is a program where you're sort of banished from your home country for a year and you get to go study whatever subject you really want to study for that year. And I wanted to think about this sort of morality of energy and the balance between energy and the economy and the environment. And so I was really fortunate to be able to go to Iceland where I worked with a lot of geothermal power and hydro producers. I went to China where I was much closer to coal. And then I went to Spain. I worked with wind and solar developers for the CFO of a large renewables group there. And then I went to Argentina and I worked with a hydroelectric engineer. 

And I got to really see a pretty broad survey of the global energy system, everything from finance to project development and management to engineering and operations. I saw power plants that had broken and were in stages of repair and learned a lot from that experience. 

And from that, I ended up going into the geothermal energy industry. I had a mentor who was the CEO of a company called Global Geothermal, and he took me under his wing. And for the first few years of my career, I was developing geothermal power plants, mostly internationally. And then sort of long story short, I ended up doing an MBA over at Oxford in England and came back to an oil and gas focused investment bank here in Denver. It was sort of the one energy focused investment banking role in Denver, primarily oil and gas clients. And that brought me back into the oil industry. I ended up being a Vice President of Finance for a private equity backed oil and gas company after that. And we were drilling some exploratory oil and gas wells in Eastern Colorado. That was sort of a step out from the core shale play, the Niobrara. We were miles away from the core of the activity. We drilled some wells that ended up being good oil wells, but there was no natural gas pipeline infrastructure in that area. And so the default then is, at least at the time was, all right, if you can't get the gas into a pipe, you put the oil into a truck and you send the truck to the refinery. That's how you sell the oil. And you can't do that with the gas, so you just light it on fire and you burn it. It's called a flare. And I thought that was pretty insane. And I was frankly, I was embarrassed about it. You know, just considering the path that I'd gone through and that I had really wrestled with that intersection of climate and environment on one side, but then the economic and human benefits of energy access on the other.

Wasting the energy and the uncombusted methane emissions, I had a big problem with that. And I had been, you know, playing around with mining Bitcoin as a hobby in my basement, and my wife was observing that, you know, there's hot air coming out of the basement and our power bill had tripled. And that's also a commercial problem related to energy and an environmental problem related to energy. And the insight was basically, maybe one of these problems can solve the other. What if we could package a modular data center that could go to the oil field, actually sit on pad next to a flaring well site, capture that gas that was being flared, turn it into electricity, and use the electricity to power the modular data center? Basically a new way, we called it the digital pipeline, to take that gas molecule and convert it into value. The view was that it would be simpler and more cost effective to transport a bit than a molecule or an electron in these stranded gas locations. And that thesis really bore out.

Doug Lewin

This is really interesting. So, you know, what you're describing in Eastern Colorado was obviously a problem all over the place in the oil and gas industry. And obviously this podcast is very focused on Texas. It's a huge problem in the Permian, and a little bit less so in the Eagle Ford, though it's still a problem down there too. There's just a lot more of a gas focus down there. But particularly in the Permian, where it's mostly oil, what you're describing, as I understand it, is what is often called associated gas, right? You're really drilling for the oil. That's the valuable product. Gas prices depend on what's going on in the market. Today they're very low, right? In 2022, obviously, they were very high, but over the last six, seven, eight years, they've generally been very low.

One of my favorite energy books is The Extraction State by Charles Blanchard, a history of natural gas in the United States. And he describes, for those listeners who are Seinfeld fans, the famous muffin tops episode, where they start this business. Everybody only wants the top of the muffin; nobody wants the bottom of the muffin. So they create this business and just sell muffin tops. And they end up with this huge problem because they can't dispose of all the muffin bottoms that nobody wants. They can't get anybody to haul them away. And Blanchard, in his book, likens the associated gas to the muffin bottoms and the oil to the muffin top. So what you're describing is a very interesting solution. And you mentioned flaring, but as a matter of fact, in Texas an even bigger problem is the venting. People think when they see those flares, that's a problem. And it is, but it's actually a bigger problem when you can't see it and the gas is being vented directly into the atmosphere.

So what you're doing is trying to use that. Can you talk just a little bit and help me and the audience understand? Because I think for a lot of folks who approach this from a climate perspective, they're going to say, well, wait a minute, but you're still burning gas to power the data centers or the Bitcoin mines or whatever. That's still contributing to emissions. But what is the difference? And you don't have to get into great gory detail, but rough order of magnitude, the difference in the emissions, how much are you reducing emissions by actually using that gas for power rather than flaring or venting it?

Cully Cavness

Yeah, some great questions. On the emissions reduction piece, so if we just compare the status quo, there's a big ball of fire, it's a flare situation, and we come in and we deploy what we call digital flare mitigation. That's what we call our technology. And we reduce that flaring volume. The important thing to know is that flares don't fully burn methane, and there have been a lot of studies on this over the years that have shown that probably 8 or 9% of the methane is escaping to the atmosphere, uncombusted from that flare. And it can vary. I mean, it can be as much as half in a really bad, poorly run flare, and it can be much more efficient, but that's sort of like a reasonable average to think about. And methane is 82 times as potent as CO2 over a 20 year timeframe. So when we talk about mid-century 2050 kind of climate goals, there's an 82 multiplier to take into account there. And our generators get 99.9% combustion of the methane. So dropping that basically to zero, it's absolutely true. We're still emitting CO2 from converting the gas through the combustion process. But by eliminating the methane, it's reducing about two thirds to 70% of the CO2 equivalent emissions compared to status quo flaring. 
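
For readers who want to check how those figures fit together, here is a rough back-of-the-envelope sketch in Python. It assumes the numbers Cully cites, roughly 8% methane slip from a typical flare, a 20-year global warming potential of 82 for methane, and 99.9% combustion in the generators, plus basic stoichiometry (burning 1 kg of methane yields about 2.75 kg of CO2). Real slip rates vary widely from flare to flare, so treat this as illustrative arithmetic, not an exact accounting.

```python
# Rough check of the flaring math described above, using the figures from the
# conversation: ~8% methane slip from a typical flare, a 20-year GWP of 82 for
# methane, and 99.9% combustion in an on-site generator. Illustrative only.

GWP20_CH4 = 82          # CO2-equivalent multiplier for methane over 20 years
CO2_PER_CH4 = 44 / 16   # kg of CO2 produced per kg of methane fully combusted

def co2e_per_kg_methane(combustion_efficiency):
    """CO2-equivalent emissions (kg) from routing 1 kg of methane through
    a combustion device with the given efficiency."""
    burned = combustion_efficiency * CO2_PER_CH4          # combusted to CO2
    slipped = (1 - combustion_efficiency) * GWP20_CH4     # uncombusted methane
    return burned + slipped

flare = co2e_per_kg_methane(0.92)       # typical flare, ~8% slip
generator = co2e_per_kg_methane(0.999)  # generator, 99.9% combustion

print(f"Flare:     {flare:.2f} kg CO2e per kg CH4")
print(f"Generator: {generator:.2f} kg CO2e per kg CH4")
print(f"Reduction: {1 - generator / flare:.0%}")          # ~69%, i.e. two thirds to 70%
print(f"Share of flare CO2e from slipped methane: "
      f"{(0.08 * GWP20_CH4) / flare:.0%}")                # ~72%, 'about three quarters'
```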

The other way to think about that is that, you know, like of all the gas that does get burned into CO2 by the flare, the remaining residual methane is three quarters of the total CO2 equivalent impact of the flare. And so by dealing with that, you're dealing with like the majority of the CO2 equivalent problem, especially for those newer term climate goals, which I really do think about a lot because I think the crux of these climate goals isn't really actually about where do we get to from a parts per million perspective ultimately, like what's the right level in the atmosphere. Often there's a focus on it. We can't go beyond 450 ppm. There was an organization called 350.org that said we shouldn't go beyond 350. 

The bigger issue, I believe from a biodiversity standpoint, is actually the speed at which you get there. Because species just can't adapt fast enough to accommodate some of these faster pathways. And methane is kind of the key culprit on that acceleration and speed side. So getting it from methane to CO2 sort of buys time, I call it extending the climate runway. And it gives literally a couple more decades for species to adapt. Say a species' habitat is moving from semi-arid to arid. Well, if that happens over the span of 10 years, the species might go extinct. If it happens over the span of 100 years, there's a potential for a little bit more adaptation or migration, for the species to be stressed but not extinct.

And, you know, so speed really does matter in that context, which is why methane is kind of the low-hanging fruit that makes a lot of sense to go for first. It's also something that we can do that's economic. So it turns out there's a virtuous cycle: this does make sense economically. We can provide basically a free solution for flaring that mitigates the methane. It provides us with access to a low-cost energy source that was previously being wasted and valued at almost zero. And there's an economic incentive to do more of that. So today we've deployed more than 200 megawatts of that across seven states, including your state of Texas. I was just in Midland and in Hobbs, New Mexico earlier this week. And our fastest growing area of operations is down there in the Permian, where there is a big flaring challenge, but hopefully we've got a solution that's going to make a dent in that really quickly.

You know, I'd also just point out that we're also moving into a lot of renewables based projects as well. So we can talk about that later. And there are other forms of stranded energy that are on the other end of the energy transition where there's an inefficiency that needs to be dealt with and computing in a very interesting way can be the solution there as well.

Doug Lewin

So let's talk about that. But before we do, just really quickly on methane, because I think this is really important. I agree with you, the speed at which the climate impacts hit us, our species and other species, and affect biodiversity and all of that matters. I think the other piece of that, Cully, is that if you can reduce the most potent greenhouse gases, and methane is certainly in that group, CFCs and things like that go in there too, you're also buying time for technological improvements to come along. Right? And I'm anticipating the criticism I'll get, and it's legitimate criticism. I'm not representing some kind of techno-optimist view of this, like, oh, technology is going to save us. We've got a lot of things we've got to do, right? But one wedge of the pie is technology. And we have seen tremendous technological advancements in the energy and climate space. Just where we are right now with solar and storage, it would have been very hard even 10 years ago to imagine the kinds of price declines that we've seen. Some people imagined it, but not many. So I think that's part of it too: you've really got to address that most potent piece. And so anything we can do on methane just has kind of an outsized impact.

So let's do talk about renewables, because there's sort of an interesting corollary here, almost like the associated gas part of renewables, or the bottom of the muffin, if you will. You have times, and they're becoming quite common now, particularly in Texas (I'm not as familiar with how much we're seeing this in other markets), when the amount of curtailment and congestion we see on the system is very, very large and rising very fast, as we're not able to keep up with the transmission needs as generation comes into the market. And so you end up with a lot of times when the renewable energy that is produced is not able to reach any place where it can actually be used. So they're literally just curtailing, just shutting down wind or solar power. And it's not just wind or solar; there are actually other kinds of power that can get caught behind a congested node and just not be able to be used.

So if I understand what you guys are doing, similar to what you're doing with the flares, where you're putting a data center next to the flare and capturing that gas to make a power plant, you can kind of do the same thing near congested nodes where you have a lot of renewable production, and actually reduce energy waste by using that power at the data center. Is that correct? Correct me if I got something wrong, and then maybe give us some more context and details around how that works.

Cully Cavness

Yeah, that's right. So I mentioned we called that first business model DFM, digital flare mitigation. And the second model we call DRO, digital renewables optimization. Meaning you can bring a data center into one of these congested nodes, especially if it's really saturated with wind power where you have this intermittency effect. You can have 20% to 30% of the hours of a year in some nodes, and Texas is a good example, that are negatively priced. So that has a number of interesting knock-on results. One is that older wind farms that don't have the production tax credit… 

Let's back up and say how is something negatively priced? How is there actually a negative price on the grid? What's happening there is there’s so much wind power that the grid can't accept it all. Either there's a transmission constraint or there just simply isn't demand at the end of the lines and so people are effectively having to pay transmission and distribution fees and receive zero revenue because there's no bid for that power. And so you're getting like an all in price that's actually a negative price. And you can do that if you get a rebate from the federal government in the form of the production tax credit. So you could sell for negative one penny if you get two and a half pennies from the federal government, your net price is actually positive one and a half. Those production tax credits expire after about 10 years. If you're an older wind farm, you don't get that anymore. So when the price on the node goes negative, you actually just have to stop producing wind power, which is like the worst outcome from an energy transition perspective. You literally could be producing more renewable power, but for an economic constraint reason, it's just actually being shut off. Even for newer wind farms that are still receiving the production tax credit, this is obviously like a very frustrating problem. And it's like a breakdown in the supply and demand connection in the market. 
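
To put Cully's pennies into a concrete calculation, here is a tiny sketch of the decision a wind farm faces at a negatively priced node. The dollar figures simply convert the cents-per-kWh numbers he uses (selling at negative one cent with a two-and-a-half-cent production tax credit) into $/MWh; they are illustrative, not actual market data.

```python
# Net revenue per MWh for a wind farm at a negatively priced node.
# -1 cent/kWh market price = -$10/MWh; 2.5 cent/kWh PTC = $25/MWh. Illustrative.

def net_revenue(market_price, ptc):
    """Net $/MWh if the farm keeps producing: market price plus any tax credit."""
    return market_price + ptc

price = -10.0   # $/MWh, negative because there is no bid for the power

with_ptc = net_revenue(price, ptc=25.0)      # newer farm still collecting the PTC
without_ptc = net_revenue(price, ptc=0.0)    # older farm whose ~10-year PTC expired

for label, rev in [("with PTC", with_ptc), ("without PTC", without_ptc)]:
    decision = "keep producing" if rev > 0 else "curtail (shut off the turbines)"
    print(f"{label}: net ${rev:.0f}/MWh -> {decision}")
```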

So our view again is that bringing energy to the data center isn't always the right answer. Sometimes you should bring the data center to the energy. And that's really our origin story is we've been bringing data centers out to stranded energy locations. This is just another form, frankly, of stranded energy. They're also hard places to operate. They're not your traditional data center markets like in Virginia and around Dallas, and the Pacific Northwest. This is, you know, to again use the Texas example, this is rough West Texas kind of desert territory without a lot of the existing infrastructure. However, if you do it the hard way, there's access to a lot of low-priced clean energy. I believe more in the location-based approach to emissions accounting rather than the market-based approach. The market-based approach would say, I built my data center anywhere, I bought renewable energy credits for all the megawatt hours, and therefore I have zero emissions. The location-based is more like, depending on where you located that load, what actually happened in the physical real world. And if you located in some areas, the answer is you spun up a coal power plant. And if you located in other areas, it's like you absorbed a lot of otherwise curtailed or negatively priced wind power and maybe drew some power from the grid as well but the average emissions of that location-based view leads you to some very different outcomes compared to the market-based approach to carbon accounting.
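
As a sketch of the distinction Cully is drawing, the toy comparison below scores the same hypothetical 10 MW load under the two accounting approaches. All of the emission factors and REC assumptions are invented for illustration; they are not Crusoe's numbers or official grid data.

```python
# Toy comparison of market-based vs. location-based carbon accounting for a
# hypothetical 10 MW load running all year. Every factor is an invented
# placeholder, not real grid data or anyone's actual methodology.

LOAD_MWH = 10 * 8_760            # annual consumption of a flat 10 MW load

# Market-based view: MWh covered by purchased RECs count as zero; anything
# uncovered is scored at an assumed residual-mix factor.
recs_mwh = LOAD_MWH              # buy one REC per MWh consumed
residual_mix = 0.45              # assumed tons CO2e per uncovered MWh
market_based = max(LOAD_MWH - recs_mwh, 0) * residual_mix   # nets to zero

# Location-based view: score the same consumption by the average emissions of
# the power physically supplying each candidate location (assumed factors).
location_factors = {
    "coal-heavy node": 0.80,                                # tons CO2e per MWh
    "curtailed-wind node (mostly absorbed surplus wind)": 0.10,
}

print(f"Market-based: {market_based:,.0f} tons CO2e")
for node, factor in location_factors.items():
    print(f"Location-based at {node}: {LOAD_MWH * factor:,.0f} tons CO2e")
```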

And so we've really embraced that location-based energy first approach to how and where we locate our data centers. We've got projects going in Iceland, for example, the ultimate stranded renewable energy resource, a place that I spent several months of my life. And they've got gigawatts of potential of geothermal and hydropower and 300,000 people. And what are they going to do with it? When I used to live there, they were talking about running a power line to Scotland. So we're bringing AI training workloads to Iceland, and we're deploying what will be the largest computing cluster in the country of Iceland. And it will be serving AI customers with 100% geothermal and hydropower as another example of how this can work from a location-based standpoint.

Doug Lewin

I do have questions about the location-based approach and sort of this move towards 24/7, but before we go there, I want to cover a couple of things that I think are really important to AI. Can you help, again, me and the audience kind of understand the magnitude of the AI loads? This is obviously almost becoming cliché because at every conference you're at, it's all AI all the time. But I think to a certain extent, and this is one of the reasons I was so excited to have you on, you hear all these things floating around, but there are so many different sources of information. You guys are working on this very directly. Can you talk a little bit about just what size of data centers you're seeing? And we'll put a link in the show notes; there was a pretty good report a couple of weeks ago by Goldman Sachs, and they talked about a 160% increase in data center power demand. I think it's very important when you're thinking about that, too, to your point about location, that it's not going to be a 160% increase everywhere, right? It's going to be concentrated in areas like Virginia, that well-known data center hub, or Iceland, which you were just talking about. But Texas, I think, will be one of those places too, because of the energy abundance and the general ability to get low-cost power, which isn't super easy, but it looks like you guys have a big part of the solution to make that happen. But anyway, talk a little bit about the size of these data center loads and what that means for an energy system. You could talk about it in different places, but in Texas our peak so far is somewhere around 85,000 megawatts, a little higher. Anyway, yeah, talk a little bit about the size of the load, if you would.

Cully Cavness

Yeah, maybe zooming back again: where's the load coming from? It's interesting to just think about physically what's happening in these AI data centers compared to a traditional data center. So traditionally, you go inside a data center and there are racks and racks and racks of servers. And those racks traditionally were like seven kilowatt racks, that was very standard, or maybe 14 kilowatt racks. Those were kind of standard power densities. And the current leading edge GPUs coming out of Nvidia, for example, are really optimal around a 50 or even 100 kilowatt rack. And there are prototypes, not far off, being demonstrated at conferences that are hundreds of kilowatts, approaching a megawatt, in a rack. And it's this incredible density of electrical and thermal cooling capabilities. So literally liquid cooling to every single chip on the server; there's an in and an out of cold water and hot water going to every single chip on the server. It's allowing them to compress more and more computing power into a smaller footprint. There's that going on. And then there's this insight, which is that the more GPUs can be networked together on the same cluster, the more performant that cluster will be at training, for example, a large language model. And there's a physical constraint element to that. So you actually have fiber distances that have to be considered: how far away can the farthest GPU on the cluster be from the center of the cluster? And that leads you to wanting these really dense configurations of lots of computers close together so they can all be networked together on the high-performance networking.

And when you play that out, what that means is that what used to be a big data center was like a five megawatt data center, and then a 10 and a 20 megawatt data center. These were kind of beyond belief huge even just a decade or two ago. Now it's looking like we're going towards 100 megawatt data centers that have 100,000 GPUs all networked together in a single cluster. And perhaps that's even going to be small in the not too distant future. We might go to hundreds of megawatts, and campuses that are going to be gigawatts. And there's this arms race of who can train the best models and who can operate the best inference off of those models. And it seems right now that size is going to matter, and it's leading to a real land grab around access to megawatts, access to digital infrastructure. And our view is that it's really important how and where we locate those megawatts on the grid, and potentially even off grid. And so that's really where we take that energy first approach to development and construction and operation of data centers. We've got a piece of our business which is building the data centers at the 100 megawatt scale. We have our own design, which we think is really optimized around heating, cooling, and those physical constraints and distances within the data center. And those can be essentially offered and leased to larger technology companies for them to host their own GPUs in there and do their own workloads. We also have our own GPU cloud product called Crusoe Cloud, where we put our own computers in there and rent them out by the hour or by the three-year contract. We have different models, but we actually are the cloud provider within the infrastructure as well in some cases. And so we have two different ways to take those megawatts to market. But it all starts with energy first locations that we can develop into that data center infrastructure.

Doug Lewin 

Thank you for that. And just to put that in perspective for some folks who may not be as in the weeds on power as others: when you talk about the old racks being 7 or 14 kW, a home on an average day might be using two or three kW, an average sized home, a couple thousand square feet or something like that. On a really hot day, it might be five, six, seven. So one rack would have been equal to about one home on a very hot day. I'm speaking in generalities; obviously it matters how much insulation and what kind of HVAC you have and what you're doing inside your house. If you're running multiple hairdryers at once or something, it might be different. But these Nvidia racks you're talking about, now you're getting up to 50 to 100 kW. So now you're talking about something like 10 to 20 homes per rack. And then you talk about a data center that's a hundred megawatts, and even clustered. Actually, one of the previous podcasts was with ERCOT CEO Pablo Vegas, and he talked about how we're seeing 500 megawatts, 700 megawatts, I believe those were the numbers he said, popping up on the grid all at once. That's probably what you're describing as one of these campuses or clusters. Now you're talking about small cities. It's like a small city popping up on the grid, potentially in the space of 6 to 12 months, give or take. About right?
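
To put rough numbers on that comparison, here is the arithmetic using the round figures from the conversation; these are order-of-magnitude approximations, not engineering specifications.

```python
# Rough arithmetic behind the rack-to-homes comparison. All figures are the
# round numbers used in the conversation, not engineering specifications.

HOME_PEAK_KW = 5            # a home on a very hot day, roughly 5-7 kW
LEGACY_RACK_KW = 14         # traditional rack density
GPU_RACK_KW = 75            # midpoint of the 50-100 kW range for GPU racks
CAMPUS_MW = 100             # a single large AI data center
GPUS_PER_CAMPUS = 100_000   # the cluster size mentioned in the conversation

print(f"Legacy rack   ~ {LEGACY_RACK_KW / HOME_PEAK_KW:.0f} homes at peak")
print(f"GPU rack      ~ {GPU_RACK_KW / HOME_PEAK_KW:.0f} homes at peak")
print(f"100 MW campus ~ {CAMPUS_MW * 1000 / HOME_PEAK_KW:,.0f} homes at peak")
print(f"Implied power per GPU (incl. cooling and overhead): "
      f"{CAMPUS_MW * 1000 / GPUS_PER_CAMPUS:.1f} kW")
```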

Cully Cavness

I think your order of magnitude is right. I used to think about a house as one KW. 

Doug Lewin 

Yeah, not in Texas. Everything's bigger down here. 

Cully Cavness

But yes, I mean, roughly that's right. I mean, these are huge sources of power demand when they come online at this kind of scale.

Doug Lewin

And so can you talk a little bit about the flexibility of that demand? This is becoming a bigger and bigger issue. In Texas, ERCOT has established a large flexible load task force, which has spent a lot of time thinking about Bitcoin, but is going to need to think more and more about these AI data centers. And I've heard a lot of different things. There's no flexibility at all; there's a lot of flexibility; I've heard everything in between. Can you just share your perspective on how much these can be flexible and whether flexibility can be built into the design? You mentioned you guys are designing some for heating and cooling. Are you thinking about building flexibility in? And before I turn it over to you to answer, I would imagine that with what you guys are doing, flexibility would be important, because when you're siting next to a congested node, you want to be running as much as you can during those congested periods. But when it's not congested and power prices are getting really high on the grid and there's actually not enough power, maybe you could move that wind or solar onto the lines and let it flow to Dallas or Houston or wherever, rather than using it at the data center. And when the prices are higher, you would want to use less there. So maybe you do that by changing around how much cooling is going on, or you site batteries there, or if it's a large language model, maybe you're batching functions and not running them 8,760 hours, which is the number of hours there are in a year, but maybe 8,500. Even not needing power for a couple hundred hours would make a huge difference. So can you talk a little bit about what flexibility there is or is not with AI data centers?

Cully Cavness 

Yeah, I think it comes back to the customers that are using the servers. And right now, I think a lot of the market is still stuck in an old mental model that data centers need to be Tier 4, meaning 99.999% uptime reliability. And it seems to me that many of the incumbents still have that perspective. But as it relates to this new wave of AI data centers, it's not clear to me that all the use cases of AI computing require that kind of uptime. And I think that you actually had an insight, which I've been thinking about a lot, which is this batching idea. So you can checkpoint these models, meaning if the power goes out in a worst case scenario, you don't have to restart training the whole model. It falls back to your last save point, where you saved the model. It's like if you're playing Mario and Bowser kills you, you get to start back at the beginning of the level. You don't have to go all the way back to the beginning of the game.
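
For readers who want to see what that checkpointing idea looks like in practice, here is a minimal sketch in plain Python. It is a stand-in for what training frameworks do with their own save and load utilities; the file path, step counts, and "model state" are hypothetical placeholders, and a real training run would save model weights and optimizer state rather than a toy dictionary.

```python
# Minimal sketch of checkpointed training: the loop periodically saves its
# state, so an interruption (e.g., a curtailment event) only loses the work
# done since the last checkpoint, not the whole run.
import os
import pickle

CHECKPOINT_PATH = "train_state.pkl"   # hypothetical path
TOTAL_STEPS = 10_000
CHECKPOINT_EVERY = 500                # steps between saves

def load_checkpoint():
    """Resume from the last saved state if one exists, else start fresh."""
    if os.path.exists(CHECKPOINT_PATH):
        with open(CHECKPOINT_PATH, "rb") as f:
            return pickle.load(f)
    return {"step": 0, "loss": float("inf")}

def save_checkpoint(state):
    with open(CHECKPOINT_PATH, "wb") as f:
        pickle.dump(state, f)

def train():
    state = load_checkpoint()
    for step in range(state["step"], TOTAL_STEPS):
        state["step"] = step + 1
        state["loss"] = 1.0 / (step + 1)     # placeholder for a real update
        if state["step"] % CHECKPOINT_EVERY == 0:
            save_checkpoint(state)           # the "save point" before any outage
    return state

if __name__ == "__main__":
    # If power is cut mid-run, rerunning the script resumes from the last
    # checkpoint rather than from step 0.
    print(train())
```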

And that's important, right? Because that means that, okay, maybe you would tolerate a certain amount of outage and it becomes an economic decision. If I can offer you a much lower price point per hour of training on average throughout the course of the year, provided that I can interrupt your workload 1% of the time or 10% of the time or something. Is that a trade that you're willing to make? 
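
Here is a toy version of that economic decision: a job that needs a fixed amount of compute, priced either as firm power or as cheaper interruptible power that can be curtailed some fraction of the time. The prices and curtailment rates are made up for illustration; the point is just that paying a lower rate for a somewhat longer wall-clock schedule can pencil out once checkpointing makes interruptions cheap.

```python
# Toy economics of firm vs. interruptible supply for a training job.
# Prices and curtailment rates are illustrative, not market quotes.

COMPUTE_HOURS_NEEDED = 1_000      # GPU-hours of actual training work

def cost_and_schedule(price_per_hour, curtailed_fraction):
    """Total cost and wall-clock hours to finish, assuming you only pay for
    hours you actually run and checkpointing lets curtailed hours be made up."""
    cost = COMPUTE_HOURS_NEEDED * price_per_hour
    wall_clock = COMPUTE_HOURS_NEEDED / (1 - curtailed_fraction)
    return cost, wall_clock

firm = cost_and_schedule(price_per_hour=3.00, curtailed_fraction=0.0)
flexible = cost_and_schedule(price_per_hour=2.40, curtailed_fraction=0.10)

print(f"Firm (always on):        ${firm[0]:,.0f} over {firm[1]:,.0f} hours")
print(f"Interruptible (10% off): ${flexible[0]:,.0f} over {flexible[1]:,.0f} hours")
print(f"Savings: {1 - flexible[0] / firm[0]:.0%} for about "
      f"{flexible[1] / firm[1] - 1:.0%} longer schedule")
```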

I believe that for some percentage of the customers, that will be a trade that they're ultimately able to make. The market hasn't really moved there yet, but I think some of the kind of frontier folks are starting to realize that. And we would really advocate for that, because that would allow for more of the demand response feature that, for example, in Texas is such a big deal. In your deregulated market, you've got this really kind of beautiful, again, market-based, incentive-driven system to provide flexibility back to the grid. And when it gets really hot and everybody's AC is on, you can get paid to turn off if you're able to. And we need that economic signal to flow through to the customers that are doing the training workloads. I would say there's going to be a difference between training and inference.

So broadly in this AI computing world, there are these two categories. Training is the longer-term one; it can be days, weeks, months of running a model over large amounts of data to find the insights and create the weights and biases that sort of build the model, let's say. And then inference is, once that's been built, using the model to do tasks. So it might be for a self-driving car to detect, is that a stop sign or a yield sign? That's using the model to make one inference. Or for ChatGPT, it would be answering the question that you just posed in the chat box. That's an inference. And clearly that's a case where you need to be up and available to provide that service. Can these be federated across multiple data centers, so one can be offline at any given time as long as another one is online? I think these are really interesting questions, and it's just a totally new approach to the digital infrastructure and the interaction of digital infrastructure and energy infrastructure. We need to get a lot more sophisticated on that.

That's what I'm really excited about at Crusoe: we have a team that's a hybrid of energy professionals from grids and utilities, from upstream oil and gas producers, from renewables developers. And we have a team of really seasoned executives who have built and operated data centers and cloud products. And, you know, we are merging those two disciplines in a pretty special way, and we're going to try to find and use these kinds of insights about how digital infrastructure can and should interact with energy infrastructure.

Doug Lewin

Yeah, I think that's really, really interesting and a really potent mix, one that is really needed at this point. We'll put in the show notes a link to a podcast, I believe it was Redefining Energy, that did an episode with Michael Terrell from Google. And he was describing how they're not there yet, I think, if I remember right, but he was talking about where this might be headed. With what you're talking about with inference, where the task needs to be done, you need to know, is that a stop sign or a yield sign? You can't wait till later. But they could move those functions, and you sort of suggested this just a minute ago, but just to dive a little deeper into it, you could move those functions to different data centers based on how much energy is available.

And I was sort of getting this image in my head of like, there's the saying, the wind's always blowing somewhere, the sun's always shining somewhere. And so if you picture some of these data functions actually moving around the world with the sun, so that solar power is providing a lot of these tasks, but at different data centers. And you don't really care as a customer, as an end user, I don't care what's happening in Iceland or in Texas, right? It doesn't matter as long as when I need the information, it's there, right? And so you could kind of see some really interesting, and it's going to take this close integration of, as you're describing, this sort of energy expertise and the data center expertise for where is that possible, where is it not.

Cully Cavness

Yeah, I think you're probably more right than wrong. And if you think about what that means from an infrastructure build perspective, it's a huge undertaking. It's a global rebuilding of the digital infrastructure, the fiber infrastructure, and all the energy that has to go into it. I mean, we're talking about many trillions of dollars of capital investment. On that specific idea, there's this latency constraint, which is that some things need to be really fast. If it's a stop sign, you really need to know it's a stop sign, and you need to know it immediately. There are other forms of inference where you probably could wait a couple hundred milliseconds to get the ping back from Asia. If it was sunny in Asia and nighttime here, you know, you could actually imagine that. Right now, most of the latency, if you type a prompt into ChatGPT, is still the model itself doing the computing. But as the models get more efficient, that portion is getting smaller, and the percentage that's borne by the networking latency is going to become larger as that happens. And so there will eventually be this interesting question of how close the workload needs to be to the problem. And for some, it'll have to be very close. And for others, it could maybe be anywhere on Earth.
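
A quick illustration of that shifting latency budget, with made-up numbers: as the model's own compute time shrinks, a fixed network round trip becomes a larger share of what the user experiences.

```python
# Illustrative latency budget: as model compute time shrinks, a fixed network
# round trip becomes a larger share of the total. All numbers are made up.

NETWORK_RTT_MS = 200   # e.g., a round trip to a data center on another continent

for model_ms in (2000, 500, 200):
    total = model_ms + NETWORK_RTT_MS
    share = NETWORK_RTT_MS / total
    print(f"model {model_ms:>4} ms + network {NETWORK_RTT_MS} ms "
          f"-> network is {share:.0%} of total latency")
```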

Doug Lewin

You were also talking earlier about how much of these loads are actually cooling loads, right? And so there's probably also some demand flexibility there, I would think. I don't know, so let me phrase it in the form of a question. On the residential side, for instance, if you have a home that is well insulated, you can pre-cool your home. Texas is a great example. It's going to be a hundred degrees basically every day this summer. We've already hit a hundred a couple of times, and as we're recording, we're not even out of May yet. But you know, you have all these days that are a hundred degrees, tons of solar power, pretty low prices at two and three o'clock in the afternoon. It's really not an issue then. The issue on the grid, the tightness, is going to be at seven, eight, nine o'clock as the sun is going down. So you pre-cool the home and you use less in those evening hours. Is there the ability to do some of that at data centers too, where you're actually making it a little colder ahead of time, or does it have to stay at one temperature that can't vary?

Cully Cavness

It's more of a run rate kind of problem. I mean, you've got just 100 megawatts of heat being produced 24-7, and that has to be evacuated on a very continuous basis. There's a little bit of thermal inertia if you were to pre-cool, but…

Doug Lewin

Not as much as with a home.

Cully Cavness 

Never say never. I mean, maybe you get a big reservoir of cool water and you kind of like pre-chill some big thermal mass or something. But in your normal data center, it's less of an opportunity. There's definitely a lot of efficiency opportunities. Having the most efficient chillers and being thoughtful on the designs, the engineering, there's a lot to do there. 

Doug Lewin

And there's efficiency in the chips as well, right? Can you talk about that? You were talking about the Nvidia chips and they're obviously, they use a lot of power right now, but I think Nvidia, at one of their last events did talk about how their chips are more efficient. Are you seeing that on the data center side?

Cully Cavness

The chips are more efficient, but there are more chips and bigger chips. And so the power consumption of the server isn't necessarily declining. In fact, it might even be rising. I mean, again, if you back up to just kind of first principles of what's going on here, the sophistication of making these chips has gotten to such an insane point that there has to be some kind of tailing off of the efficiency gains.

So just as an example of how these things are made. They're mostly made in Taiwan and there's a factory there owned by TSMC. A lot of people might have heard of this company. It's the largest semiconductor fab in the world. And they have this machine which is made by a company called ASML. And this machine costs something like $800 million per machine. And it's transported in like four jumbo aircraft to Taiwan.

And they have 80 plus of these machines in TSMC, as far as I understand. Each one of these things is like a major installation engineering project to put this thing together. And then once it's there, the way it works is they're liquefying a droplet of metal that's falling into like a convergence of a bunch of lasers. These lasers are blowing up this droplet of metal into a very specific wavelength of ultraviolet light that bounces off a bunch of mirrors. And then it goes on to a piece of silicon that's like 10 atoms thick or less. 

Doug Lewin

Wow.

Cully Cavness

And it's etching a pattern of circuitry into this and then another deposition of a substrate and another blast of this extreme ultraviolet light. And it's doing this like tens of thousands of times per second across this whole factory. 

When you're measuring things in just, like, atoms of thickness, you're running up against real boundaries of what physics can do, and I don't know how much you can just depend on the chips getting more efficient.

Are there different versions of chips and different designs of chips coming that take definitely different approaches? Yes, absolutely. There are some new startups that have released new chips that look like they could be way more efficient on the inference side, for example. On the training piece, it's a little bit less clear that there are huge gains still to be made there. But again, never say never when it comes to technology. I'm just pointing out that they've been at this for decades now and it's gotten really optimized. And there are certain pieces of it that might be kind of asymptoting.

Doug Lewin

Yeah. And it's a little bit of a law of diminishing returns, right? At a certain point you reach that kind of limit. I mean, we're seeing this, and we're not there yet, there are still more cost declines coming in solar and storage, but when you see those curves, you can only get so close to zero before there wouldn't be any money left to be made in it. So at some point you hit that terminal point.

Cully Cavness

Maybe. I mean, I just worry when people say it's going to be solved by more efficient chips. There might be limits there.

Doug Lewin 

No, that makes a lot of sense. That makes a lot of sense. All right, so let's come back to something you mentioned earlier. You were talking about carbon accounting and the location-based approach. There's, again, a lot of emphasis among some of the major tech companies, and I'm seeing this get talked about a lot more, on what is often called 24/7. So you were describing this earlier. A lot of companies want to have 100% clean power, 100% renewable power. Sometimes they describe it in different ways. But what they're trying to do is hourly matching. You were talking about location matching. Are you seeing more and more companies actually asking about that sort of thing, actually getting much more granular about carbon accounting? Obviously the majors, the Googles and the Microsofts, are doing that sort of thing. Is it getting beyond those companies, or is it still pretty concentrated in a few companies that are doing that kind of work?

Cully Cavness

I think it's still kind of early days on a bunch of this and they keep modifying the rules and editing them over time as I think people just realize there are better approaches to carbon accounting and the standards that go into it. 

You know, location matching is important, but it's also really important what's happening at that location. I guess that was the point I was trying to make: when you add a load to a specific node, it causes a specific set of energy generation resources to get called up. And some of those situations produce a lot more emissions than others would. So first, just be thoughtful about the location and the realities on the grid of what happens when you add load there. Then to me, the second thing is, okay, you're buying RECs and you're providing that economic incentive for the renewables to get developed, which is valuable. It's a great thing that we have that mechanism in the market. But to me, the bigger question is that physical reality of where the load was located.

Doug Lewin

For sure. And you're starting to see, I think they're called T-EACs, a time-based energy attribute certificate or something like that. There's a time-based attribute kind of accounting that is already going on. It'd be interesting to see if that could be extended to location as well. I think it's a really interesting idea. I'm embarrassed to say I hadn't really thought about it before, but it's a little bit embedded in the time thing, because where you are would matter, but it's not explicit, and it probably could be teased out more. It's really interesting.

So I want to ask you just a couple more things before we wind down here. A couple of years ago you wrote an article on Medium. We'll put a link to it in the show notes. Tally's Law and the Energy Transition. Can you talk a little bit about, give people a little preview if they go and read that. What are they going to see? What's sort of your thesis in that piece?

Cully Cavness

Yeah, I mean, this goes back to some of that early, my early experiences in the energy industry and wrestling with just the identity of wanting to work in the energy industry. I mean, I think it's the most fascinating industry. It impacts people. It impacts human lives in a lot of really positive ways. Just having access to any form of energy is huge and transformational for so many people. And back in the early 2000s when I went from Iceland to China, China's pretty wealthy, but in the mid 2000s, it was still pretty rough for the majority of the people there. And most of them could not care less if the power was coal or solar or wind or the climate impacts of any of it. And they just wanted to have the economic means to have another meal that day, you know, like to add protein to their diet, to upgrade from a bicycle to a motor scooter or from a motor scooter to a small car or something like that.

And, you know, starting in Iceland, it's the exact opposite. It's like this abundance of riches of unlimited renewable power. And it's very easy to say, yes, we'll just have 100% renewable power. We only have 300,000 people and we have more hydro and geothermal than we know what to do with. So just two very different realities for those two countries at that time. 

And it led me to that framework of the triangle of E's: energy, the economy, the environment. And I sort of formed a formula for how I thought about the world along these lines, which was RT equals PQV. And people can look it up if they want to, but basically it's natural resources times technology on one side of the equation, and population, quality of life, and environmental health on the other side. It's a mental model for, if you change one of these five variables, what happens? So if you increase population, or increase the quality of life of a population, and you don't have any change on the energy technology side or the natural resource side, then the variable that has to fall is V, environmental health. And conversely, if you can innovate with technology and increase the left side of the equation, you could increase population, you could increase quality of life, without impacting environmental health.
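
Written out, the relationship Cully describes (paraphrasing his variable names; see his article linked in the show notes for the full treatment) is:

```latex
R \cdot T = P \cdot Q \cdot V
\quad\Longrightarrow\quad
V = \frac{R \cdot T}{P \cdot Q}
```

with R for natural resources, T for technology, P for population, Q for quality of life, and V for environmental health. Holding R and T fixed, any rise in P or Q pushes V down; raising T lets P and Q rise without V falling, which is exactly the trade-off he just walked through.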

And, you know, there's more to it than that, but I think that's kind of a simple way to summarize the viewpoint. And I guess I do take a little bit of the techno-optimist's viewpoint, in the sense that I do think technology will largely be the solution, or it won't be, but we're definitely not shutting off the energy. And buying more time for technology to end up being the solution is really the most important thing we can do right now. So the low-hanging fruit is things like waste, things like methane emissions. Let's solve those now so we have more time for the technology to grow and increase the T variable, so that we can continue to have better quality of life and more people without having a big climate impact. And I'm hopeful on that. I mean, I think there are a lot of things on the horizon that look great. It appears carbon capture and sequestration is getting there. A lot of interesting things are happening with batteries. Radia just came out with this new wind turbine that looks incredible. I don't know if you've looked into this company at all. You should have Mark on your podcast if you want to talk to a really interesting entrepreneur in the wind space right now.

You can go on and on, all the nuclear stuff that's happening. There's a bunch to be excited about, but it's clearly going to take a while for this to mature and really scale up. So again, back to extending the climate runway: we've got to stop emitting all this methane just needlessly. We've got to stop wasting energy. Let's stop curtailing renewables. Let's tap into the latent stranded renewables where we can. If we can locate a load in Iceland versus on the Eastern Seaboard, where it's going to be much more fossil oriented, maybe that's a good idea. So again, we're going energy first, trying to be thoughtful about these things and extend the climate runway while still accommodating this wave of AI demand, which is clearly happening. So it's just sort of, how can you influence it to be least impactful?

Doug Lewin 

Yeah, I mean, it's happening and I think we're also gonna see increasing demand for all sorts of different things, including quality of life, right? There's still, like you're talking about China 15, 20 years ago, there's still a billion people in the world that don't have access to electricity and they need to get it. That's just wrong. We've got to figure out ways. So that formula you just described I think is really, really interesting. 

I'll tell you, I think we're all creatures of our experiences, and obviously any two people have had different experiences. I do think technology will play a big role in it, but I think there's this kind of interplay happening all the time between technology and markets and policy, right? You could have the greatest technology, but what if the policymakers set up a system where they're favoring other technologies or just not allowing market entry? There's this interplay between those things, because you can only hold back the water, if you will, hold back the deluge, so long, and a technology, like water flowing, will find its way in even where policy is trying to keep things out. But policy can really slow things down, or to put it more positively, policy can really enable markets to bring technologies in really quickly.

Anyway, while technology will have a big role, I do think there's always going to be this interplay. I don't know if you have any thoughts on that, any place in particular where you're operating where you see policy standing in the way or making things easier. I kind of like this Texas market, because, as you were describing earlier when I asked you about flexibility, it's going to depend on the customer, right? We have a market that kind of accommodates that. If you're willing to pay thousands of dollars a megawatt hour, you can have the power whenever you want it, but everybody's going to have their price as to when they're willing to curtail. So there's an example of a market enabling technologies and flexibility. That's mostly on the large customer side. We need more of it on the small customer side, but that's a topic for a different podcast.

Cully Cavness

This isn't as close to our business, but I would really love to see a streamlining of nuclear permitting. I just think there's so much potential there and the fear of it has led to a permitting regime that makes it almost impossible to get anything done. And it's sort of disproportionate to the risks at this point. I think if you really talk to experts, they would say that, which I'm not a nuclear expert, but I've been to how many hundreds of energy conferences, I've heard enough to know that it's really, really safe. When it goes wrong, it goes really wrong, but other things go a little bit wrong all the time. And the actual impact on human health and the environment is probably worse from a lot of other energy sources than nuclear running very safely for 99.999% of the time and having a big accident even once in a while, which it does appear they're even getting better at really minimizing that outside risk with the new generations of nuclear reactors. 

And updating our framework to allow more nuclear to get permitted and built would be huge. Obviously, grid transmission permitting is a big deal too. At some point we might have to get a little more aggressive with eminent domain or something. I don't know. You just can't let every potential objection get in the way of building out the really critical infrastructure that we need for the whole country. And there's a balance. Personal property rights are essential. They're like the cornerstone of the country. But there has to be a practical solution to get things done. And these things really need to get done. And so that's a hard problem. But we've got to figure that one out.

And I think, yeah, the Texas grid is a good example. I mean, you guys get a bad rap sometimes with some of the outages from the recent winter storms. But I would say, by and large, it is viewed as the epicenter of entrepreneurship and dynamic business models in the power industry because of the way it's been deregulated. And I think ERCOT's being pretty thoughtful. They're still regulating and controlling the really bad edge cases, right? There are caps on power pricing. There are curtailment mechanisms to avoid blackouts. Things are changing so fast that they're clearly not perfect yet, but they seem like they understand the problem and they're moving in the right direction really fast. But in the middle range, the normal course of business stuff, just let the free market operate and it'll tell you what kind of power resources you need. Do you need battery storage? Do you need peaker plants? Do you want more renewables? Let the market figure all that out, and don't pick too many winners there. That seems great. And also reducing the gatekeepers, the traditional grid players, the traditional utility players, and letting all that be a little bit more flexible so more participants can come in and find their niche. It just seems like it's on a really healthy track in Texas in general, despite some of the headlines. And look, I wasn't there when the power went out, and I know a lot of people were and got hammered by that in really bad ways. But maybe, in the ultimate grand view, it's a price that was paid for making the grid a lot better in the long run, and making it this kind of free market approach where a lot of really cool innovations can come in and make the grid much better 20 and 50 and 100 years from now.

Doug Lewin

And I think for anybody listening who wants to dive more into Winter Storm Uri, we'll put a link in the show notes to the very first episode of the Energy Capital Podcast we recorded, which was with former Commissioner Will McAdams. He was just a week or two out of the Public Utility Commission when we recorded it. And we talked there about what happened during Uri and how it really wasn't a quote unquote market failure. It was really that we weren't yet regulating, and I would argue we're still not regulating, natural gas supply to be winterized. Power plants weren't well enough winterized. And we hadn't weatherized homes and buildings, so demand was kind of off the charts. And that's something the PUC continues to look at and may do more on.

On the nuclear piece, and we're going to have lengthy show notes for this one, this is good, we've covered a lot of ground, we will put a link. There is an advanced nuclear working group at the Public Utility Commission. Governor Abbott put Commissioner Glotfelty in charge of that. I'm really intrigued by nuclear. I think, you know, just today, the day we're recording, Secretary Granholm, obviously in the Biden Administration, was at a ribbon cutting in Georgia for the new nuclear plants. And she said, we need 200 more. We've got two here; we need 198 more. It's one of the few areas where I think Republicans and Democrats seem to really want to pursue a similar solution. And to your point about the dangers of nuclear versus other energy sources, we'll put a link to a great podcast Dave Roberts did on Volts with Jigar Shah, comparing, for instance, the radiation from coal ash sitting around on coal sites to what's at a nuclear plant. It's far worse, what is just sitting around basically uncovered in these coal ash retention ponds.

Cully Cavness

That's what I was sort of referencing. I mean, there's this, people are afraid of crashing in an airplane much more than crashing in a car, right? Because the airplane crash, the mental model is so gruesome and horrible and the car crash seems like something you do every day. So it's not something to be as afraid of. But the reality is the car is gonna kill you a hundred times more often than the airplane is gonna kill you.

Doug Lewin 

Yeah, it's like 40,000 deaths a year in automobiles or something like that and a very low number…

Cully Cavness

I think it's a little bit like that with nuclear. I mean, it's scary because you think of Chernobyl, but if you actually do the kind of risk math, it's not the scary one to be worried about.

Doug Lewin 

Yeah. Cully, I really appreciate you doing this. The model Crusoe has, really siting big energy loads next to areas where energy is so abundant that it's often wasted, is really fascinating. I'm really thrilled to hear you're in Texas and doing things here. We'll definitely want to follow your company very closely, and I encourage our audience to do the same. Is there anything else you'd like to say? Anything I should have asked you that I didn't that you'd like to talk about before we end?

Cully Cavness

You know, we're hiring, and if anybody wants to join the team, we'd love to have you. We've got a really special team of talented, entrepreneurial, hard-working, creative folks here, and we need more of them. So check out our website, Crusoe.Ai, and look at the careers page. I'm recruiting for a Chief of Staff, which I'm really excited about. So if anybody wants to check that role out in particular, I'm just starting to look through resumes right now. Yeah, other than that, we're just happy to introduce ourselves to your audience, and we'd love to stay in touch and do it again sometime.

Doug Lewin

Thanks so much, Cully. And if you send us one, we'll put a link to where that job posting is. And yeah, it's got to be on the short list of most interesting places if somebody really wants a front row seat to the energy transition, AI data centers, rising load growth, all these things that are going on. What an interesting position you have open. So we'll be sure to put a link to that as well. I learned a lot from this discussion. I really appreciate you taking the time. Thanks so much, Cully.

Cully Cavness

Me too. Thanks for having me.
