Sept. 3, 2024

AI Supercomputing, Rural Development, and Data Centers with Applied Digital CEO Wes Cummins

Ann Berry (00:00):
I'm Ann Berry. 

Katie Perry (00:01):
I'm Katie Perry. And this is After Earnings. 

Ann Berry (00:04):
And it's a big one, Katie. Huge. It is the day after Nvidia earnings. 

Katie Perry (00:08):
Super Bowl. Super... 

Ann Berry (00:10):
...Bowl. 

Katie Perry (00:10):
Or no, Taylor Swift. But... 

Ann Berry (00:12):
Watch parties. People are gathering... 

Katie Perry (00:13):
A lot of people in bars, drinking, watching what was happening. Crazy. 

Ann Berry (00:17):
Well, I wonder what the Swiftie equivalent for Nvidia is. We'll have to come up with a good name for it. But this was a fascinating one, right? Nvidia earnings were pretty strong, yet the stock is down. This has been the poster child for the AI revolution: good earnings, stock is down. So what do we do? 

Katie Perry (00:30):
Yeah, and naturally, if the play is not the chip maker, where else do you go? What do you think about that, Ann? What else is in that value chain that you see opportunity in? 

Ann Berry (00:39):
Well, that's why I'm really excited for this conversation we've got coming up, Katie, with Applied Digital. Because to your point, they're not chips, and they're also not the software using the AI. They're something else: they're infrastructure, they're the data centers. Not an area I know a ton about, so I'm really excited for... 

Katie Perry (00:53):
...this one. Same. We talk a lot on this show about the cloud, and this is sort of like not the cloud: it's the physical locations, the size of multiple shopping malls, huge, that are making this supercomputing possible. And even more interesting is where these data centers are being placed in the country, places like North Dakota, where frankly a lot of these rural areas have been left behind by the AI and tech revolution. 

Ann Berry (01:16):
A lot to learn. So let's get into it. Let's bring in CEO and co-founder of Applied Digital, Wes Cummins. 

Katie Perry (01:22):
Welcome to the show. Wes, it's so great to have you here with us today. 

Wes Cummins (01:27):
Thanks for having me. I appreciate it. 

Katie Perry (01:28):
So let's get right into it. Let's talk about Applied Digital. What is it that you guys do, and where do you fall in the AI value chain? 

Wes Cummins (01:36):
Sure. Applied Digital, what we say is we build next generation digital infrastructure. We're building the foundation, the building blocks, primarily data centers, for AI workloads. Those data centers are just a different style of data center than what has been built and used in the past for applications like the video streaming that we're doing now, or TikTok or YouTube. That was all built in metro areas and population centers, at pretty low power density. Now we're building ultra-high power density, ultra-low latency inside the data center for supercomputing. Supercomputing is really what drives AI, and it needs a different style of infrastructure to support it. So what we're building is really purpose-built facilities for Nvidia GPUs right now. We also run a compute platform where we actually own the GPUs and operate them for our customers; these are typically AI labs or AI startup companies. So those are the two primary products we provide, but really, it's the foundation for what runs AI and operates it in the world. 

Ann Berry (02:44):
Wes, you made a reference there to how prior data centers tended to be located in heavily populated areas. Talk to us a little bit about the geographic footprint of your portfolio of data centers right now. 

Wes Cummins (02:55):
Sure. So we're building our first big campus in a town by the name of Ellendale, North Dakota. This is not a traditional data center market. It's a place where we've found a significant amount of abundant power that doesn't really have the grid to move the power from the point of generation to the point of load, or where someone would use it. We sit at a substation where two gigawatts of wind power feed in, and when the wind is blowing, the grid can't really transport all of that out to users. The number one ingredient for AI and for these workloads is power. So we're able to do large scale power, and latency outside of the data center is less important, so the footprint is very different. We prefer cold locations, because when you use a lot of power, you generate a lot of heat and need a lot of cooling, so free ambient cooling outside is very useful. And then low cost power is helpful as well. So power and cooling is what we focus on. 

Katie Perry (03:59):
Cool. And I know on the earnings call there was a pretty pointed question from an analyst, essentially: if someone's looking at multiple contracts, you guys versus other people, why you guys, why are you different? And you mentioned something about your approach to where you're finding these energy sources and building from. As part of that, can you just explain how this next gen approach is different from some of the legacy systems that are in place today? 

Wes Cummins (04:23):
Sure. So in the legacy system, when you were doing development for a legacy data center, you would go to the location first. You wanted something near New York City, for example, where we're all sitting, or something in Dallas, or something in Phoenix. You would go to the location, look at the fiber map, and pick the right place from a fiber connectivity perspective. You'd buy real estate, then you would submit an application to the local power provider for X number of megawatts. The issue now is, people would still do that, but when you make those submissions in, for example, Northern Virginia, the biggest data center market in the world, you're 10 years out on power, especially large amounts of power. So we reversed that. One of the things that's interesting about our company is we've been doing this for a couple of years already, which isn't that long, but you've got to remember that this really kicked off 18 months ago when the Nvidia H100 was introduced, right after ChatGPT hit. So we were ahead of that. We find power first, then land and fiber, and we make sure we have all the ingredients, but we focus on the power first. We take these workloads to the point of power generation, rather than going to the point of population and transporting power in to run these workloads. 

Ann Berry (05:39):
Wes, if that's been your edge so far, let's talk about how that might change for Applied Digital with the advent of NVIDIA's new chips. I saw the first page of one of your previous investor presentations has NVIDIA's logo up there as Applied Digital's strategic partner, and with Nvidia earnings just out, there's a lot of chatter around the new Blackwell chip: faster, but also consuming less energy. So if your key strategic partner is now launching a product where a key aspect of its strength is reduced energy consumption, does that change the way in which Applied Digital competes versus other providers? 

Wes Cummins (06:17):
So we should clarify this: it consumes less energy for the output, but they've made the chips more dense, so the server itself consumes more energy. You just get more flops per watt consumed, if that makes sense. So the infrastructure, from the power density perspective, actually goes up, not down; you're just getting more out of it. It's like when you had a laptop that was really power inefficient 20 years ago: your laptop now maybe uses the same amount of power, but you're getting so much more out of it. So the infrastructure portion is not going to change. The big change that's coming, and we've purpose-built our facility in North Dakota for this, I think ours is going to be one of the first large scale buildings for it, is the move to liquid cooling. This is going to be one of the first very large scale liquid-to-liquid cooling systems in the world, in our facility in Ellendale, North Dakota. So I think it's just important to make that distinction: yes, it's getting more efficient, but the infrastructure power density is still going to go up. 

Ann Berry (07:22):
Wes, let's talk a little bit about the recent results. Slightly mixed Q4 and fiscal year '24 results. I know that there was some reflection on underlying performance, and there were a couple of one-offs that also hit your earnings numbers, like a power outage, for example. Why don't you just give us the highlights from the Applied Digital earnings that have just come out. 

Wes Cummins (07:44):
Sure. So on the positive side, significant year over year revenue growth and significant year over year EBITDA growth. On the negative side, we had some operational issues at one of our facilities, which I should clarify: when we started, we were building data centers for Bitcoin miners, which are built much differently than what we're building today for AI, just from a redundancy perspective. So we had some outages due to some transformer issues that kind of hampered operations in the quarter. Those are completely resolved at this point, so we won't have that as we operate going forward. The other item I would highlight, and we talked about this on our call, is that while our underlying financial performance I think improved, the way we account for and depreciate the GPUs, the Nvidia gear that we're buying, means the financial depreciation doesn't reflect the useful life in our numbers, and we're working on a different financing mechanism which will change that dramatically. To explain it in somewhat layman's terms: we are purchasing GPUs and servers that have a five, six, seven year useful life, but we're expensing those assets over two years. So it has a drag on our EPS number, but we're working to remedy that in future quarters so that the actual financial performance shows in our results. But we're still seeing significant year over year revenue growth and significant year over year EBITDA growth, even with the operational challenges we had in the quarter. 
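The depreciation drag Cummins describes can be sketched with simple straight-line math; the GPU fleet cost below is an illustrative round number, not a disclosed Applied Digital figure:

```python
# Straight-line depreciation sketch of the GPU accounting drag described above.
# The fleet cost is a hypothetical assumption, not a company number.
gpu_capex = 100_000_000  # assumed cost of a GPU/server fleet, in dollars

def annual_straight_line(cost: float, useful_life_years: float) -> float:
    """Annual depreciation expense under a straight-line schedule."""
    return cost / useful_life_years

# Expensing over 2 years vs. the roughly 6-year economic useful life he cites:
aggressive = annual_straight_line(gpu_capex, 2)  # $50M per year hits the P&L
economic = annual_straight_line(gpu_capex, 6)    # roughly $16.7M per year

# The EPS drag is the gap between the two schedules in the early years:
extra_expense = aggressive - economic
print(f"Extra annual expense under the 2-year schedule: ${extra_expense:,.0f}")
```

The same hardware dollars hit earnings three times as fast under the two-year schedule, which is why he argues reported EPS understates the economics.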

Ann Berry (09:16):
Let's talk a little bit about how your customer base is evolving. Katie, I know that you've particularly focused on that. 

Katie Perry (09:21):
You mentioned on the call a huge LOI with what you refer to as a hyperscaler. I think some of our listeners might not know what that is. So first off, can you frame up what hyperscaler means in your context, and also add some color on what this LOI, if it comes to fruition, will mean for your business? 

Wes Cummins (09:39):
Sure. So hyperscaler is kind of exactly what the word means: a very large scale data center operator. We're developing and will operate the data center for them; they put their compute workload inside, and they generally sell it as a cloud service, or they may consume it themselves. When people refer to hyperscalers here in the US, and we said US-based hyperscaler, there's a handful of them, but if you focus on the top five hyperscalers in the US, you're talking about Amazon, Microsoft, Google, Meta, and Apple. Those are the types of companies that we're targeting for our business and for this large scale build out in North Dakota. And this will be one of the largest campuses in the US when we complete it: 600 megawatts of utility power, 400 megawatts of critical IT load. Just a massive, massive amount of power going in there. 

(10:33)
But those are the types of customers that we're focused on. What I talked about on the call is the way we're breaking into this market. We're a small company, and the way we're breaking in is our speed to market. We saw this, I think, a little earlier than other people. We started building this building in Ellendale last year on spec. We worked with NVIDIA's data center design team to design the facility so that we were kind of future-proofing it for Blackwell deployments. So we started building it on spec; we didn't have a customer. But then we did sign this LOI with a hyperscaler in April, and we've been working through the qualification process, what they call the SOQ. It's a lot of technical due diligence, design diligence, power redundancy, all of the things you need for a site that operates at five nines of reliability. We've made it through that process. We're not quite to a signed lease yet, but we're at, call it, the one-yard line here. I think that's really going to transform our company. It's a big stamp for us to get through that process and break into that market. It's a big market and one of the fastest growing markets, especially, again, with the AI workloads that have really transformed what's going on in the industry. 
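As a back-of-the-envelope on the campus figures quoted above, the gap between utility power and critical IT load implies a bound on power usage effectiveness (PUE); this is our arithmetic, not a company-disclosed metric:

```python
# Back-of-the-envelope on the Ellendale campus figures quoted above.
utility_power_mw = 600.0  # total utility power for the campus
critical_it_mw = 400.0    # critical IT load

# PUE (power usage effectiveness) = total facility power / IT equipment power.
# Treating utility capacity as total facility power is an assumption; it gives
# an upper bound, since the facility may not draw all utility capacity at once.
implied_pue_upper_bound = utility_power_mw / critical_it_mw
print(implied_pue_upper_bound)  # 1.5
```

The extra 200 MW of headroom above the IT load covers cooling, power conversion losses, and redundancy.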

Ann Berry (11:52):
Wes, the hyperscalers are fast growing, but many of the names you just listed as key potential hyperscaler clients, the ones you are in LOI stages with, are developing their own chips in competition with Nvidia over the longer run. In these negotiations, are you seeing any pressure to bake in an openness to change out your NVIDIA chips for a hyperscaler's own one down the line? 

Wes Cummins (12:16):
So just to be clear, in this type of building, we don't choose the equipment that goes inside. We don't choose whether it's Nvidia, or whether it's AMD, or whether it's their own gear. All of this gear is going to need the same style of infrastructure. So think of us as a high tech landlord: they put their own compute gear inside, we don't take the risk on that gear, it's theirs, they operate it. We just do smart hands and some operations on site. So we actually don't make that choice. The infrastructure is built in an agnostic way for the future, where what we know is it's going to need high power density. I don't think I mentioned this, but the traditional data centers we talked about, the ones doing streaming, are like seven kilowatts per rack, per cabinet of compute. Ours goes to 120. So, not quite 20x, but significantly higher. And everything in the future, when you go from communications-focused applications to compute-focused applications, compute equals power. So whatever their own chip design is going to be, it's still going to need this type of infrastructure, and that's what we're building. So we don't take that risk, which I think is nice. 
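The rack-density jump he describes works out like this, just simple arithmetic on the numbers quoted above:

```python
# Rack power density comparison, using the figures quoted in the conversation.
legacy_kw_per_rack = 7.0   # traditional streaming/communications data center
ai_kw_per_rack = 120.0     # the AI-focused design described here

ratio = ai_kw_per_rack / legacy_kw_per_rack
print(f"{ratio:.1f}x")  # prints "17.1x", i.e. "not quite 20x" as he says
```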

Ann Berry (13:31):
Interesting. Yeah. 

Katie Perry (13:32):
And I want to go back a bit to the revenue breakdown. I know you have the data center business and then this cloud business, GPU-as-a-service. What are the trends across those two lines of business? And as a follow-up, how much of the data center business is still tied to crypto, versus other use cases like AI supercomputing? 

Wes Cummins (13:52):
Sure. So on our data center services, right now it's all crypto revenue. We'll start recognizing revenue in the next calendar year from this large scale build we're doing in Ellendale. On the GPU business, what we call our cloud service business, that's up from zero last year to $17 million in the quarter, and we've guided to that going up sequentially again. But the way these work, Katie, is, think of it as we're building a hotel: you spend all of this CapEx and have no revenue, and then all of a sudden you have a huge jump in revenue, almost in a single day, when you open the hotel. That's how this big build works. It's a billion dollar building that we're standing up in North Dakota. We will spend all of that, then put the customer inside, and then it'll turn on, and we expect just that building to generate hundreds of millions in revenue. So there are going to be these big step functions in our business. You're seeing some very nice year over year growth, and you'll see some sequential growth in the current quarter, but you're going to see these big step functions in our business as these turn on. 

Ann Berry (15:10):
Wes, the analogy with other kinds of real estate, like hotels, is an interesting one. It does point a little bit to the capital intensity of your business model. What's your CapEx outlook at the moment? 

Wes Cummins (15:23):
So, as I mentioned, the building that we're working on is the first one; there will be three buildings in total on this campus. The first one is about a billion dollars of spend, and we've spent a little over $200 million on that building so far. As we cross the Rubicon here and work on signing this lease, what that will open up for us is the ability to finance these at the asset level. We've been financing them at the company level so far, which has been somewhat of a strain on us. But when we get this first one across the finish line, we're going to have the ability to finance at the asset level at very attractive costs of capital. On builds, you can finance, we think, somewhere between 80 and 90% of the cost with, think of it as, a construction loan, and then you flip that into what's effectively a mortgage; we call it an ABS at the end. So it's really going to lower the strain on the company, meaning the public company, and move the financing down to the asset level. The entire campus is going to be around $4 billion through the end of 2027, is the expectation. 
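A rough sketch of the financing split described here, using the quoted $1B building cost and an assumed 85% midpoint for the 80-90% construction-loan range:

```python
# Sketch of the asset-level financing structure described: a construction loan
# covering 80-90% of build cost, later refinanced into an ABS. The 85% figure
# is an assumed midpoint of that range, not a stated company number.
building_cost = 1_000_000_000  # roughly $1B for the first Ellendale building
loan_fraction = 0.85           # assumption: midpoint of the 80-90% range

construction_loan = building_cost * loan_fraction
equity_needed = building_cost - construction_loan

print(f"Debt-financed:  ${construction_loan:,.0f}")
print(f"Company equity: ${equity_needed:,.0f}")
```

Under these assumptions, only a minority slice of each build would sit on the public company's balance sheet, which is the reduced strain he refers to.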

Katie Perry (16:37):
This is actually a helpful breakdown of the nuances that go into this, and like Ann mentioned, the real estate reference is very helpful for framing. Is it your point of view, then, if this LOI goes through: you built this, as you said, as spec work, which I think a lot of us are familiar with in various industries, where you build first and then try to prove the concept and sell it. Is your thinking that once that goes through, you'll have people coming to you earlier in the process? Is that what you mean by the asset-level financing, that they're going to actually help you with some of that upfront cash? 

Wes Cummins (17:10):
So, again, I thought the way for us to break into this industry, this large, fast growing industry, was speed to market. I, and others in our company, had a strong opinion last year that in 2024, and especially going into 2025, data center capacity was going to be the bottleneck for AI. That's why we pushed forward on the build: we did the design, we did the build. But once we're through, once we're proven, then you can expect that we will not be building on spec. We will be contracted prior to building anything else in the future, but we'll be in a position to be able to do that where we wouldn't have been before. 

Ann Berry (17:53):
Wes, your business model is really interesting, because you found this real treasure in terms of this location and the access to power that you've described. Talk to us a bit about your pipeline of other locations. 

Wes Cummins (18:07):
Sure. So I mentioned, I think both in our press release and on the call last night, that we have started marketing three additional campuses totaling 1.4 gigawatts for 2026 power. That means power that's ready to be delivered in '26. 2025 power is kind of done at this point; if you haven't started building, you're not going to turn anything on in 2025. So we're out marketing these other large campuses to the same style of customers for 2026. It's a rather large pipeline of power that we're in the market with, which I think also makes us somewhat of a unique asset. 

Ann Berry (18:49):
And are you running into competition, Wes, in terms of people going after the same pockets of land? I know private equity has been circling data centers for quite a long time now and has been pretty successful in doing that. Who are you running up against in the bids for these locations? 

Wes Cummins (19:01):
Sure. So when we look at these locations, we see a couple of different types of players in the market. Building data centers for hyperscalers is not a new market; there's an industry, and there are several large players, all privately held, that we see primarily as competition. Now, they traditionally go to tier one, tier two cloud markets, the way I described: finding the real estate and then asking for the power. I think they're starting to look in more creative ways for power. There have been some other Bitcoin miners with power contracts that I think are looking at monetizing in this way as well. And then we see a lot of private players getting into this. It's a little bit of a gold rush now that people have figured this out; it really started earlier this year. The data center, again, back to my point, was going to be the bottleneck, and I think it really is going to prove to be a significant bottleneck for compute workloads for AI. So what has historically been a pretty boring market has become very exciting, and there are a lot of people interested in being in it. 

Katie Perry (20:20):
Even just flipping through your materials, I have so many questions on how this actually happens, how it comes to life. These things are essentially the size of shopping malls. These are huge, huge structures. 

Wes Cummins (20:32):
For the one in North Dakota, think of maybe five Costcos put together; that's the size. They're massive. 

Katie Perry (20:42):
And outside of the hyperscalers, which is going to be obviously a super important strategic customer base, are there any large customers, perhaps government entities or the government itself that would need to utilize services that you offer? Or is the hyperscaler audience really the one to unlock there? 

Wes Cummins (21:01):
So right now, for us, we're seeing it mostly at the hyperscalers. There are definitely other customers out in the market; obviously OpenAI is a huge consumer of compute, and there are companies like that as well. But there's a big group of companies, and then this is going to move into enterprise. We're seeing that in the market, where on our cloud services business it was mostly AI labs and AI startups, and now we're starting to see a lot more interest from enterprise-level customers that have probably been working on what they're going to do with AI, and what the strategy is, for the past 12 or 18 months. There are kind of two ways this goes: you move it into development, which is when you do training on GPUs, and then when you move your model into production, that's when you do inference. And they're definitely moving into the development area. 

(21:55)
And so this is going to be all types of companies. What we're starting to see first is what I would call enterprise software tech, more tech-leaning companies. We're starting to see it from financial institutions, large banks; we're seeing it from healthcare companies; and you're going to see it from consumer companies. You're going to see all of these companies figure out how AI fits in their strategy, and they're all going to be customers of this style of compute. Now, how does that roll out through our products? Right now we're focused on the hyperscale market. I think later there will be a chance for us to focus on the enterprise market, but it's a pretty broad spectrum of potential customers out there. We just see the largest percentage of the demand in the hyperscale market. 

Ann Berry (22:40):
Wes, when you think about the many different ways that you could go in your growth strategy, what are the top two right now? What's grabbing your attention? 

Wes Cummins (22:50):
We have such a big opportunity on these data center builds. I mean, just massive, massive projects, and then several campuses. So we're hyper-focused on how much power we can bring on in these buildings through the end of '27, because I think we can sell everything that we can bring on. And then on our GPU cloud business, we're pushing that hard towards the enterprise market. As I was mentioning, those customers are starting to come into the market, and I think over the next 12 to 18 months there's a lot of runway for growth on that part of the business as well. 

Katie Perry (23:30):
Yeah. So Wes, you mentioned being a small company, and I looked on LinkedIn and currently you're listed at under 200 employees. So I'm curious what the headcount is. I'm always interested in public companies that have big opportunities but are fairly lean. Can you talk about how you look at headcount, where most of your talent is located now within the organization, and what that could look like, for example, if you do go through with this hyperscaler client? 

Wes Cummins (23:59):
So we are still under 200. I think we're about 175; I don't have the exact number. We've been hiring pretty aggressively, especially for the talent we need in that specific area, on the data center side and also on the cloud side. If you look back since May, we've made some announcements of some pretty key hires for us, really talented people coming our way. We get a lot of inbound, which is great. The industry is fairly small from a gossip perspective, and I think generally people know what we have going on, so we have a lot of people interested in joining the next generation of infrastructure. So we're hiring aggressively, but we have a great team, and we're accomplishing a lot without a huge number of people. We're headquartered in Dallas, and we have two locations in Dallas, so that's where a lot of the people are. Then obviously we have operations on site in North Dakota, and, like every company these days, we have some people in different parts of the country who mostly work from their home offices. But most of it is in Dallas. 

Ann Berry (25:13):
Wes, how do you think about M&A as a potential avenue for growth? Obviously you've got the acquisition of new pieces of real estate, which I'd put as your core competency, but in terms of consolidation across data center creators like yourself, do you think there's scope to accelerate your growth in that fashion? 

Wes Cummins (25:31):
I think maybe there's a chance for us in the future to do that, but right now there's so much organic growth for us from a development perspective that I don't think we could take on M&A projects. Again, the scale of what we're doing now: even if we were only doing our Ellendale campus, just that single campus is such a large project, and we're going to be doing more than that. With the amount of organic growth that we have in our pipeline, it doesn't make a lot of sense for us to be looking at M&A at the moment. 

Katie Perry (26:07):
I love doing these shows, and Ann and I talk all the time about getting someone with very deep expertise in a specific area that not a lot of people outside of that industry are talking about. So I'm curious: in your conversations with other business people, perhaps relatives or family, when you're talking about what you're building and your ambitions, what is something about either the process or the opportunity that would tend to surprise people who aren't as close to it as you are? 

Wes Cummins (26:38):
Just in this market in general? 

Katie Perry (26:41):
In terms of what you're building, or just what you see when you look ahead. If you're in a conversation with an in-law, what would blow the regular person's mind as it pertains to what you're doing? 

Wes Cummins (26:57):
So I think the most interesting thing about what we are doing is that we are building leading edge facilities, which are going to house some of the most advanced supercomputers in the world, in rural America. Ellendale is a town of just over a thousand people in North Dakota. I mean, imagine a project that is going to be one of the largest supercomputers in the world when it's finished going into a small town. That's going to bring a lot of high-paying, great jobs, a lot of economic benefits to the area. So I think what's really interesting about the whole AI industry at large is that we're able to, in many ways, revitalize some of rural America and bring tech jobs into locations where that never would have been possible before. I myself grew up in a town of 200 people in Idaho, so for me it's kind of special, I guess, to be able to take this type of infrastructure project into a location just like where I grew up and create some really interesting, high-paying tech jobs. 

Ann Berry (28:11):
I'd love to dig into that a bit more. If we take the Ellendale campus that's underway right now, how many jobs are forecast to be created in that location? 

Wes Cummins (28:21):
So it's a little bit of a moving target as we iron out all of the operations; that's one of the big key pieces we're building out now. But you're going to be looking at north of 200 jobs in that location, and again, for a town that size, that's significant. I don't know if we press released this, but we get a lot of coverage in North Dakota: we're building houses. We partnered with a developer to build houses in the town, and we're working on a lot of infrastructure services there. There are a lot of things we have to bring in so that the town can support the workers we need at those facilities. It's an interesting dynamic, and actually quite fun, that you're solving problems you never would have thought you would be involved in. From a business perspective, our job is to build and operate this data center, but in these types of locations there are a lot of other things you need to build out so that you can have the people there to build and operate the data center. That's been a challenge, I would say, but it's also been a pretty fun challenge for us, and just something I didn't expect. 

Ann Berry (29:34):
Wes, what you've just described is an anecdote in some ways, but an important one, that's emblematic of a shift in the needs of the labor force here in the US with respect to skills that lend themselves to fueling more investment in AI. You've talked about the infrastructure to support the people who will be employed around Ellendale. Talk to us about the education and the skill levels: how much training is required locally? 

Wes Cummins (30:01):
So in the history of our company, we've hired a lot of people locally and trained them with the skills necessary to operate in our facilities. I expect that to be the same here, as long as we can find the people. There are going to be some skill sets that will have to be brought in externally for this, but for the most part, we just have a history of hiring and training up people. One of our managers at one of our facilities was working at Walmart in the town before we hired him. He came in as a low-level operator, did a great job, learned everything, and then when we built a new facility, he became the manager of that facility. Those are great opportunities that we create, and we try to keep that local. I think that's, again, probably the most unique thing about what we're doing. 

Katie Perry (30:52):
Yeah, I love that. Wes, I also grew up in a small town, and it was recently announced that Micron was opening a big plant there. Everyone I know from back home is talking about it, and there have been a lot of conversations around how we bring jobs back to these rural areas. So I love that you brought that up. We're so excited to see where you guys will go next, and we hope you'll come back on the show sometime and give us some updates. I want to thank you so much for taking the time after your earnings call. You have a busy schedule, and we really appreciate you coming on to join us today. 

Wes Cummins (31:24):
Thank you so much for having me. 

Katie Perry (31:25):
I'm Katie Perry. 

Ann Berry (31:26):
And I'm Ann Berry. Thanks for tuning in to After Earnings, the show that brings you up close and personal with the executives behind the world's most interesting publicly traded companies. 

Katie Perry (31:36):
If you learned something today, don't forget to subscribe, like, and share with your friends. Upcoming episodes will feature CEOs and CFOs from a major semiconductor player, one of the world's largest pharmaceutical companies, and more. 

Ann Berry (31:50):
Quick announcement: in just a few weeks, we are bringing you a special edition of the show, and you are invited to be a part of it. Affirm's CFO, Michael Linford, is coming back to chat with us after a stellar quarter, and if you are a shareholder of Affirm, you'll be able to submit your question for Michael before the show. 

Katie Perry (32:06):
This is a show first, and it's something we plan to do a lot more of in the future. Our guests are leaning in and finding ways to connect with their shareholders in more direct and ongoing capacities. So follow us on social media at After Earnings for more details.