We've reached peak earnings season, and you're listening to Motley Fool Money. I'm Mary Long, joined today by Asit Sharma. Asit, thanks for being here. Mary, thank you for having me once again.
I'm glad you're here. I'm glad we got to talk for a solid 22 minutes before we even started recording. I'm sure we'll continue that conversation afterwards. We've wasted enough of Rick and Ricky's time. Now let's get to the show. Who's our next victim? Our listeners.
We'll put things on a platter for the listeners. We had a lot of big tech earnings out yesterday. We'll kick things off with Meta. Maybe let's start by focusing on their expenses, because that's a big part of the story and the focus here, in particular the capital expenditures being spent to rev up and build up AI infrastructure.
Meta plans to continue increasing their capital expenditures and invest between $60 billion and $65 billion into building a 4 million square foot data center campus to energize AI development. That's about 70% higher than what Wall Street analysts were expecting that CapEx number to be. Both Meta and Microsoft, which we'll talk about a bit later on in the show, are now spending roughly 30% of their annual revenues on capital expenditures.
Asit, what are they hoping to get from all of this spend? You know, Mary, I'm sort of getting tired of all these hold my beer moments when companies announce their CapEx spend. Nonetheless, I do think they're in a difficult spot. It's a good difficult spot to be in. But if you're Satya Nadella or if you're Mark Zuckerberg or any of these other tech titans, what you're trying to do is to build out the capacity for business that you have a good purview into. So Microsoft talked about its huge
backlog numbers for remaining performance obligations, which sort of signal the demand for their AI infrastructure in the future. So the business is there. They want to build the capacity to get that business. There's another element that they're trying to optimize for, which is to build it out so that their fellow titans don't take the market share from them. So they feel forced, committed, to invest for that reason as well.
And then there's this weird normalization function going on where
These investors are thinking ahead as they build these data centers. Okay, at some point, this is gonna hit our income statement. Today, it's all investment on our balance sheet, but all this stuff gets depreciated over time. So it's an expense on future income statements. So if I overbuild and the demand isn't there, then that's gonna be a drag on my earnings, because I won't have the AI revenue, but I'll have all the expenses that are associated with the depreciation of building all this stuff.
So these people have a much better view into what the future looks like than your average Joe. And they balance these competing objectives every day in committing this capital. All in all, I think we should assume that they're mostly directionally correct. And these hold my beer moments are just about saying, you know, I think the demand is now this much more than it was this time last year, so we're going to commit that much more.
You use the term fellow titans. And I think typically when we think about outspending the fellow titans, we think of these American companies, these massive hyperscalers. Earlier this week, we had another potential titan enter the ring. I'm referring to DeepSeek, for those unfamiliar. Does this DeepSeek story, which, you know, we've talked about earlier this week on the show, and Dylan's going to hit on more tomorrow during the Friday radio show, but broadly speaking, does this DeepSeek story change how investors ought to be looking at this just
truly massive, monumental AI spending by American tech companies? I think it does. I think that DeepSeek is maybe on its way to being a titan. Potentially, that's one outcome for them. So I don't think it's wrong for you to label them as such. They certainly have burst onto the scene with a lot of muscle in artificial intelligence. But they're more to me like a scrappy dreamer type of company.
So I spent a lot of time early this week reading the paper that they published on their model. And it's really elegant how they've designed their model to be more efficient than competing models. They totally removed some of the processes of training their model that we take for granted. And when I say we, I don't mean here in the United States or Europe, I mean,
globally, even their Chinese peers, Indian peers, anyone who is building this stuff. Some steps that people took for granted should be there, they removed, and let the model figure things out for itself. And the results were sort of surprising. It reminds me a lot of the early days of AlphaZero. That's Google DeepMind's program that taught itself to play chess in four hours and then had novel approaches to the game. So I think there's a lesson here in
the idea that software and algorithms can still play a big role in giving us efficiency in computation. So that's one takeaway we should all be aware of. The game ain't over yet on the development side of these models. Someone could build a model that's twice as efficient as what DeepSeek is offering us.
And that perhaps further signals that somebody down the road is going to get hurt by all this build-out on their income statement in the future. So many fingers were pointed at Nvidia, and I think that's a good place to start, because their whole next several years of business is predicated in part on the demand for really, really high-end chips. I still think Nvidia has so many ways to win, so you shouldn't write them off, but there is a challenge there, at least on that side.
Yeah, the game ain't over yet. In fact, many would argue that it's probably just begun. Let's focus in on Meta earnings in particular. Zuckerberg on the earnings call predicted that this is going to be "the year when a highly intelligent and personalized AI assistant reaches more than a billion people, and I expect Meta AI to be that leading AI assistant." That's a quote from him on the earnings call. He points to the fact that Meta AI is already used by more people than any other assistant.
This is kind of what we got wrapped up talking about earlier, Asit, before we started recording. Is this a bit of an apples and oranges comparison? Because in my mind, there's a massive difference between using Meta AI because I am already on Instagram versus actively seeking out ChatGPT for a specific request.
Yeah, it is apples and oranges to me, Mary. One thing that's different is that you have to physically go to ChatGPT to ask it a question. It's this resource that sits out there. Now they're developing ways to get right into your browser. They want to potentially replace Google as a search function. So for now,
we should just think of them as this thing that could merge into other stuff we use so we can ask it a question. Whereas Meta AI exists on different platforms within Facebook's family of apps. So it's there. What
Meta wants you to do is to use Meta AI, the AI assistant, to answer your questions contextually when you're inside an app, and get so used to that that you start going directly to the app when you have a question. Why is that good for them? Because you're on the app longer. You're on Facebook longer. You're on Instagram longer. You're on WhatsApp longer. So they have a vested interest. But perhaps the comparison is unfair because ChatGPT started from nothing.
Whereas Meta has this ecosystem that already encompasses billions of users. So for them to roll out their assistant, yeah, fair note, but if you're Mark Zuckerberg, it's logical to present this to the investment community, and in his eyes it's a fair comparison.
Another Zuckerberg prediction we got is that 2025 will be the year of the AI glasses. Zuckerberg said, and I thought that this was interesting, "This will be a defining year that determines if we're on a path towards hundreds of millions and eventually billions of AI glasses, and glasses being the next computing platform."
He kind of pauses and then he goes on, "or if they're going to be a longer grind." So on the one hand, he starts off sounding so definitive that this is in fact the year of the AI glasses, and then kind of says, or it might still happen, but it might take longer than we think. So is there an option here? Are we getting glasses, whether we like them or not? Well, I mean, part of our 22 minutes before the show was just talking about the differences in the Meta glasses that we're all using.
Just kidding. I don't own these glasses. You don't either. I don't think our sample size is a great sample. But if you extrapolate, for those who are listening, think of all the people you know who have Meta glasses. I'm betting that the population isn't that great. And what he's getting at, I think he mentioned in the call, is that when you hit numbers of 5 million to 10 million of a given consumer product, that's when you're sort of
off to the races. That's when you have acceleration. And I think there's something like maybe a million of these Meta glasses that have been sold cumulatively, which is not a bad number. We're starting to see a little bit of revenue booked by Reality Labs from the sale of these glasses. So what he's trying to say is, again, this is a signal to the investment community that, look,
if we don't take off this year, don't blame us. But I'm telling you, we're further along than you might have thought. And if we do gear up and can have a sightline to 5 or 10 million devices on an annualized basis in the near future, this is going to be a little bit of a game changer, in his humble opinion, in Meta's humble opinion.
Let's stick on this Reality Labs piece for a moment before we move on to Microsoft. Another Zuckerberg prediction in this earnings call was that this will be a, quote, pivotal year for metaverse technology. And fittingly, a couple of days before dropping earnings, Meta announced internally, within their internal Workplace platform, that it would be making Reality Labs a business priority.
We learned yesterday when Meta did drop earnings that Reality Labs brought in its best revenue yet, about $1.08 billion, but it lost $5 billion in the fourth quarter and nearly $18 billion for the year. This is pretty standard. Reality Labs has been a money-losing venture for some time now. But considering the fact that it's now becoming a business priority again in this latest reorg, how much do you expect Reality Labs to contribute to Meta's long-term growth, if all goes as planned?
So let me start with the first part of your question, which is to say, this is simply not true. Come on, come on, folks. How many times have we heard it'll be a pivotal year for the metaverse from Meta? And how many tens of billions of dollars have we seen lit on fire? How many cigar rolls of $100 bills have been set on fire by executives
at Meta in the name of Reality Labs? Of course it's a business priority. Of course it is a huge business priority for this company, for Mark Zuckerberg. Else they wouldn't have plowed in so much money that could have gone to shareholders in the form of dividends or share repurchases if they didn't have a better return on incremental invested capital. So I say no. I say what this is, is just yet another freshen-up for the troops, for stakeholders on Wall Street, for institutional investors, retail investors.
But most importantly, the audience at Meta, which is probably getting tired of so much energy getting taken out of places where the spend could have constructively gone. Yes, Meta has invested in its advertising business, and that's a cash cow, but there's a contingent of investors out there and a contingent of employees who say, what if we had just put some of that money into making our products even better and we didn't have this managerial split focus?
So I think this is messaging. Now, the better and bigger part of your question is, yeah, it's going to be material to their bottom line at some point because there are so many things that Reality Labs has developed, which will find their way into different parts of Meta's product profile in the future. So investment in things like haptics,
being able to just move your fingers and control something on a form factor like a phone or a tablet, which is pretty cool. All the virtual reality learnings they have that maybe have an application as generative AI becomes stronger. Their gen AI prowess
itself, which was formed in part from their learnings from Reality Labs, which is, I think, related to how quickly they were able to open-source their Llama large language model. There's a bunch of stuff that in parts and pieces means something to their bottom line in the future. But are they anywhere close to having some business unit result that's going to be material and directly traceable to this? No, not anytime soon. This is just about rallying the troops.
Let's move on to Microsoft, which also reported earnings yesterday. Rather than focus on AI CapEx and AI spend, because we already kind of hit that at the top of our conversation, we'll focus on the money that Microsoft is actually bringing in from their AI business. They are spending a lot on AI and capital expenditures, make no mistake. But notably, Microsoft's AI business has at this point surpassed an annual run rate of $13 billion; that's up 175% year over year.
Asit, Microsoft's AI business is distinct from its cloud business, and that's an important distinction to make. What is it that this AI business actually entails?
This is everything that Microsoft can trace to their AI proficiency and the AI services they offer. So we've got, maybe first and foremost in investors' minds, that partnership with OpenAI. So everything that comes in the house from that partnership gets counted in this bucket of their AI business. Then you've got AI that's running through different products. I mentioned Copilot before; that's found in Microsoft 365.
GitHub, for example, which is owned by Microsoft, has its own AI assistant. So that revenue gets put in this bucket. And then you've got sort of the infrastructure layer. Think Microsoft Azure. When you ask a question of a model that's served up by Microsoft,
There's a cost for that, and that goes into this bucket, because it's asking that data center to produce output from a model. So whatever they've invested to be able to supply us with answers, to supply enterprises with the model inference that they want, the training they want, all of that also gets included in this bucket. Add those together, and they're saying that's $13 billion a year, up
almost 175% year over year, not bad, I say. Is there any other big tech company that is making that kind of money, that $13 billion annually from their own AI business right now?
There are several businesses that have hit a billion-plus run rate. I think ServiceNow is probably there. Amazon Web Services is probably, absolutely there. Alphabet, too. All the usual suspects are now selling in the billions, I think low billions to mid billions. Maybe the reason why, with admittedly not a huge and deep and detailed search this morning, Microsoft is the only one that I could find, per your question before we taped, that's publishing its number, maybe it's because they're out in front. But I wouldn't be surprised if we hear within the next year some more disclosures from tech giants about the actual run rate of their AI-related revenue. And as we have seen, there's a lot that you can sort of just trace over to that bucket that rests on services these companies were providing before. They just now have
in some ways an AI flavor to them or they bring some of the newer generative AI business into those buckets.
Rather than focus on that money being generated by the AI business, Wall Street really homed in on cloud, on results from the cloud segment and from Azure. Cloud revenue as a whole was up 21% year over year, but the Azure cloud computing platform specifically is proving to be an especially sore spot among Wall Street investors. It grew by, I'm gonna use air quotes here, "only" 31% in the quarter.
That's down from 33% last quarter, and it comes in just under the 32% that analysts claim to have been expecting. If 31% isn't quite hitting the mark, what should long-term investors make of that? It didn't work for Wall Street, but for long-term investors, is that 31% really something to shrug off?
I wish Bernie Sanders were on that platform where you can pay a celebrity to just record something short, his favorite rendering of "the 1%." I wish he could make that "the 2%," and we could play it whenever Wall Street got all in a tizzy because a business division of a hugely successful company missed their estimates, not my estimates, Mary, not your estimates, but their estimates, for
this 31% versus 32%. I don't think it matters at all to the long-term investor. The bigger story here is that Azure, which is Microsoft's cloud infrastructure, the platform services that serve up its AI, grew at a pretty decent rate over the past year. And it makes the case that they should continue investing all that CapEx we talked about at the top of the segment.
Real quick before we wrap up, one other thing that was mentioned on the Microsoft earnings call: there was an analyst that asked about the relationship between OpenAI and Stargate and kind of how things fit in there. In his answer to this question, Nadella referred to the network of data centers Microsoft is building as a "fungible fleet." That phrase stuck out to me because honestly, Asit, I have no idea what it means. What is a fungible fleet? And why does that matter to investors of Microsoft?
A fungible fleet is what you get when you ask an AI to start rapping about Microsoft's prospects. It's one of the phrases you'll hear pretty regularly. What it means in real terms, I think, is that Nadella is referring basically to the data centers and the combination of hardware and software that exists in those data centers to serve up AI. So fungible, we borrow that term from finance. A dollar is fungible, right?
Like it doesn't matter if I give you a dollar and you give me a dollar; the two bills are different, but the value is the same. What he really means by this, and this is sort of, perhaps, ominous if you're Sam Altman, is that Microsoft is pretty committed, both on its hardware stack and its infrastructure software stack, to being flexible for people who want to purchase its services. So if DeepSeek comes with a model,
they're going to offer that in their platform, and they have in fact added DeepSeek now to their model portfolio. If another company comes along and has something that brings more cost-efficient compute than OpenAI can provide, they're going to use that. They're already working internally to reduce the cost of compute. Satya Nadella talked about the scaling law in last night's call. He talked about Moore's law. He talked about all these kinds of accepted maxims of
infrastructure and silicon development, which say the cost of computing gets cheaper as the years go by. This is a really crude way to say it, but that's what he meant. So if you're him, if you're Microsoft, then you don't want to get tied into a certain hardware vendor for GPUs. Hello, NVIDIA. You don't want to get tied into a certain large language model. Hello, Sam Altman and OpenAI. You want your fleet
that fleet of data centers, that fleet of infrastructure to be fungible. Let it get switched out. Let the best solution come, we're gonna provide it. That's exactly what he meant. And I found it pretty interesting, except for this sort of lame term. You and I were cracking on Stargate last week. And this just reminded me of that, Mary. Like, why is everyone using these like weird 1970s style terms to talk about the future?
Asit, I can always count on you to wrap us up with a discussion about words, why they matter, why a certain phrase matters. They do matter; words matter. Asit Sharma, always a pleasure to have you on. Thanks so much for the time and for dissecting the word fungible. Thanks for letting me come on and dissect words with you, Mary.
Big tech earnings in particular bring a lot of stuff to break down, so we wanted to spend a bit more time covering today's news. Because of that, we're skipping our usual second segment today. That's all for now, Fools. Thanks for listening. For Asit Sharma, I'm Mary Long. We'll see you tomorrow.
As always, people on the program may have interest in the stocks they talk about, and the Motley Fool may have formal recommendations for or against, so don't buy or sell stocks based solely on what you hear. All personal finance content follows Motley Fool editorial standards and is not approved by advertisers. The Motley Fool only picks products that it would personally recommend to friends like you.