  • Welcome back to Morning Brief, brought to you by Invesco.

  • Now, the battle of the semiconductor space heating up with Intel, NVIDIA, and AMD all announcing new next-generation AI chips.

  • But we're going to zoom in on NVIDIA.

  • It's already been a huge year for the stock, obviously.

  • Shares are up over 130% so far in 2024, surpassing that $1,000 milestone for the first time a few weeks ago, and just this morning hitting another record high.

  • But it hasn't just been this year, right?

  • NVIDIA's dominance, sending shares skyrocketing over 500% in the last two years.

  • Take a look at that extreme climb there and that coming as investors double down on AI hype.

  • All of NVIDIA's momentum leading to a lot of bullishness across Wall Street, Bank of America becoming the latest, hiking their price target on NVIDIA this week from $1,320 to $1,500.

  • This is now the highest call on Wall Street for the chip giant.

  • The analyst behind the call, Vivek Arya, is Bank of America's senior semiconductor analyst, and he joins us now.

  • Vivek, thank you so much for being here with us to talk about your bullish call on NVIDIA.

  • I'm curious, why $1,500?

  • Why not doubling your price target on this name considering the amount of bullishness and the fundamentals that back it up?

  • Sure.

  • Good morning.

  • Thank you for having me.

  • So I think there are two aspects to it.

  • The first is where we are in the cycle of converting data centers to accelerated computing.

  • The concept of the data center is not new.

  • It's been around since the 1940s.

  • But then we have these multi-decade infrastructure upgrade cycles from the mainframe to the microcomputer to cloud computing.

  • And now we are at the start of what I think would be a decade-long conversion over to accelerated computing.

  • What is accelerated computing?

  • It's a way of taking a lot of very intensive data processing tasks that have to do with images, with voice, with video, with the processing of these large language models, and converting the data center with new kinds of semiconductors to make sure that processing can be done very efficiently.

  • We are just at the start of this cycle.

  • We think that the spending could be anywhere between $250 billion to $500 billion a year, and NVIDIA is at the center of it.

  • And I know there is often this comparison to, let's compare it to Intel of the past, or let's compare it to Cisco of the past, or other parts of the technology upgrade cycle.

  • But what is different is that NVIDIA is bringing a combination of silicon, of systems, of networking, of software, of developers, and that is unprecedented.

  • So to come to your question, technology stocks, the price targets really move because of the size of the market, because of their execution, and even at this $1,500, the stock would essentially be trading at one times their earnings growth rate.

  • If you look at the S&P 500, it's trading at two times its earnings growth rate.

  • So I would claim that even at this kind of price level, it's actually trading below where the broader market is in terms of the price-to-earnings-to-growth ratio.
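The valuation argument here rests on the price/earnings-to-growth (PEG) ratio. A minimal sketch of that arithmetic follows; the dollar and growth figures are purely illustrative and are not Arya's actual estimates.

```python
def peg_ratio(price: float, eps: float, eps_growth_pct: float) -> float:
    """PEG: the P/E multiple divided by the expected earnings growth rate (in %)."""
    pe = price / eps
    return pe / eps_growth_pct

# Illustrative only: a stock at $1,500 earning $37.50/share, growing earnings 40%/yr.
# P/E = 40 and growth = 40%, so PEG = 1.0 -- "one times the earnings growth rate".
peg = peg_ratio(price=1500.0, eps=37.50, eps_growth_pct=40.0)
```

A PEG of 1 against a broader market at 2 is the sense in which the stock would trade "below where the broader market is" on this measure even at the higher target.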

  • That's the high target on the Street for you, Vivek, on NVIDIA.

  • What is the significance, in your view, of deals like the one being reported this morning, with xAI potentially taking the chips that Tesla was supposed to be getting from NVIDIA?

  • Yeah, I think there is a race going on between technology companies of various types.

  • We have seen that across the cloud players, who have raised their capital spending forecasts to over 40% to 45% growth.

  • Last year, we were thinking that growth would be 20%.

  • It's already 40% to 45%.

  • There is a race between these different technology companies to be the first to support the next large language model, to make sure that they are the destination of choice for a lot of startups who are trying to really kickstart their own generative AI applications.

  • And then this market is broadening from these cloud service providers to a lot of enterprise verticals.

  • And in this most recent quarter, NVIDIA did call out automotive, and automotive customers, which I assume includes Tesla, as part of that really large vertical.

  • So we see the spreading from cloud customers towards automotive, towards healthcare, a lot of drug discoveries being done with this infrastructure.

  • Financial services are benefiting from it.

  • And then the last but not least part that we will start to see more of is the rise of this infrastructure amongst sovereigns.

  • So we have seen India, Europe, Japan, every country and region, the Middle East, they are trying to make sure that they're able to use their culture, their language to really go and train these large language models.

  • And we think we are just at the start of that, and it could be a really large infrastructure upgrade cycle as well.

  • Vivek, you mentioned customers and it makes me think about Intel's latest chip announcement.

  • And the executives really speaking about this idea that their chips are not necessarily for the hyperscalers.

  • They're for more of the enterprise customers.

  • They're going to be more affordable.

  • To what extent do you see that as a winning strategy for Intel moving forward, particularly given the amount of headwinds facing that stock over the course of the past year here?

  • Sure, absolutely.

  • I think we have to give Intel the credit that they have very strong enterprise incumbency.

  • They are the brand name, one of the most trusted names in enterprise infrastructure.

  • But broadly speaking, Intel is facing this challenge where they have been behind in AI.

  • They have the incumbency, but they don't quite have the deployed base, the range of software developers.

  • For example, NVIDIA has 5 million developers on their CUDA platform.

  • So a lot of the enterprises, as they start to look towards their AI infrastructure, the first name they turn to is NVIDIA, for the reason that there are a lot of developers who are already familiar with NVIDIA.

  • And the cloud, too, is where they find a lot of NVIDIA-based instances.

  • So broadly speaking, if I look at the overall AI chip market today, we think it's about $100 billion or so this year.

  • NVIDIA is about 80% of that.

  • Custom chips from Broadcom, Marvell, and others are another 10 or 15%.

  • And then you have this long tail, which has AMD, which has Intel, which has a number of startups.

  • And even as I look forward over the next three to five years, as this market doubles or triples from here, we think that ratio will stay relatively in that range, that NVIDIA will continue to have 80%.

  • We think custom chips will have another 10 or 15%.

  • And then you will continue to have this long tail of companies that have a presence, they benefit from the rising tide in this market, but they're never quite able to achieve the escape velocity that the leader in the market has.
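The market-share arithmetic above can be sketched as follows. The split (roughly 80% NVIDIA, 10-15% custom silicon, the remainder a long tail) and the $100 billion total are Arya's rough figures; the midpoint chosen for the custom-chip share is an assumption for illustration.

```python
def segment_dollars(market_bn: float, shares: dict) -> dict:
    """Split a total market size (in $bn) across segments by fractional share."""
    return {name: market_bn * frac for name, frac in shares.items()}

# Arya's rough 2024 split: ~$100bn total, ~80% NVIDIA, ~10-15% custom chips
# (midpoint 12.5% assumed here), remainder the long tail of AMD, Intel, startups.
split = {"NVIDIA": 0.80, "custom": 0.125, "long tail": 0.075}
today = segment_dollars(100.0, split)

# If the market triples over 3-5 years and the ratios hold, as he suggests,
# every segment's dollars scale linearly with the total.
in_3_to_5_years = segment_dollars(300.0, split)
```

The point of the sketch: even a static share means the long tail's revenue grows with the rising tide, while NVIDIA's absolute lead widens.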

  • And by the way, that is not too uncommon.

  • If you look at other parts of technology, search, social, e-commerce, operating systems, you see the leader that has the early start, that has the scale, the incumbency, the relationship with developers claiming 80 plus percent of the market.

  • And we see exactly the same thing in the AI chip market as well.

  • So what would change that dynamic moving forward, Vivek?

  • And I would love to get your answer on that in the context of some of the cloud providers that are trying to compete with NVIDIA.

  • I was just in Amazon's chips lab, AWS's chips lab last week, taking a look at some chips that they have coming out here.

  • And I'm curious what you think would make some of the cloud players like an Amazon a formidable opponent to NVIDIA?

  • No, absolutely.

  • I think we have to look at the cloud service providers as in some way competing with NVIDIA, but even more so being very strong customers for the company.

  • Developing chips and developing cloud infrastructure are two different skill sets.

  • So if I'm developing a custom chip in a cloud, only that cloud can use it.

  • It's not going to scale.

  • Any chip that Amazon develops is not going to be used by Google.

  • It's not going to be used by Microsoft.

  • It's not going to be used by Oracle or Meta.

  • So in some ways, it is serving a very specific need that Amazon might have.

  • Similarly, the chips that Google develops, the TPUs, they serve very specific needs.

  • What NVIDIA has developed is a so-called merchant reconfigurable programmable infrastructure that can be deployed across a very wide range of workloads, both for when these cloud companies are handling internal workloads, such as search, such as e-commerce, such as YouTube or TikTok-like video recommender engines, or when it is serving a lot of enterprise customers through their public cloud, which is why I mentioned that if you take a step back and look at the broader accelerated computing market, I do think that 10 to 15 percent will be these custom chips, but they will have several different custom chips across each of these specific cloud infrastructures.

  • But because they don't quite have a way to go from one cloud to another cloud, their range of workloads, their applicability, their scale tends to be limited to just that cloud infrastructure.
