NVIDIA Corporation (NVDA) on Q4 2021 Results - Earnings Call Transcript
Operator: Good afternoon. My name is Mariama, and I will be your conference operator today. At this time, I would like to welcome everyone to NVIDIA's Financial Results Conference Call. All lines have been placed on mute to prevent any background noise. After the speakers' remarks, there will be a question-and-answer session. Thank you. I will now turn the call over to Simona Jankowski, NVIDIA's Vice President of Investor Relations and Strategic Finance to begin the conference.
Simona Jankowski: Thank you. Good afternoon, everyone, and welcome to NVIDIA's conference call for the fourth quarter of fiscal 2021. With me on the call today from NVIDIA are Jensen Huang, President and Chief Executive Officer; and Colette Kress, Executive Vice President and Chief Financial Officer. I'd like to remind you that our call is being webcast live on NVIDIA's Investor Relations website. The webcast will be available for replay until the conference call to discuss our financial results for the first quarter of fiscal 2022. The content of today's call is NVIDIA's property. It can't be reproduced or transcribed without our prior written consent. During this call, we may make forward-looking statements based on current expectations. These are subject to a number of significant risks and uncertainties, and our actual results may differ materially. For a discussion of factors that could affect our future financial results and business, please refer to the disclosure in today's earnings release, our most recent forms 10-K and 10-Q and the reports that we may file on Form 8-K with the Securities and Exchange Commission. All our statements are made as of today, February 24, 2021, based on information currently available to us. Except as required by law, we assume no obligation to update any such statements. During this call, we will discuss non-GAAP financial measures. You can find a reconciliation of these non-GAAP financial measures to GAAP financial measures in our CFO commentary, which is posted on our website. With that, let me turn the call over to Colette.
Colette Kress: Thanks, Simona. Q4 was another record quarter with revenue exceeding $5 billion and year-on-year growth accelerating to 61%. Full-year revenue was also a record at $16.7 billion, up 53%. Our gaming business reached record revenue of $2.5 billion in Q4, up 10% sequentially and up 67% from a year earlier. Full-year gaming revenue was a record $7.8 billion, up 41%. Demand is incredible for our new GeForce RTX 30 Series products based on the NVIDIA Ampere GPU architecture. In early December, we launched the GeForce RTX 3060 Ti, which joined the previously launched RTX 3090, 3080 and 3070. The entire 30 Series lineup has been hard to keep in stock, and we exited Q4 with channel inventories even lower than when we started. Although we are increasing supply, channel inventories will likely remain low throughout Q1. GeForce RTX 30 Series graphics cards were a holiday sensation due not just to their amazing performance, but also to their rich features, including our second-generation RTX ray-tracing technology and DLSS, our AI-powered performance accelerator, which massively boosts frame rates in graphically demanding titles.
Operator: Your first question comes from the line of C.J. Muse with Evercore ISI. Your line is open.
C.J. Muse: Good afternoon. Thank you for taking the question. I guess, Jensen, a higher-level question for you on the enterprise side. You're now a couple of quarters into the ramp of A100, and I'm curious if you could speak to whether you've seen any surprises here, any areas of specific strength worth calling out, and any changes to how you're thinking about the size of this opportunity.
Jensen Huang: Yes. Thanks a lot. It's Jensen. As you know, A100 is a very different type of GPU. This is our first universal computing GPU. It's great at high-performance computing. It's great at data analytics. It's great at training. And, for the first time for our highest-end GPU, it's also incredible at inference — it's some 20 times faster than the previous generation. With it we introduced some really exciting new computational formats, like TF32, TensorFloat-32, for training. And with Multi-Instance GPU, we can turn one GPU into a whole bunch of smaller, independent GPUs to improve utilization and reduce latency. And so the capability is really quite exciting. We're seeing strength in hyperscalers as they continue to accelerate their adoption of AI. Some of the new applications we've spoken about a couple of times before: the transition to deep learning for conversational AI — from speech recognition to natural language understanding all the way to speech synthesis, which is now based on deep learning. The other area that's growing incredibly fast is deep learning recommender models. Just about everything that you do on the internet is based on recommenders. There are hundreds of different recommenders out there, whether you're shopping, or they're recommending music, recommending news, recommending search results or recommending ads. All of these different types of applications are driving that. For the first time, we saw our industry data center business grow to be larger than hyperscale. And we're seeing industrial applications across scientific computing, where simulation-based approaches are now being fused with AI approaches for weather simulation, genomics, molecular dynamics simulation, quantum chemistry, even simulating quantum computing, which is one of the really exciting areas. We're seeing AI being deployed for big data analytics: RAPIDS, the open-source data analytics platform NVIDIA created, and Spark 3.0, which NVIDIA helped lead and which is now GPU-accelerated. So now you can do big data analytics in the cloud on all of the CSP platforms. We're seeing a lot of excitement around financial services, and consumer internet services are all growing really nicely. And so A100 adoption is just starting. We're going to see a couple of years of continued growth ahead of us as AI gets adopted in clouds and industries.
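To make the TF32 point concrete, here is a minimal sketch (an illustration assuming PyTorch 1.7+ on an Ampere-class GPU such as A100, not code referenced on the call) showing how TF32 is enabled at the framework level so ordinary FP32 matrix math can run on Tensor Cores without changing the model:

```python
# Minimal TF32 sketch (assumes PyTorch >= 1.7 and CUDA; falls back to CPU otherwise).
# TF32 keeps FP32's dynamic range with a reduced mantissa, letting Ampere Tensor Cores
# accelerate FP32-style matmuls and convolutions with no changes to model code.
import torch

torch.backends.cuda.matmul.allow_tf32 = True   # allow matmuls to use TF32 Tensor Cores
torch.backends.cudnn.allow_tf32 = True         # allow cuDNN convolutions to use TF32

device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b                                      # eligible for TF32 acceleration on A100
print(c.shape)
```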
Operator: Your next question comes from the line of Vivek Arya with BofA Securities. Your line is open.
Vivek Arya: Thanks for taking my question. Just a clarification and then a question for Jensen. On the clarification, Colette, I was hoping you could give a little more color around Q1. Do you still expect data center to grow sequentially in Q1? I know you said that most of the growth will come from gaming, but any color on data center would be useful. And then, Jensen, the question for you is, in your press release you used the phrase AI driving the smartphone moment for every industry. Could you help us quantify what that means? And where I'm going with that is, is there a number in terms of what percentage of servers are shipping today with your accelerators, and where can that ratio go over time? Is that a fair way of looking at the adoption of your technology and AI?
Colette Kress: So thank you, Vivek. Your question regarding the guidance as we head into Q1: we had indicated that, yes, a good percentage of our growth between Q4 and Q1 will come from gaming, but we also do expect data center to grow. Most of our sequential growth will come from gaming, but keep in mind that we also expect all of our market platforms will likely be able to grow quarter-over-quarter.
Jensen Huang: Because we are entering the third phase of AI. The first phase of AI was when we invented the computing platforms — the new chips, the new systems, the new system software, the new middleware, the new way of working, the new way of developing software, which the industry, the world, is now starting to call MLOps. The way that software is developed and deployed is completely different than in the past. In fact, I heard a great term, Software 2.0, and it makes a lot of sense: it's a computer that is writing software. The way that you develop software is completely different; the way you compute is different. That was our first phase, and it started a journey some eight, nine years ago now. The second phase was the adoption of this in an industrial way for clouds, and we saw it revolutionize new services — whether it's speech-oriented services, search-oriented services or recommender services — the way you shop, the way you use the internet is completely different today. That's really the second phase, and those two phases are still continuing to grow; you're still seeing the growth associated with that. The third phase is the industrialization of AI. When I say smartphone moment, I mean a device with AI that is autonomous, connected to a cloud service and continuously learning. Some of the exciting examples I've seen — we're working with companies all over the world, some 7,000 AI startups, and almost all of them are developing something like this, and large industrial companies, whether it's John Deere or Walmart, are all developing applications like this. Basically it's an autonomous system, an autonomous machine. In our case it's called Jetson — it's a robotics machine. When that robotics machine is a car, it's called DRIVE. And it's running an AI application on top, and that AI could be moving things around, it could be picking and placing, it could be watching a warehouse and monitoring traffic and keeping traffic flowing, it could be connected to a car. And whenever the fleet of cars needs to be retrained because of a new circumstance that was discovered, the cloud service does the relearning and then deploys it into all of the autonomous devices. So in the future, in these industries — whether you're in retail or logistics or transportation or farming, ag tech all the way to consumer lawnmowers — products are not going to just be things that you buy and use from that point forward; they will likely be connected devices with an AI service that runs on top. These industries are ones I'm so excited about, because it gives them an opportunity to change the way they interact with their customers. Rather than selling something once, they sell something and provide a service on top of it, and they can stay engaged with the customers. The customers get a product that's improving all of the time, just like your smartphone. That's the reason why I've been calling it a smartphone moment for all these industries. And we saw what happened in the smartphone revolution.
Then we saw what happened with the smart speaker revolution, and you're going to see smart lawnmowers, smart tractors, smart air conditioners, smart elevators, smart buildings, smart warehouses, robotic retail stores — the entire retail store is like a robot. They will all have autonomous capabilities driven by AI. So what's new for these industries is this: all of the enterprises in the world used to have computers for IT, to facilitate their employees and their supply chains. But in the future, all of these industries, whether in medical imaging or lawnmowers, are going to have data centers that host their products, just like the CSPs do. And so that's a brand new industry, and we have a platform that we call EGX, which is the 5G edge AI system, and we have the autonomous systems we call AGX, which is what goes into Jetson and DRIVE. Between those two systems and the software stack that we have on top of them, we're in a great position to help these industries, one at a time, transform their business model from a thing-based business model to a connected-device business model.
Operator: Your next question comes from the line of Stacy Rasgon with Bernstein Research. Your line is open.
Stacy Rasgon: Hi guys. Thanks for taking my question. First, I don't want to be pedantic, I suppose, but on the Q1 guide, you say that gaming is the majority of the growth. Was that an absolute statement or a percentage statement? Can you give us some idea of how you'd rank the sequential percentage growth of, say, gaming versus data center versus other, especially since it sounds like you've got $50 million of crypto-specific stuff that will go into the OEM and other line? And then, just briefly, could you give us some indication of where your supply situation and lead times are on your Ampere parts within data center? I think you said last quarter they were many months out, six months plus — are they still looking like that? And is that sort of a limiting factor at this point in terms of what you can actually ship on the compute side in data center?
Jensen Huang: Colette, you take the first one and I'll take the other.
Colette Kress: Sure. Let me start off, Stacy, in terms of our guidance for Q1. As you know, we're still in the early innings of our Ampere architecture, both as it relates to gaming and as it relates to data center. As we articulated on our call, we have been seeing continued uplift in the adoption of A100, and it's going more smoothly than what we had seen with prior versions. So when we think about our guidance for Q1, there are many different outcomes that could happen by the end of the quarter, but we believe all of our platforms can grow. The majority of the growth from Q4 to Q1, though, will likely be gaming.
Jensen Huang: Thanks, Colette. You asked the question about lead times. At the company level, we are supply constrained — our demand is greater than our supply. However, for data center, so long as the customers work closely with us and we do a good job planning between our companies, there shouldn't be a supply issue for data center. We just have to do a good job planning. We have direct relationships with each one of the CSPs and with all of the OEMs, and we can do excellent planning between us, so we shouldn't be supply constrained there. But at the company level, we're supply constrained; demand is greater than supply. That said, we have enough supply to achieve and even exceed the outlook. We had that situation in Q4, we expect that situation in Q1, and we have enough supply to grow through the year. But supply remains constrained and demand is really, really great, so we just have to do a really good job planning. Meanwhile, one of the things that really came through for us is that we have the world's best operations team. Our company really has an amazing operations team. We build the most complex products in the world — the most complex chips, the most complex packages, the most complex systems. During Q4 they improved our cycle time, and during Q1 we expect them to improve our cycle time again. We really are blessed to have such an amazing operations team, and during times like these it really comes in handy. But overall, at the company level, where we expect demand to be greater than supply, we have enough supply to deliver the outlook and enough supply to grow each quarter throughout the year.
Operator: Your next question comes from the line of Timothy Arcuri with UBS. Your line is open.
Timothy Arcuri: Hi, thanks. I had a question on crypto. I guess, Jensen, I know that the CMP stuff and the software driver stuff that you're doing for the 3060 is going to help a lot, but I think there are like four or five of the big currencies that are moving, or at least on a path, from proof of work to proof of stake, which is going to be a lot less compute intensive. So I guess the question that I get a lot is how do you assess the degree to which that drives GPUs back into the secondary market? Is there any way to get kind of a handle on that? Thanks.
Jensen Huang: Yes. If you look at the recent hash rates, first of all, the transition is going to take some time. It can't happen overnight, and people have to build trust in all of the new versions, so it'll take a little bit of time. But I hope it does — I hope the move to proof of stake happens over time, so that some of these questions don't have to be answered. However, I don't have the optimism that it will all be proof of stake either. I think that proof of work is a very legitimate way of securing a currency, and in the beginning, while any currency is building its reputation, it takes something like proof of work to do so. So I think proof of work is going to be around for a bit. We developed CMP for this very reason — just so that there are different versions. We have different versions of our products for gaming, for professional visualization, for high-performance computing, for deep learning. It stands to reason that we would do a different version for CMP, and we can sell it directly. The way we go to market is to go directly to the industrial miners, and it's a great benefit to them — they don't have to chase around spot markets. It's a great benefit to the gamers, because they want to game, and gaming demand is just off the charts. So this is really beneficial to everybody. The recent hash rate growth was really the result of several dynamics. The first dynamic is the installed base. Most people thought that when mining slows, the GPUs come back into the aftermarket — a small portion does, some people do that, but the vast majority don't. And the reason for that is because obviously they believe in Ethereum, and they're industrial miners — that's what they do. So they keep the gear around for when profitability returns and they can kick-start their mining equipment. That's what we saw in the latter part of last year: we saw the hash rates starting to grow, and most of that was a result of installed miners reactivating their equipment. It wasn't until earlier this year that we started to see demand for our own GPUs. And when that starts to happen, there are some different dynamics. The primary source of hash rate these days comes from powerful ASICs, and then there's some that comes from our GPUs and other GPUs in the marketplace. So I think this is going to be a part of our business, but it won't grow extremely large no matter what happens. The reason is that when it starts to grow large, more ASICs come into the market, which kind of limits it; and when the market becomes smaller, it's harder for ASICs to sustain the R&D, so the spot miners, the industrial miners, come back — and then we'll create CMPs. So we expect that to be a small part of our business as we go forward. Now, one of the important things to realize is that in the near term, we're in the beginning part of our Ampere ramp — only two quarters into a multi-year cycle. This is also the first time that we've completely changed computer graphics. RTX using ray tracing is completely different than rasterization, so this is a fundamental change in the way we do computer graphics, and the results have been spectacular. There is an installed base of some 200 million desktops and 50 million laptops, and only approximately — I think it's something like 50% — of the installed base has been upgraded to RTX.
So there is a giant installed base, and the installed base is growing, that we need to upgrade to the next generation of computer graphics.
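For context on why a shift from proof of work to proof of stake matters for GPU demand, here is a generic, simplified proof-of-work loop (an illustration only, not any specific currency's algorithm): mining is a brute-force search for a nonce whose hash clears a difficulty target, and that search is exactly the compute that proof of stake removes.

```python
# Simplified proof-of-work sketch (illustrative; difficulty kept low so it runs quickly).
import hashlib

def mine(block_data: bytes, difficulty_bits: int = 18) -> int:
    """Search for a nonce whose SHA-256 hash falls below the difficulty target."""
    target = 2 ** (256 - difficulty_bits)          # smaller target => harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce                           # valid proof of work found
        nonce += 1

print(mine(b"example block header"))
```

Proof of stake replaces this hash search with stake-weighted validator selection, which is why it needs far less parallel compute — and why a transition would reduce mining demand for GPUs.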
Operator: Your next question comes from the line of John Pitzer with Credit Suisse. Your line is open.
John Pitzer: Yes, guys, thanks for letting me ask a question. I want to go back to data center. You've been very kind over the last couple of quarters to call out Mellanox, both when it was a positive driver and when it was a headwind. I'm kind of curious, when you look into the fiscal first quarter, is there anything of distinction to mention around Mellanox versus core data center? And I guess as a follow-on, the key metric that a lot of investors are looking at is when the core data center business's year-over-year growth starts to reaccelerate. Some of that is just simple math, where you're comping very hard compares from last year. But Jensen, how would you think about data center year-over-year growth in the context of a reopening trade or any sort of new applications out there? What helped last time around was the move to natural language AI. Is there another big AI application we should be thinking about as we think about data center growth reaccelerating?
Jensen Huang: Mellanox was down this last quarter, and our compute business grew double digits, which more than offset the decline in Mellanox. We expect Q1 to be a growth quarter for Mellanox, and we expect this coming year to be quite an exciting year of growth for Mellanox. The business is growing: Ethernet is growing for CSPs, InfiniBand is growing for high-performance computing, and the switch business grew 50% year-over-year. So we're seeing really terrific growth there. One of the new initiatives — where we're going to see success towards the second half, because the number of adoptions and engagements has grown — is our BlueField DPUs. They're used for virtualization for hyperscalers, and they're also used for security. As you know quite well, the future of computing is cloud, and it's multi-tenant cloud, and there's no VPN front door to the cloud. You've got millions of people who are using every aspect of computing, so you need distributed firewalls — you can't have security in just one place. The intense focus on security across all of the data centers around the world is really creating a great condition for BlueField, which is really perfect for it. And so I expect our Mellanox networking business to grow very nicely this year, and we expect Q1 to be a great growth quarter for compute as well as Mellanox. The killer driving applications for AI are several. Last year, you're absolutely right, it was natural language understanding and the transformer model, which was at the core of it, and other models like that, that really made it possible for us to enable all kinds of new applications. So you're going to see natural language understanding do text completion, and it's going to be integrated — I think it was just announced today that it's going to be integrated into Microsoft Word, and we've been working with them on that for some time. So there are some really exciting applications out there, but the new ones that emerged recently are deep learning-based conversational AI, where the ASR, the speech recognition, as well as the speech synthesis are now based on deep learning — they weren't before. They were based on models that ran on CPUs, but now with these deep learning models, the accuracy is much, much higher, and they have the ability to mimic your voice and be a lot more natural. These models are much more complex and much larger. The other huge driver is recommenders. This is something really worthwhile to take a look at — they're called deep learning recommender models. Recommenders — whether for shopping, personalizing websites, personalizing your store, recommending your basket, recommending your music — have historically used traditional machine learning algorithms. But because of the accuracy, and just the extraordinary economic impact that comes from an incremental 1% in accuracy for most of the world's large internet businesses, people are moving very rapidly to deep learning-based models. And these models are gigantic — they're utterly gigantic. This is an area that is really driving high-performance computing, and so I expect us to see a lot of momentum there.
And the last one is the one that I just spoke about, which has to do with industrial 5G and edge IoT-type applications for all of the different industries, whether it's retail or logistics or transportation, agriculture, warehouses or factories. So we're going to see AI and robotics in a very large number of applications and industries, and we're just seeing so much excitement there.
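As a rough illustration of the deep learning recommender models described above, here is a minimal two-tower-style sketch (my illustration, not NVIDIA's production architecture; the class name and table sizes are invented for the example). User and item IDs map to learned embeddings that are scored with a dot product; production systems scale these embedding tables to enormous sizes, which is what drives the GPU demand discussed on the call.

```python
# Minimal two-tower recommender sketch in PyTorch (illustrative sizes only).
import torch
import torch.nn as nn

class TwoTowerRecommender(nn.Module):
    def __init__(self, n_users: int, n_items: int, dim: int = 64):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)   # learned user representations
        self.item_emb = nn.Embedding(n_items, dim)   # learned item representations

    def forward(self, user_ids: torch.Tensor, item_ids: torch.Tensor) -> torch.Tensor:
        u = self.user_emb(user_ids)                  # (batch, dim)
        v = self.item_emb(item_ids)                  # (batch, dim)
        return (u * v).sum(dim=-1)                   # relevance score per user-item pair

model = TwoTowerRecommender(n_users=1_000, n_items=5_000)
scores = model(torch.tensor([1, 2, 3]), torch.tensor([10, 20, 30]))
print(scores.shape)  # torch.Size([3])
```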
Operator: Your next question comes from the line of Aaron Rakers with Wells Fargo. Your line is open.
Aaron Rakers: Yes. Thanks for taking the questions. I wanted to go back again to the data center business. Jensen, you just mentioned that the BlueField-2 product is poised to ramp and materialize in the back half of the calendar year. How do you see that — is it an attach rate? I think there have been discussions in the past that all servers could potentially, over time, incorporate this layer of acceleration. How quickly should we think about that ramp? And then the second question is, can you talk at a high level about how you're thinking about a CPU strategy in the context of the broader data center market?
Jensen Huang: Sure. If I could just work backwards: I believe that every single data center node will be outfitted with a DPU someday, and that someday is probably, call it, five years from now. The fundamental driver of it is going to be security. Every single application in the data center and every single node in the data center has to be individually secured — zero-trust computing, or confidential computing. These initiatives are going to cause every data center to have every single application and every single node secured, which means every one of those computers has to have a control plane that is isolated from the application plane. And the applications cannot share the same resources, because an application could be malware, an application could be an intruder. No application can have access to the control plane. Yet today, in software-defined data centers — software-defined networking, software-defined storage — all of the security agents are running on the same processors as the applications, and that has to change. You're seeing the CSPs of the world moving in this direction, and every single data center will have to move in that direction. So every node will have a DPU processing the software for the infrastructure. You're essentially going to see the data center infrastructure offloaded from the application plane, and it will be something like a BlueField. So I think this is our next multi-billion dollar opportunity. As for CPUs, we support every CPU in the world, and we're the only accelerated computing platform that accelerates every CPU. Ironically, the only CPU that we don't accelerate for AI is ARM, but we want to change that. ARM has such an exciting future, because the nature of their business model and the nature of their architecture is perfect for the future of hyperscalers and data centers. You want the most energy efficiency in every single data center, because every data center is power constrained — we are going to be power constrained in every aspect of computing going forward. And so we would love to build around the ARM processor and invest in building a great ecosystem around it, so that all the world's peripherals and all the world's applications can work on it, just as they do on any one of the CPUs we know today. We're going to start with high-performance computing and with all the areas where we have a lot of expertise in order to build out our platform. You're starting to see one industry leader after another embrace ARM, and I think that's terrific, but now we've got to energize it with all of the ecosystem support. It can't just be in vertical applications — we want to create a broad, general ARM ecosystem.
Operator: Your next question comes from the line of Mark Lipacis with Jefferies. Your line is open.
Mark Lipacis: Hi, thanks for taking my question. A question for Jensen, I think. Jensen, if you look at past computing eras, typically it's one ecosystem that captures 80% of the value of that computing era — in mainframes it was IBM, in minicomputers DEC, in PCs Wintel, in cell phones Nokia and then Apple. So if you don't get the ecosystem right, you're splitting 20% of the market with a handful of players. In this next era of computing — parallel processing or AI — I think you've articulated the most compelling architectural vision of the data center of the future, with data-center-scale computing devices with CPUs, GPUs and DPUs integrated into the same box, serving all workloads in a machine-virtualized environment. Can you help us understand where the market is in embracing that vision, and where NVIDIA is in building out the ecosystem for that data-center-scale computing vision? And then maybe, as part of that, to what extent is CUDA the kernel for that ecosystem? Thank you.
Jensen Huang: Yes. I think we've done a great job building out the platforms for several ecosystems around the world. The domains that we do incredibly well in are the domains that have to do with accelerated computing — we pioneered this approach. We brought it to high-performance computing first: we accelerated scientific computing and democratized supercomputing for all researchers, so anybody who wants a supercomputer now can have one, and computing will simply no longer be the obstacle to somebody's discovery. We did the same for artificial intelligence. We did the same for visualization. We expanded the nature of gaming tremendously — our GeForce today is the largest gaming platform, the single largest body of computers used for gaming. In each case, we expanded the market tremendously. We would like to do the same for data center scale computing as it applies to virtualizing these applications. These applications have historically required dedicated systems, but they're moving into a virtualized data center environment, and we are best at doing that. They run on our platform today; we have the ability to virtualize them, put them into the data center and make them remotely available. These domains are some of the most important in the world, and so we're in the process of getting them there. By doing so, and by making our architecture available to CSPs and OEMs, we can make this accelerated computing platform available to everybody. That's how we see our journey: first, creating and architecting this platform, and then putting it literally into every single data center in the world. The next step of our journey — the Phase 3 of AI — has to do with turning every endpoint into a data center, whether it's a 5G tower, a warehouse, a retail store, a self-driving car or a self-driving truck. These are all going to be essentially autonomous data centers. They're going to run AI, but they're going to run a lot more: they're going to do security in real time, their networking is going to be incredible, and they're going to run software-defined, GPU-accelerated 5G, which we call Aerial. And so these platforms are going to become data centers. They'll be secure — the software is protected and can't be tampered with; if you tamper with it, it of course won't run. And so the capability of these clouds will move all the way out to the edge, and we're in the best position to be able to do that. So in this new world post Moore's law, in this new world where AI and software write software, in this new world where data centers are going to be literally everywhere and they're unprotected — there's no giant building with a whole bunch of people securing it — and in this new world where software is going to enable these autonomous features, I think we are perfectly positioned.
Operator: This is all the time we have for Q&A today. I will now turn the call back to CEO, Jensen Huang.
Jensen Huang: Thanks for joining us today. Q4 capped a truly breakout year for NVIDIA. The two biggest engines of our business, gaming and data center, posted powerful growth. Gaming has become the world's largest media and entertainment industry and will grow to be much larger. In gaming, people will create, play, learn and connect. The medium of gaming can host any type of game and will eventually evolve into countless metaverses — a place for play and for work. Gaming is simultaneously a great technology and a great business driver for our company. This year, we also closed our Mellanox acquisition and successfully united the amazing talent of our companies. Combined, we possess deep expertise in all aspects of computing and networking to drive the architecture of modern data centers. Cloud computing and hyperscalers have transformed the data center into the new unit of computing; chips and servers are just elements of data center scale computers now. With our expertise in AI computing, full-stack accelerated computing, our deep networking and computing expertise, and cloud-to-edge platforms, NVIDIA is helping to drive a great computer industry transformation. And our planned acquisition of Arm, the world's most popular and energy-efficient CPU company, will help position NVIDIA to lead in the age of AI. This year was extraordinary. The pandemic will pass, but the world has been changed forever. Technology adoption is accelerating across every industry. Companies and products need to be more remote and autonomous. This will drive data centers, AI and robotics, and it underlies the accelerated adoption of NVIDIA's technology. The urgency to digitize, automate and accelerate innovation has never been higher. We are ready. We look forward to updating you on our progress next quarter. Thanks a lot.
Operator: This concludes today's conference call. You may now disconnect.
Related Analysis
Nvidia Corporation (NASDAQ:NVDA) Continues to Thrive in the Tech Sector
- Nvidia's stock receives a "Buy" rating from Benchmark, highlighting its growth potential driven by advancements in AI and strong quarterly performance.
- The company's strategic focus on AI and the introduction of the new Blackwell architecture are key factors in mitigating risks and capitalizing on market opportunities.
- Increased demand for AI technology positions Nvidia as a significant player in the tech industry, with a current stock price of $139.35 and a market capitalization of approximately $3.4 trillion.
Nvidia Corporation (NASDAQ:NVDA) is a leading player in the technology sector, renowned for its graphics processing units (GPUs) and advancements in artificial intelligence (AI). On May 29, 2025, Benchmark reaffirmed its "Buy" rating for NVDA, with the stock priced at approximately $138.76. This endorsement, as reported by Benzinga, underscores Nvidia's potential for growth, driven by strong quarterly performance under CEO Jensen Huang.
Nvidia's recent quarterly results have garnered positive attention from Wall Street analysts. The company has shown robust fundamentals, with a notable increase in AI rack-scale deployments and improved gross margins. Despite challenges like US export restrictions to China, Nvidia has successfully ramped up its new Blackwell architecture, as highlighted by Bank of America. This strategic move has helped mitigate risks associated with China sales.
Gene Munster from Deepwater Asset Management discussed Nvidia's earnings on 'The Exchange,' emphasizing AI as a significant growth driver. Nvidia is well-positioned to leverage the expanding AI market, potentially boosting its stock performance. Munster's insights highlight AI's crucial role in Nvidia's strategic growth plans, aligning with the company's focus on AI advancements.
Nvidia is experiencing a surge in growth due to increased demand for AI technology. CNBC's Deirdre Bosa on 'Money Movers' discussed how Nvidia's advancements in AI are positioning it as a key player in the tech industry. The company's ability to capitalize on the growing reliance on AI across various sectors is driving its success and market presence.
Currently, Nvidia's stock price is $139.35, reflecting a 3.37% increase. The stock has traded between $137.93 and $143.49 today, with a 52-week high of $153.13 and a low of $86.62. Nvidia's market capitalization stands at approximately $3.4 trillion, with a trading volume of 312.2 million shares, indicating strong investor interest and confidence in the company's future prospects.
NVIDIA Corporation (NASDAQ: NVDA) Overview: A Look at Its Market Position and Analysts' Outlook
- Analysts' optimism for NVIDIA's stock has increased, with the average price target rising from $150 to $165.33, indicating a more positive outlook on the company's growth potential.
- The anticipation surrounding NVIDIA's earnings report is high, with some analysts, like Needham, setting a bullish price target of $230 based on the company's performance and market position.
- Despite a slowdown in sales growth, NVIDIA's market capitalization has soared to $3 trillion, reflecting confidence in its long-term prospects within the AI sector.
NVIDIA Corporation (NASDAQ: NVDA) is a prominent player in the technology sector, specializing in graphics, compute, and networking solutions. The company serves diverse markets, including gaming, professional visualization, data centers, and automotive industries. Known for its cutting-edge technologies, NVIDIA has formed strategic partnerships, such as its collaboration with Kroger Co., to enhance its market presence.
In recent months, analysts have shown varying levels of optimism regarding NVIDIA's stock. Last month, the average price target was $150, indicating a cautious short-term outlook. However, this sentiment shifted in the last quarter, with the average price target rising to $165.33. This increase suggests that analysts have become more optimistic about NVIDIA's business segments and potential growth.
The anticipation surrounding NVIDIA's earnings report is significant, as highlighted by FX Empire. Investors are closely watching the company's performance, with analysts like Needham setting a price target of $230. This target reflects a positive outlook for NVIDIA's financial performance, despite potential short-term challenges in key metrics, as noted by Wall Street analysts.
Global markets are reacting to NVIDIA's upcoming earnings report, with mixed signals observed across different regions, as reported by the Wall Street Journal. U.S. stock futures indicate a slightly weaker opening, while Asian and European markets show varied performances. Despite these fluctuations, Needham's price target of $230 underscores confidence in NVIDIA's longer-term prospects.
NVIDIA's market capitalization has reached an impressive $3 trillion, yet the company's sales growth has slowed significantly, dropping from over 250% a year ago. This slowdown raises questions about the broader implications for the AI sector, as discussed by Yahoo Finance's Jared Blikre. Nevertheless, Needham's price target of $230 suggests a positive analysis of NVIDIA's current market position and future potential.
NVIDIA Corporation (NASDAQ:NVDA) Sees Positive Analyst Outlook and Prepares for Earnings Report Amid Market Fluctuations
- NVIDIA Corporation (NASDAQ:NVDA) is trading at $135.50, with a price target of $150 set by Piper Sandler, indicating a potential upside of 10.7%.
- The company's new Blackwell chips are driving investor interest, supported by increased investment from cloud giants, highlighting NVIDIA's growth prospects.
- NVIDIA's upcoming earnings report will focus on data center growth and its strategy in China amidst US chip restrictions, with a market capitalization of approximately $3.3 trillion and a trading volume of 190.5 million shares.
NVIDIA Corporation, listed as NASDAQ:NVDA, is a leading player in the technology sector, known for its advanced graphics processing units (GPUs) and innovative chip technology. The company is currently trading at $135.50, with a recent price increase of $4.21, or 3.21%. NVIDIA's stock has seen significant fluctuations, with a 52-week high of $153.13 and a low of $86.62, reflecting its dynamic market presence.
On May 27, 2025, Harsh Kumar from Piper Sandler set a price target of $150 for NVDA, suggesting a potential upside of approximately 10.7% from its current price of $135.50. This optimistic outlook is supported by the growing demand for NVIDIA's new Blackwell chips, which are attracting significant investor interest. As highlighted by Patrick Moorhead from Moor Insights & Strategy, the increased investment from cloud giants further enhances NVIDIA's growth prospects.
NVIDIA is preparing to release its earnings report, with a focus on data center growth and its strategy in China. The company's approach in China is particularly crucial due to recent US chip restrictions, which could present challenges. Dan Howley from Yahoo Finance will be analyzing these aspects, providing insights into how NVIDIA plans to navigate these potential headwinds and maintain its market position.
The upcoming earnings report is highly anticipated, with investors eager to see how NVIDIA addresses market competition and headwinds, especially in the context of the AI trade. The company's market capitalization of approximately $3.3 trillion and a trading volume of 190.5 million shares underscore its significant influence in the technology sector. As NVIDIA continues to innovate and expand, its performance and strategic decisions remain under close scrutiny by investors and analysts alike.
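A quick back-of-the-envelope check of the quoted upside, using only the figures stated in this article (an illustrative calculation, not additional data):

```python
# Implied upside from Piper Sandler's target versus the quoted share price.
current_price = 135.50     # share price cited in the article
price_target = 150.00      # Piper Sandler price target
upside = price_target / current_price - 1
print(f"Implied upside: {upside:.1%}")   # roughly 10.7%, matching the article
```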
NVIDIA Corporation (NASDAQ:NVDA) Earnings Preview: A Look into the Future of AI Chip Technology
- NVIDIA Corporation (NASDAQ:NVDA) is set to release its quarterly earnings with an expected EPS of $0.85 and projected revenue of $43.2 billion.
- The company faces challenges in China but is anticipated to see a 64% increase in Q1 revenues, driven by AI and datacenter expansion.
- NVIDIA's financial metrics indicate high market expectations, with a P/E ratio of 44.12, showcasing optimism about its earnings potential.
NVIDIA Corporation (NASDAQ:NVDA) is a prominent player in the technology sector, known for its leadership in AI chip technology. As the company prepares to release its quarterly earnings on May 28, 2025, Wall Street analysts have set expectations with an estimated earnings per share (EPS) of $0.85 and projected revenue of approximately $43.2 billion.
NVIDIA faces a significant challenge with the potential loss of a substantial portion of its business in China, which previously accounted for 13% of its sales. Despite this, the company is expected to see a 64% increase in Q1 revenues compared to the previous year, driven by the growing demand for artificial intelligence and the expansion of datacenters.
Matt Bryson from Wedbush Securities has expressed confidence in NVIDIA's performance, downplaying concerns about the impact of DeepSeek. He emphasizes the importance of Big Tech's spending on NVIDIA, which could significantly influence the company's results. The developments in China and overall tech spending on AI are also key factors to watch in the upcoming earnings report.
NVIDIA's financial metrics reflect the market's high expectations for its future growth. With a price-to-earnings (P/E) ratio of approximately 44.12, investors are optimistic about the company's earnings potential. The price-to-sales ratio of about 24.54 and enterprise value to sales ratio of around 24.55 indicate a premium valuation, while the enterprise value to operating cash flow ratio of approximately 50 highlights the company's strong cash flow generation.
NVIDIA maintains a strong financial position with a debt-to-equity ratio of approximately 0.13, indicating low debt levels relative to its equity. The current ratio of about 4.44 further underscores the company's ability to cover its short-term liabilities with its short-term assets, showcasing its robust financial health.
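As a rough illustration of how the quoted valuation ratios fit together (using approximate figures cited in this roundup, not exact filing data; the derived values are estimates only):

```python
# Back out the figures implied by the quoted ratios.
price = 135.50              # approximate share price cited elsewhere in this roundup
pe_ratio = 44.12            # quoted price-to-earnings ratio
implied_trailing_eps = price / pe_ratio
print(f"Implied trailing EPS: ${implied_trailing_eps:.2f}")        # about $3.07

market_cap = 3.3e12         # roughly $3.3 trillion, per the roundup
ps_ratio = 24.54            # quoted price-to-sales ratio
implied_trailing_revenue = market_cap / ps_ratio
print(f"Implied trailing revenue: ${implied_trailing_revenue / 1e9:.0f}B")
```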
NVIDIA Corporation (NASDAQ:NVDA) Maintains "Buy" Rating Amid Market Volatility
- NVIDIA Corporation (NASDAQ:NVDA) continues to show strong growth potential, supported by positive earnings estimate revisions and strategic partnerships.
- The ongoing Bitcoin rally, reaching an all-time high, benefits NVIDIA due to its involvement in cryptocurrency mining technology.
- NVIDIA's partnership with Navitas to provide advanced power semiconductors for AI data centers is expected to enhance efficiency and reduce costs, indicating strong market confidence.
NVIDIA Corporation (NASDAQ:NVDA) is a leading player in the technology sector, known for its graphics processing units (GPUs) and AI solutions. On May 22, 2025, Needham maintained its "Buy" rating for NVDA, with the action being "hold," and the stock price was $133.43. This reflects confidence in NVIDIA's potential, despite the current market volatility.
Dale Smothers, a market analyst, is optimistic about NVIDIA's prospects, especially in light of the ongoing Bitcoin rally. Bitcoin has reached an all-time high of over $111,000, which benefits NVIDIA due to its involvement in cryptocurrency mining technology. This surge occurs amid a broader market downturn, driven by easing global tensions and potential Federal Reserve rate cuts.
NVIDIA's strategic partnership with Navitas is another positive development. Navitas will provide advanced power semiconductors for NVIDIA's AI data centers, aiming to enhance efficiency and reduce costs. This collaboration has already resulted in a 150% pre-market increase for Navitas, indicating strong market confidence in the partnership's potential.
Currently, NVIDIA's stock price is $133.24, showing a 1.09% increase today. The stock has fluctuated between $131.55 and $133.95 during the day, with a 52-week high of $153.13 and a low of $86.62. NVIDIA's market capitalization is approximately $3.25 trillion, reflecting its significant presence in the tech industry.
NVIDIA's growth potential is further supported by positive earnings estimate revisions. Alongside companies like Visa and PayPal, NVIDIA is expected to perform well in 2025. The company's strong market position and strategic initiatives, such as the Navitas partnership, contribute to its promising outlook.
UBS Trims Nvidia Price Target but Maintains Bullish Stance Ahead of Earnings
UBS lowered its price target on Nvidia (NASDAQ:NVDA) from $180 to $175 while reiterating a Buy rating, as the firm adjusts expectations to account for a larger-than-anticipated impact from the recent H20 export ban.
Despite the regulatory setback, Nvidia is still expected to slightly exceed its $43 billion Q1 revenue guidance, with Q2 revenue likely to come in just modestly higher. Given that many investors feared a sequential decline, this alone may be enough to maintain market confidence.
UBS sees earnings per share for Q1 around $0.76, below the Street's $0.89 forecast, largely due to lower gross margins as the company absorbs charges tied to the H20 ban. Still, the tone of Nvidia’s upcoming earnings call on May 28 is anticipated to be upbeat.
Looking ahead, growth is forecast to pick up in the second half of the year as shipments of the next-gen GB300 racks begin in late calendar Q3. Additionally, Nvidia may regain partial access to the Chinese market through a modified Blackwell-based GPU, which could offset some of the lost volume.
However, margin guidance could see slight pressure as Nvidia delays its higher-margin Cordelia boards until next year, continuing with the lower-margin Bianca configuration for now.
UBS also noted potential regulatory shifts, with the AI Diffusion Rule possibly being replaced by a direct licensing model—an outcome that could ultimately favor Nvidia if no strict caps are imposed.
The firm revised its 2026 EPS forecast to $4.22, falling short of the consensus of $4.42, with the discrepancy primarily due to lower margin expectations in the current quarter.