Edited Transcript of NVDA.OQ earnings conference call or presentation 24-Feb-21 10:00pm GMT
Q4 2021 NVIDIA Corp Earnings Call
SANTA CLARA Feb 25, 2021 (Thomson StreetEvents) — Edited Transcript of NVIDIA Corp earnings conference call or presentation Wednesday, February 24, 2021 at 10:00:00pm GMT

================================================================================
Corporate Participants
================================================================================
* Colette M. Kress, NVIDIA Corporation – Executive VP & CFO
* Jen-Hsun Huang, NVIDIA Corporation – Co-Founder, CEO, President & Director
* Simona Jankowski, NVIDIA Corporation – VP of IR

================================================================================
Conference Call Participants
================================================================================
* Aaron Christopher Rakers, Wells Fargo Securities, LLC, Research Division – MD of IT Hardware & Networking Equipment and Senior Equity Analyst
* Christopher James Muse, Evercore ISI Institutional Equities, Research Division – Senior MD, Head of Global Semiconductor Research & Senior Equity Research Analyst
* John William Pitzer, Crédit Suisse AG, Research Division – MD, Global Technology Strategist and Global Technology Sector Head
* Mark John Lipacis, Jefferies LLC, Research Division – MD & Senior Equity Research Analyst
* Stacy Aaron Rasgon, Sanford C. Bernstein & Co., LLC., Research Division – Senior Analyst
* Timothy Michael Arcuri, UBS Investment Bank, Research Division – MD and Head of Semiconductors & Semiconductor Equipment
* Vivek Arya, BofA Securities, Research Division – Director

================================================================================
Presentation
——————————————————————————–
Operator [1]
——————————————————————————–
Good afternoon. My name is Mariama, and I will be your conference operator today. At this time, I would like to welcome everyone to NVIDIA's financial results conference call.
(Operator Instructions) I will now turn the call over to Simona Jankowski, NVIDIA’s Vice President of Investor Relations and Strategic Finance, to begin the conference. ——————————————————————————– Simona Jankowski, NVIDIA Corporation – VP of IR [2] ——————————————————————————– Thank you. Good afternoon, everyone, and welcome to NVIDIA’s conference call for the fourth quarter of fiscal 2021. With me on the call today from NVIDIA are Jensen Huang, President and Chief Executive Officer; and Colette Kress, Executive Vice President and Chief Financial Officer. I’d like to remind you that our call is being webcast live on NVIDIA’s Investor Relations website. The webcast will be available for replay until the conference call to discuss our financial results for the first quarter of fiscal 2022. The content of today’s call is NVIDIA’s property. It can’t be reproduced or transcribed without our prior written consent. During this call, we may make forward-looking statements based on current expectations. These are subject to a number of significant risks and uncertainties, and our actual results may differ materially. For a discussion of factors that could affect our future financial results and business, please refer to the disclosure in today’s earnings release, our most recent forms 10-K and 10-Q and the reports that we may file on Form 8-K with the Securities and Exchange Commission. All our statements are made as of today, February 24, 2021, based on information currently available to us. Except as required by law, we assume no obligation to update any such statements. During this call, we will discuss non-GAAP financial measures. You can find a reconciliation of these non-GAAP financial measures and GAAP financial measures in our CFO commentary, which is posted on our website. With that, let me turn the call over to Colette. ——————————————————————————– Colette M. Kress, NVIDIA Corporation – Executive VP & CFO [3] ——————————————————————————– Thanks, Simona. 
Q4 was another record quarter, with revenue exceeding $5 billion and year-on-year growth accelerating to 61%. Full year revenue was also a record at $16.7 billion, up 53%.

Our Gaming business reached record revenue of $2.5 billion in Q4, up 10% sequentially and up 67% from a year earlier. Full year Gaming revenue was a record $7.8 billion, up 41%. Demand is incredible for our new GeForce RTX 30 Series products based on the NVIDIA Ampere GPU architecture. In early December, we launched the GeForce RTX 3060 Ti, which joined the previously launched RTX 3090, 3080 and 3070. The entire 30 Series lineup has been hard to keep in stock, and we exited Q4 with channel inventories even lower than when we started. Although we are increasing supply, channel inventories will likely remain low throughout Q1. GeForce RTX 30 Series graphics cards were a holiday sensation, due not just to their amazing performance, but also to their rich features, including our second-generation RTX ray tracing technology and DLSS, our AI-powered performance accelerator, which massively boosts frame rates in graphically demanding titles. Three dozen games now support RTX, including the top battle royale game, Fortnite; the top role-playing game, Cyberpunk 2077; the top massively multiplayer online game, World of Warcraft; and the best-selling game of all time, Minecraft. RTX has clearly set the new standard in gaming. Building on this momentum, at CES in January, we introduced a wave of Ampere architecture gaming products, including our biggest-ever laptop launch, powered by GeForce RTX 3060, 3070 and 3080 laptop GPUs with our third-generation Max-Q technology. These new thin and lightweight gaming laptops increase performance and energy efficiency by up to 2x from the prior generation. RTX 3060 laptops start at $999 and are faster than previous-generation laptops that sold for $2,500.
The incredible performance, design and price points of these new laptops will delight the growing universe of gamers and creators as well as students and professionals. The gaming laptop market has grown sevenfold in the past 7 years, and momentum is building. With top OEMs bringing to market a record 70-plus laptop models based on the GeForce RTX 30 Series, GeForce laptops, as a whole, are the fastest-growing and one of the largest gaming platforms. Also at CES, we announced the GeForce RTX 3060 GPU priced at $329, extending the 30 Series desktop lineup further into the mainstream. We expect strong demand when it launches this Friday, as 60-class GPUs have traditionally been our most popular products.

Starting with the 3060, we're taking an important step to maximize the supply of GeForce GPUs for gamers. Users are constantly discovering new applications for our powerful programmable GPUs, and cryptocurrency mining is one of them. With rising ethereum prices, there are indications that miners are buying GPUs. We would like GeForce GPUs to end up with gamers, so we have created new software drivers that will detect the ethereum mining algorithm, cutting in half the mining efficiency of the GeForce RTX 3060. We suspect the significant increase in the ethereum network hash rate observed over the past few months was driven by a combination of previously installed mining capacity that was reactivated as well as new sales of GPUs and ASICs. Since our GPUs are sold to graphics card manufacturers and then on to distribution, we don't have the ability to accurately track or quantify their end use. Our estimates suggest that cryptomining contributed $100 million to $300 million to our Q4 revenue, a relatively small portion of our Gaming revenue in Q4. Cryptocurrencies have recently started to be accepted by companies and financial institutions and show increased signs of staying power.
To address industrial ethereum mining demand, last week we announced a new line of NVIDIA CMPs, or cryptomining processors. Shipments will start in March. CMPs lack display outputs and have other optimizations that improve cryptomining power efficiency. CMP products will let us gain some visibility into the contribution of cryptomining to our overall revenue. For Q1, we estimate that CMP will contribute approximately $50 million. We plan to sell these products to industrial miners, and we will quantify their contribution each quarter for transparency.

Over the past year, it has become clear that we've entered a new era in which gaming is an integral part of global culture. The number of concurrent users on Steam has more than doubled since 2018 and continues to hit new records. In 2020 alone, more than 100 billion hours of gaming content was watched on YouTube, and 0.5 billion people watched eSports. Increasingly, we aren't just gaming; we're also watching sports, attending concerts, creating content and connecting with our friends in virtual environments. Additionally, we are excited about new experiences like VR. Significantly more content is now available, including arguably the first VR killer app, Beat Saber. And there are now almost 2 million VR users on Steam. With these powerful structural shifts, we expect our Gaming business to remain on a robust growth trajectory. The GeForce RTX 30 Series GPUs have kicked off a powerful upgrade cycle, and we estimate only around 15% of GeForce gamers own an RTX-class GPU, which is needed to experience the beautiful ray-traced graphics of modern games. Moreover, the universe of gamers is rapidly expanding, and the reach of GeForce has extended beyond gamers to some 45 million creators. In addition, Gaming revenue continues to benefit from a favorable mix shift as gamers and creators keep moving to higher-end GPUs. We expect another great year for GeForce.
Earlier this month, we celebrated the 1-year anniversary of the GeForce NOW cloud gaming platform, which is now over 6 million members strong. GeForce NOW offers 800 PC games from over 300 publishers, more than any other cloud gaming service, including 80 of the most played free-to-play games. Starting with support for Windows PCs, Macs and Android devices, we added support in recent months for Chromebooks, iPhones and iPads. GFN has grown globally, with service in more than 65 countries and more added regularly by our GeForce NOW alliance partners.

Moving to Pro Viz. Q4 revenue was $307 million, up 30% sequentially and down 10% year-on-year, ahead of our expectations. Full year revenue of $1.1 billion was down 13%. Strong sequential growth was driven primarily by a recovery in desktop workstations as some customers returned to the office and enterprises resumed purchases that had been deferred by the pandemic. Notebook GPUs grew sequentially to a record as enterprises continue to support remote workforce initiatives. Looking ahead, the reopening of businesses will benefit desktop workstations, but longer-term workforce trends will likely shift our mix to notebook GPUs and cloud offerings. Health care was a standout vertical in the quarter, with significant orders from GE, Siemens and Oxford Nanopore Technologies. Public sector and automotive also showed strength. Omniverse, our real-time 3D collaboration and simulation platform, is now in open beta. Over 500 creators and professionals have tested Omniverse through our early access program. Omniverse is one of our most important and exciting platforms. We are delighted by its initial acceptance and look forward to sharing more details on its long-term growth opportunity in the coming months.

Moving to automotive. Q4 revenue was $145 million, up 16% sequentially and down 11% year-on-year.
Full year revenue of $536 million declined 23%. Sequential growth was driven by continued recovery in global automotive production volumes and growth in AI cockpit revenue. The year-on-year decline reflects the expected ramp-down of legacy infotainment. NVIDIA has emerged as the industry's leading end-to-end, full-stack technology provider for self-driving and AI-enabled vehicles. Orin, the SoC our DRIVE self-driving platform is built on, delivers an unrivaled 254 trillion operations per second with industry-leading power efficiency, helping to revolutionize the transportation industry. Our technology leadership has driven a robust, rapidly growing set of opportunities. We have great momentum with an expanding list of electric vehicle OEMs, including Nio, SAIC, Li Auto and Xpeng, which are all using the NVIDIA DRIVE platform to power their next generation of vehicles. We look forward to growing with them as they continue to scale. Our software-defined platform is the only solution that spans from the data center, for training deep neural nets and running physically accurate simulations, to a full-stack, in-car solution scaling from ADAS to Level 5 fully autonomous functionality. Autonomous vehicle companies are harnessing this technology. Zoox recently unveiled its Level 5 bidirectional robotaxi powered by NVIDIA. Einride launched its next-generation cab-less autonomous truck using NVIDIA DRIVE Orin. And earlier this year, Mercedes announced a 56-inch-wide MBUX Hyperscreen powered by NVIDIA AI cockpit technology. This win builds on our momentum with Mercedes' first-generation MBUX system, which is now in 1.8 million cars. We are in the early innings of a significant opportunity. We have built a multibillion-dollar design win pipeline for our self-driving and AI cockpit solutions, which will drive a material inflection in revenue over the next few years.
Our transformational partnership with Mercedes announced last June demonstrates the power of our evolving business model as we expand our addressable market and layer in software revenue. We are exceptionally well positioned to capitalize on the significant opportunity that lies ahead.

Moving to Data Center. Revenue was $1.9 billion, which exceeded our expectations; it was comparable to last quarter and up 97% from the year-ago period, which did not include Mellanox. Data Center compute revenue was up 45% year-on-year. Full year Data Center revenue rose 125% to a record $6.7 billion, including almost 70% growth from Data Center compute. From a sequential perspective, Data Center compute's stronger-than-expected double-digit growth more than offset the anticipated decline in Mellanox revenue, which included a large nonrecurring networking sale to a single OEM in Q3. Compute growth was led by vertical industries, where OEM partners continued ramping A100-based servers and our own DGX system sales were strong. Vertical industries were well over 50% of Data Center revenue across compute and networking, with particular strength in supercomputing, financial services, higher education and consumer Internet verticals. Additionally, hyperscale customers continue to deploy the A100, driving both sequential growth and exceptionally strong year-on-year growth in Data Center compute. The A100 has been adopted by all major cloud customers globally and is being deployed by hyperscale customers for internal workloads. Still, we are in the early stages of adoption and expect continued growth this year. The ramp of the A100 has been smoother and accompanied by better visibility than the prior generation. Its universal AI training and inference capabilities, support for a wider set of applications and outstanding performance are driving high customer utilization, a clear sign of the A100's value.

Turning to Mellanox.
We are seeing continued strong traction and robust momentum across our customer sets. Mellanox revenue was up over 30% from its Q4 revenue in calendar 2019, when it was still a stand-alone company. Year-on-year growth in the quarter was led by hyperscale and large consumer Internet customers, which grew over 60% from last year, with several contributing record revenues. Consistent with our outlook, Mellanox had a sequential decline, impacted by a nonrecurring sale to a China OEM in Q3. We expect a return to sequential growth in Q1, driven by strong demand for our high-speed networking products, including the ramp of ConnectX adapters with CSPs and all major server OEMs in their upcoming refresh. We also see strong momentum in high-performance computing with HDR InfiniBand products. For example, we won 6 of the 7 supercomputers awarded over the past few months by EuroHPC. Starting next quarter, we will continue to provide color on networking as part of the Data Center market platform, but we will no longer break out Mellanox revenue separately.

Looking forward, we are incredibly excited about the opportunities in Data Center. Accelerated computing is not only delivering super Moore's Law gains in performance but is also an energy-efficient and cost-effective method of computing. And virtually every industry is adopting this technology with greater urgency as companies adapt to the new world of more distributed workers and customers. As industries embark on this journey, they are also increasingly focused on combating climate change. To that end, the A100 performs AI computations with 1/20th the power consumption of CPUs. It powers our Selene supercomputer, which is #1 on the Green500 list of the world's most efficient supercomputers. Indeed, NVIDIA-powered machines recently captured 25 of the top 30 spots on the Green500 list.
Accelerated computing is not only serving the exponential growth in demand for compute, it can also help bend the power consumption curve. With accelerated computing, NVIDIA is pioneering a path forward for the computing industry.

Before I move to the P&L and outlook, let me give you an update on our proposed acquisition of Arm. In September, we announced plans to acquire Arm from SoftBank Group in a transaction that will create the premier computing company for the age of AI. At that time, we said it would take approximately 18 months to secure regulatory approvals in the U.S., the U.K., the EU, China and other jurisdictions. Thorough reviews are typical with a deal of this size. This process is moving forward as expected. We are in constructive dialogue with the relevant authorities and are confident that regulators will see the benefits to the entire tech ecosystem. As we have said, this combination will spur competition. Together, Arm and NVIDIA will provide greater choice to the data center ecosystem, a compelling alternative CPU architecture for the market, and further enhance Arm's offering in mobile and embedded. Our intention is to increase investment in Arm's existing road map and add resources to stimulate growth in new markets. We love and intend to maintain Arm's open licensing model, a commitment guaranteed both by long-term legally binding contracts as well as our own interest in ensuring this investment is a profitable one for us. We are on the cusp of a new age in which AI fuels industries ranging from health care to scientific research to the environment. With this transaction, our vision is to boost Arm's potential so it can thrive in this new era and grow into promising new markets.

Moving to the rest of the P&L. Q4 GAAP gross margins were 63.1%, and non-GAAP gross margins were 65.5%. GAAP gross margins declined year-on-year due to amortization of developed technology acquired from Mellanox, partially offset by product mix.
The sequential increase was due to higher margins for Gaming GPUs and lower IP-related costs, partially offset by a lower-margin mix in our Data Center portfolio. Non-GAAP gross margins increased by 10 basis points year-on-year and were flat sequentially, in line with our expectations. Q4 GAAP EPS was $2.31, up 51% from a year earlier. Non-GAAP EPS was $3.10, up 64% from a year ago. Q4 cash from operations was a record $2.07 billion.

With that, let me turn to the outlook for the first quarter of fiscal 2022. Revenue is expected to be $5.3 billion, plus or minus 2%, with most of the sequential growth driven by Gaming. GAAP and non-GAAP gross margins are expected to be 63.8% and 66%, respectively, plus or minus 50 basis points. GAAP and non-GAAP operating expenses are expected to be approximately $1.67 billion and $1.2 billion, respectively. For the full year, we expect to grow non-GAAP OpEx in the mid-20% range. GAAP and non-GAAP other income and expenses are both expected to be an expense of approximately $50 million. GAAP and non-GAAP tax rates are both expected to be 10%, plus or minus 1%, excluding discrete items. Capital expenditures are expected to be approximately $300 million to $325 million. Further financial details are included in the CFO commentary and other information available on our IR website.

In closing, let me highlight upcoming events for the financial community. We will be virtually attending the Raymond James Institutional Investors Conference on March 1; the Morgan Stanley Technology, Media and Telecom Conference on March 3; and the Arete Virtual Semis Conference on March 3. In addition, we will be hosting a Virtual Investor Day on Monday, April 12, following the livestream of Jensen's opening keynote at our GPU Technology Conference. Our earnings call to discuss our first quarter results is scheduled for Wednesday, May 26. We will now open the call for questions. Operator, would you please poll for questions?
================================================================================
Questions and Answers
——————————————————————————–
Operator [1]
——————————————————————————–
(Operator Instructions) Your first question comes from the line of C.J. Muse with Evercore ISI.
——————————————————————————–
Christopher James Muse, Evercore ISI Institutional Equities, Research Division – Senior MD, Head of Global Semiconductor Research & Senior Equity Research Analyst [2]
——————————————————————————–
I guess, Jensen, a higher-level question for you on the enterprise side. You're now a couple of quarters into the ramp of A100, and curious if you could speak to whether you've seen any surprises here. Any areas of specific strength worth calling out, and any changes to how you're thinking about the size of this opportunity?
——————————————————————————–
Jen-Hsun Huang, NVIDIA Corporation – Co-Founder, CEO, President & Director [3]
——————————————————————————–
Yes. Thanks a lot, C.J. So as you know, A100 is a very different type of GPU. This is our first universal computing GPU. It's great at high-performance computing. It's great at data analytics. It's great at training. And for our highest-end GPU, it's also the first time that it's incredible at inference; it's some 20x faster than the previous generation. We introduced with it some really exciting new computational formats like TF32, TensorFloat-32, for training. And with multi-instance GPU, we can turn 1 GPU into a whole bunch of smaller, autonomous GPUs to improve performance and reduce latency. And so the capability is really quite exciting. We're seeing strength in hyperscalers as they continue to accelerate their adoption of AI. Some of the new applications we've spoken about a couple of times before: the transition to deep learning for conversational AI, from speech recognition to natural language understanding, all the way to speech synthesis, is now based on deep learning.
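For readers unfamiliar with the TF32 format mentioned here: it keeps float32's 8-bit exponent (so it covers the same numeric range) but shortens the mantissa to 10 bits, matching FP16's precision. As a rough illustration only, not NVIDIA's implementation (the real rounding happens inside the GPU's Tensor Cores), the precision loss can be simulated on the CPU by rounding away the 13 low mantissa bits of a float32 value; the `simulate_tf32` helper below is a name of our own choosing:

```python
import struct

def simulate_tf32(x: float) -> float:
    """Round a value to TF32-like precision (8-bit exponent, 10-bit mantissa).

    Host-side sketch using simple round-to-nearest; hardware rounding
    details may differ.
    """
    # Reinterpret the value as the 32 raw bits of an IEEE 754 float32.
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    # Round to nearest on the 13 mantissa bits TF32 discards, then clear them.
    bits = (bits + (1 << 12)) & ~((1 << 13) - 1)
    return struct.unpack("<f", struct.pack("<I", bits))[0]

# Values exactly representable in 10 mantissa bits pass through unchanged;
# everything else is rounded, which is the precision TF32 trades for speed.
print(simulate_tf32(1.5))
print(simulate_tf32(3.14159265))
```

The design point, as the remarks above suggest, is that matrix math keeps FP32's range while running at Tensor Core throughput, which is why TF32 can be a drop-in default for training.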
The other area that's growing incredibly fast is deep learning recommender models. Just about everything that you do on the Internet is based on recommenders. There are hundreds of different recommenders out there, whether they're recommending shopping items, music, news, search results or ads. And so all of these different types of applications are driving that. For the first time, we saw our industrial data center business grow to be larger than hyperscale. And we're seeing industrial applications across scientific computing, where simulation-based approaches are now being fused with AI approaches for weather simulation, genomics, molecular dynamics simulation, quantum chemistry, even simulating quantum computing, which is one of the really exciting areas. We're seeing AI being deployed for big data analytics with RAPIDS, NVIDIA's open-source platform for data analytics, and Spark 3.0, which NVIDIA really led and is now GPU-accelerated. So now you can do big data analytics in the cloud on all of the CSP platforms. We're seeing a lot of excitement around financial services. Financial services and consumer Internet services are all really growing nicely. And so A100 adoption is just starting. I mean we're going to see a couple of years of continued growth ahead of us as AI gets adopted in clouds and industries.
——————————————————————————–
Operator [4]
——————————————————————————–
Your next question comes from the line of Vivek Arya with BofA Securities.
——————————————————————————–
Vivek Arya, BofA Securities, Research Division – Director [5]
——————————————————————————–
Just a clarification and then a question for Jensen. On the clarification, Colette, I was hoping you could give a little more color around Q1. Do you still expect the Data Center to grow sequentially in Q1?
I know you said that most of the growth will come from Gaming, but any color on the Data Center would be useful. And then Jensen, the question for you is: in your press release, you used the phrase, AI driving the smartphone moment for every industry. Could you help us quantify what that means? And where I'm going with that is, is there a number in terms of what percentage of servers are shipping today with your accelerators? And where can that ratio go over time? Is that a fair way of looking at the adoption of your technology and AI?
——————————————————————————–
Colette M. Kress, NVIDIA Corporation – Executive VP & CFO [6]
——————————————————————————–
So thank you, Vivek. Your question regarding the guidance as we lead into Q1: we had indicated that, yes, a good percentage of our growth between Q4 and Q1 comes from Gaming, but we also do expect Data Center to grow. Most of our sequential growth will come from Gaming, but keep in mind, we also expect all of our market platforms will likely be able to grow quarter-over-quarter.
——————————————————————————–
Jen-Hsun Huang, NVIDIA Corporation – Co-Founder, CEO, President & Director [7]
——————————————————————————–
It's because we're entering the third phase of AI. The first phase of AI was when we invented the computing platforms: the new chips, the new systems, the new system software, the new middleware, the new way of working, the new way of developing software, which the industry, the world, is now starting to call MLOps. The way that software is developed and the way that it's deployed is completely different than in the past. In fact, I heard a great term, Software 2.0, and it makes a lot of sense. It's a computer that is writing software. The way that you develop software is completely different. The way you compute is different. And that was our first phase, and that journey started some 8 or 9 years ago now.
The second phase was the adoption of using this in an industrial way for clouds. And we saw it revolutionize new services, whether it's speech-oriented services or search-oriented services or recommender services; the way you shop, the way you use the Internet is completely different because of it. And so that's really the second phase. And those 2 phases are still continuing to grow, and you're still seeing the growth associated with that. The third phase is the industrialization of AI. When I say kind of a smartphone moment, I mean that it's a device with AI. It's autonomous. And it's connected to a cloud service, and it's continuously learning. We're working with companies all over the world; we have some 7,000 AI start-ups we're working with, and almost all of them are developing something like this. And large industrial companies, whether it's John Deere or Walmart, they're all developing applications kind of like this. And basically, it's an autonomous system, an autonomous machine; in our case, it's called Jetson. It's a robotics machine. If that robotics machine is a car, it's called DRIVE. And it's running an AI application on top, and this could be moving things around. It could be picking and placing. It could be just watching a warehouse, monitoring traffic and keeping traffic flowing. It could be connected to a car. And whenever the fleet of cars needs to be retrained because of a new circumstance that was discovered, the cloud service would do the relearning and then redeploy it to all of the autonomous devices.
And so in the future, we're seeing that these industries, whether you're in retail or logistics or transportation or farming, from ag tech to consumer lawn mowers, they're not going to just be products that you buy and use from that point forward; it will likely be a connected device with an AI service that runs on top of it. And so these industries are so excited about it because it gives them an opportunity to change the way that they interact with their customers. Rather than selling something once, they sell something and provide a service on top of it, and they can stay engaged with the customers. The customers get a product that's improving all of the time, just like your smartphone. And that's the reason why I've been calling it a smartphone moment for all these industries. We saw what happened with the smartphone revolution, and then we saw what happened with the smart speaker revolution. You're going to see smart lawn mowers, smart tractors, smart air conditioners, smart elevators, smart buildings, smart warehouses, even robotic retail stores: the entire retail store is like a robot. And they will all have autonomous capability. They'll all be driven by AI. So what's new for the industry, therefore, is that all of the enterprises in the world used to have computers and IT to host their employees and their supply chain. But in the future, all of these industries, whether you're in medical imaging or in lawn mowers, you're going to have data centers that are hosting your products, just like the CSPs. And so that's a brand-new industry. And we have a platform that we call EGX, which is our 5G edge AI system. And we have the autonomous systems we call AGX, which is what goes into Jetson and DRIVE.
And between those 2 systems and the software stack that we have on top of them, we're in a great position to help these industries, one at a time, transform their business model from an object-oriented, thing-based business model to a connected-device business model.
——————————————————————————–
Operator [8]
——————————————————————————–
Your next question comes from the line of Stacy Rasgon with Bernstein Research.
——————————————————————————–
Stacy Aaron Rasgon, Sanford C. Bernstein & Co., LLC., Research Division – Senior Analyst [9]
——————————————————————————–
First, I don't want to be pedantic, I suppose. But I guess, on the Q1 guide, you're saying that Gaming is the majority of the growth. Was that an absolute statement? Or was that a percentage statement? Can you give us some idea of how you sort of rank the sequential percentage growth of, say, Gaming versus Data Center versus other, especially since it sounds like you've got $50 million in crypto-specific stuff that will go into the other. And then I guess, just briefly, could you give us some indication of where your supply situation and lead times are on your Ampere parts within Data Center? I think you said last quarter they were many months, 6 months plus. Are they still looking like that? And is that sort of the limiting factor at this point in terms of what you can actually ship on the compute side of the data center?
——————————————————————————–
Jen-Hsun Huang, NVIDIA Corporation – Co-Founder, CEO, President & Director [10]
——————————————————————————–
Colette will take one, and I'll take the other.
——————————————————————————–
Colette M. Kress, NVIDIA Corporation – Executive VP & CFO [11]
——————————————————————————–
Sure. Let me start off, Stacy, in terms of our guidance for Q1. As you know, we're still in the early innings of our Ampere architecture, as it relates to Gaming as well as to Data Center.
As we articulated in our call, we have been seeing continued uplift in customers’ adoption of A100, and it’s going more smoothly than what we had seen with prior versions. So when we think about our guidance for Q1, there are many different ways the quarter could conclude relative to what we said. But all of our platforms can grow, and the majority of the growth from Q4 to Q1 will likely be Gaming. ——————————————————————————– Jen-Hsun Huang, NVIDIA Corporation – Co-Founder, CEO, President & Director [12] ——————————————————————————– You asked a question about lead time. At the company level, we’re supply-constrained. Our demand is greater than our supply. However, for Data Center, so long as the customers work closely with us and we do a good job planning between our companies, there shouldn’t be a supply issue for Data Center. We just have to do a good job planning. We have direct relationships with each one of the world’s CSPs, and we have direct relationships with all the OEMs, and we can do excellent planning between us. We shouldn’t be supply-constrained there. But at the company level, we’re supply-constrained; demand is greater than supply. We usually have enough supply to achieve better than the outlook. We had that situation in Q4, we expect that situation in Q1, and we have enough supply to grow through the year. But supply is constrained, and demand is really, really great. And so we just have to do a really good job planning. Meanwhile, one of the things that really came through for us is that we have the world’s best operations team. Our company really has an amazing operations team. We build the most complex products in the world: the most complex chips, the most complex packages, the most complex systems. And during Q4, they improved our cycle time. 
And during Q1, I’m expecting them to improve our cycle time again. We really are blessed to have such an amazing operations team, and during these times, it really comes in handy. But overall, at the company level, while we expect demand to be greater than supply, we have enough supply to do better than the outlook, and we have enough supply to grow each quarter throughout the year. ——————————————————————————– Operator [13] ——————————————————————————– Your next question comes from the line of Timothy Arcuri with UBS. ——————————————————————————– Timothy Michael Arcuri, UBS Investment Bank, Research Division – MD and Head of Semiconductors & Semiconductor Equipment [14] ——————————————————————————– I had a question on crypto. I guess, Jensen, I know that the CMP stuff and the software driver stuff that you’re doing for the 3060, that’s going to help a lot. But I think that there are 4 or 5 of the big currencies that are going to move, or at least are on a path to move, from proof of work to proof of stake, which is going to be a lot less compute-intensive. So I guess the question that I get a lot is how do you assess the degree to which that drives GPUs back into the secondary market? Is there any way that you can get kind of a handle on that? ——————————————————————————– Jen-Hsun Huang, NVIDIA Corporation – Co-Founder, CEO, President & Director [15] ——————————————————————————– Yes. If you look at the recent hashrate, first of all, the transition is going to take some time. It can’t happen overnight. People have to build trust in the new approach, and so it’ll take a little bit of time. But I hope it does. I mean, I hope that people move to proof of stake over time, and then a little bit of these questions don’t have to be answered. However, I don’t have that much optimism either that it will be all proof of stake. I think that proof of work is a very legitimate way of securing a currency. 
And in the beginning, while any currency is building its reputation, it’s going to take something like proof of work to do so. And so I think proof of work is going to be around for a bit. We developed CMP for this very reason, just so that there are different versions. We have different versions of our products for gaming, for professional visualization, for high-performance computing, for deep learning. It stands to reason we have the ability to do a different version for CMP. And we can sell it directly. The way that we go to market is direct to the industrial miners. And it’s a great benefit to them, so that they don’t have to chase around spot markets. It’s a great benefit to the gamers, because they want to game, and the gaming demand is just incredible; it’s off the charts. And so I think this is going to be really beneficial to everybody. The recent hashrate growth was really a result of several dynamics. The first dynamic is the installed base. Most people thought that when mining slows, the GPUs come back into the aftermarket. A small part does that; some people do that. But the vast majority don’t sell them. And the reason for that is because, obviously, they believe in Ethereum. They’re industrial miners; that’s what they do. And so they keep the gear around for when profitability returns and they can kick-start their mining operations. That’s what we saw in the latter part of last year: we saw the hashrate starting to grow, and most of that was a result of the installed miners reactivating their equipment. It wasn’t until earlier this year that we started to see demand for our own GPUs. And when that starts to happen, there are several different dynamics. The primary source these days comes from powerful ASICs. And then there’s some that comes from our GPUs and other GPUs in the marketplace. And so I think that this is going to be a part of our business. It won’t grow extremely large, no matter what happens. 
And the reason for that is because when it starts to grow large, more ASICs come into the market, which kind of mutes it. And when the market becomes smaller, it’s harder for ASICs to sustain the R&D, and so the industrial miners come back to GPUs, and then we’ll create CMPs. And so we expect it to be a small part of our business as we go forward. Now one of the important things to realize is that in the near term, we’re in the beginning parts of our Ampere ramp, only 2 quarters into a multiyear cycle, and this is also the first time that we’ve completely changed computer graphics. RTX using ray tracing is completely different from rasterization. And so this is a fundamental change in the way we do computer graphics, and the results have been spectacular. There is some 200 million installed base in desktop, some 50 million in laptop. And only approximately, I think it’s something like 15%, of that installed base has been upgraded to RTX. And so there’s a giant installed base, and the installed base is growing, that we need to upgrade to the next generation of computer graphics. ——————————————————————————– Operator [16] ——————————————————————————– Your next question comes from the line of John Pitzer with Crédit Suisse. ——————————————————————————– John William Pitzer, Crédit Suisse AG, Research Division – MD, Global Technology Strategist and Global Technology Sector Head [17] ——————————————————————————– I want to go back to Data Center. You’ve been very kind over the last couple of quarters to call out Mellanox, both when it was a positive driver and when it was a headwind. I’m kind of curious, when you look into the fiscal first quarter, is there anything of distinction to mention around Mellanox versus core Data Center? 
And I guess as a follow-on, the key metric that a lot of investors are looking at is when does the core Data Center business’ year-over-year growth start to reaccelerate. And some of that is just simple math; you’re comping very hard compares from last year. But Jensen, how would you think about Data Center year-over-year growth in the context of a reopening trade or of any sort of new applications out there? What helped last time around was the move to natural language AI; is there another big sort of AI application we should be thinking about as we think about Data Center growth reaccelerating? ——————————————————————————– Jen-Hsun Huang, NVIDIA Corporation – Co-Founder, CEO, President & Director [18] ——————————————————————————– Yes. Mellanox was down this last quarter, and our compute business grew double digits and more than offset the decline in Mellanox. We expect Q1 to be a growth quarter for Mellanox, and we expect this coming year to be quite an exciting year of growth for Mellanox. The business is growing in Ethernet, it’s growing for CSPs, it’s growing in InfiniBand for high-performance computing, and the switch business grew 50% year-over-year. And so we’re seeing really terrific growth there. One of the new initiatives, where we’re going to see success toward the second half because of the number of adoptions and number of engagements, is our new BlueField DPUs. It’s used for virtualization for hyperscalers. It’s also used for security. As you know quite well, the future of computing is cloud, and it’s multi-tenant cloud. And there’s no VPN front door to the cloud. You’ve got millions of people who are using every aspect of your computing. So you need to have distributed firewalls; you can’t have it in just one place. The intense focus on security across all of the data centers around the world is really creating a great condition for BlueField, which is really perfect for that. 
And so I expect our Mellanox networking business to grow very nicely this year. And we expect Q1 to be a great growth quarter for compute as well as Mellanox. The great driving applications for AI are several. Last year, you’re absolutely right that it was natural language understanding, with the Transformer model and, at the core of it, BERT and other models like that, which really made it possible for us to enable all kinds of new applications. So you’re going to see natural language understanding do text completion, and it’s going to be integrated; I think it was just announced today that it’s going to be integrated into Microsoft Word. We’ve been working with them on that for some time. And so there are some really exciting applications like that. But the new one that emerged recently is deep learning-based conversational AI, where the ASR, the speech recognition, as well as the speech synthesis are now based on deep learning. They weren’t before; they were based on models that ran on CPUs. But now with the deep learning models, the accuracy is much, much higher, and they have the ability to also mimic your voice and be a lot more natural. But these models are much more complex and much larger. The other big, huge driver is recommenders. This is something really worthwhile to take a look at. It’s called deep learning recommender models. Recommenders, whether for shopping or personalizing websites or personalizing your store, recommending your basket, recommending your music, have historically used traditional machine learning algorithms. But because of the accuracy, and just the extraordinary economic impact that comes from an incremental 1% in accuracy, for most of the world’s large Internet businesses, people are moving very rapidly to deep learning-based models. And these models are gigantic, utterly gigantic. And this is an area that is really driving high-performance computing. 
So I expect us to feel a lot of momentum. And the last one is the one that you and I spoke about, which has to do with industrial, 5G and edge IoT types of applications for all of the different industries, whether it’s retail, logistics, transportation, agriculture, warehouses or factories. And so we’re going to see AI and robotics in a very large number of applications and industries. And we’re just seeing so much excitement there. ——————————————————————————– Operator [19] ——————————————————————————– Your next question comes from the line of Aaron Rakers with Wells Fargo. ——————————————————————————– Aaron Christopher Rakers, Wells Fargo Securities, LLC, Research Division – MD of IT Hardware & Networking Equipment and Senior Equity Analyst [20] ——————————————————————————– I wanted to go back again to the Data Center business. You just mentioned, Jensen, the BlueField-2 product poised to ramp and materialize in the back half of the calendar year. How do you see that? Is it an attach rate? I think there have been some discussions in the past about how all servers could potentially, over time, incorporate this layer of acceleration. How quickly should we think about that ramp? And then the second question is, can you just, at a high level, talk about how you’re thinking about a CPU strategy in the context of the broader data center market? ——————————————————————————– Jen-Hsun Huang, NVIDIA Corporation – Co-Founder, CEO, President & Director [21] ——————————————————————————– Sure. If I could just work backwards, I believe that every single data center node will be outfitted with a DPU someday. And that someday is probably, call it, 5 years from now. And the fundamental driver of it is going to be security. Every single application in the data center and every single node in the data center has to be individually secured. 
Confidential computing and zero-trust computing: these initiatives are going to cause every data center to have every single application and every single node be secured. Which means every one of those computers has to have a control plane that is isolated from the application plane. And the applications cannot share the same resources as the control plane, because any application could be an intruder. No application can have access to the control plane. And yet today, in software-defined data centers, software-defined networking, software-defined storage, all of the security agents are running on the same processors as the applications, and that has to change. You see the CSPs in the world moving in this direction. Every single data center will have to move in this direction. So every node will have a DPU processing the software for the infrastructure. You’re essentially going to see the data center infrastructure be offloaded from the application plane, and it will be something like a BlueField. So I think this is our next multibillion-dollar opportunity. On CPUs: we support every CPU in the world, and we’re the only accelerated computing platform that accelerates every CPU. Ironically, the only thing we don’t accelerate for AI is Arm, but we want to change that. Arm has such an exciting future because the nature of their business model and the nature of their architecture are perfect for the future of hyperscalers and data centers. You want the most energy efficiency in every single data center because every data center is power-constrained. We are going to be power-constrained in every aspect of computing going forward. And so we would love to build around the Arm processor and invest in building a great ecosystem around it, so that all the world’s peripherals and all the world’s applications work on it, just as they do on any one of the CPUs that we know today. 
And we’re going to start with high-performance computing and with AI, all the areas where we have a lot of expertise, and build out our platform. So you’re starting to see one industry leader after another embrace Arm, and I think that’s terrific. But now we’ve got to energize it with all of the ecosystem support. It can’t just be vertical applications; we want to create a broad, general Arm ecosystem. ——————————————————————————– Operator [22] ——————————————————————————– Your next question comes from the line of Mark Lipacis with Jefferies. ——————————————————————————– Mark John Lipacis, Jefferies LLC, Research Division – MD & Senior Equity Research Analyst [23] ——————————————————————————– A question for Jensen, I think. Jensen, if you look at past computing eras, typically, it’s one ecosystem that captures 80% of the value of that computing era. In mainframes, it was IBM; in minicomputers, DEC; in PCs, Wintel; in cell phones, Nokia and Apple. So if you don’t get the ecosystem right, then you’re splitting 20% of the market with a handful of players. So in this next era of computing, parallel processing or AI, I think you’ve articulated the most compelling architectural vision of the data center of the future, with data center-scale computing devices with CPUs, GPUs, DPUs integrated in the same box, serving all workloads in, I imagine, a virtualized environment. Can you help us understand where the market is in embracing that vision? And where is NVIDIA in building out that ecosystem for that data center-scale computing vision? And then maybe, as part of that, to what extent is CUDA the kernel for that ecosystem? ——————————————————————————– Jen-Hsun Huang, NVIDIA Corporation – Co-Founder, CEO, President & Director [24] ——————————————————————————– Yes. I think we’ve done a great job building out the platforms for several ecosystems around the world. 
And the domains that we do incredibly well in are the domains that have to do with accelerated computing. We pioneered this approach, and we brought it to high-performance computing first. We accelerated scientific computing and democratized supercomputing: anybody who wants to have a supercomputer now can, and computing will simply not be the obstacle to somebody’s discovery. We did the same for artificial intelligence. We did the same for virtualization. We expanded the reach of gaming tremendously; our GeForce today is the largest gaming platform, the single largest body of computers that are used for gaming. And in each case, we expanded the market tremendously. We would like to do the same for data center-scale computing as it applies to virtualizing these applications. These applications are also in that process: they’ve historically required dedicated systems, but they’re moving into a virtualized data center environment. And we are best at doing that. They run on our platform today. We have the ability to virtualize them, put them into the data center and make them remotely available. And so these applications, these domains, are some of the most important domains in the world. And so we’re in the process of doing that. By doing so, and making our architecture available to CSPs and OEMs, we can make this accelerated computing platform available to everybody. And so that’s the journey you’re seeing: first, creating and architecting this platform, and then putting it literally into every single data center in the world. The next step of our journey is Phase III of AI, and it has to do with turning every endpoint into a data center, whether it’s a 5G tower, a warehouse, a retail store, a self-driving car, a self-driving truck. They’re all going to be essentially autonomous data centers. 
And they’re going to run AI, but they’re going to run a lot more. They’re going to do security in real time. Networking is going to be incredible; they’re going to run software-defined 5G, the GPU-accelerated 5G we call Aerial. And so these platforms are going to become data centers. They’ll be secure. The software is protected, and you can’t tamper with it; if you tamper with it, it, of course, won’t run. And so the capability of these clouds will move all the way out to the edge. And we’re in the best position to be able to do that. So I do think in this new world of post-Moore’s Law, post-Dennard scaling, in this new world where AI writes software, in this new world where data centers are going to be literally everywhere and unprotected, with no giant building and a whole bunch of people to secure them, and in this new world where software is going to enable this autonomous future, I think we are perfectly positioned. ——————————————————————————– Operator [25] ——————————————————————————– This is all the time we have for Q&A today. I will now turn the call back to CEO, Jensen Huang. ——————————————————————————– Jen-Hsun Huang, NVIDIA Corporation – Co-Founder, CEO, President & Director [26] ——————————————————————————– Thanks for joining us today. Q4 capped a truly breakout year for NVIDIA. The 2 biggest engines of our business, Gaming and Data Center, posted powerful growth. Gaming has become the world’s largest media and entertainment industry and will grow to be much larger. In gaming, people will create, play, learn and connect. The medium of gaming can host any type of game and eventually evolve into countless metaverses, some for play, some for work. Gaming is simultaneously a great technology and a great business driver for our company. This year, we also closed our Mellanox acquisition and successfully united the amazing talent of our companies. 
Combined, we possess deep expertise in all aspects of computing and networking to drive the architecture of modern data centers. Cloud computing and hyperscalers have transformed the data center into the new unit of computing. Chips and servers are just elements of data center-scale computers now. With our expertise in AI and full-stack accelerated computing, our deep networking and computing expertise, and our cloud-to-edge platforms, NVIDIA is helping to drive a great computer industry transformation. And our planned acquisition of Arm, the world’s most popular and energy-efficient CPU company, will help position NVIDIA to lead in the age of AI. This year was extraordinary. The pandemic will pass, but the world has been changed forever. Technology adoption is accelerating across every industry. Companies and products need to be more remote and autonomous. This will drive data centers, AI and robotics, and it underlies the accelerated adoption of NVIDIA’s technology. The urgency to digitize, automate and accelerate innovation has never been higher. We are ready. We look forward to updating you on our progress next quarter. Thanks a lot. ——————————————————————————– Operator [27] ——————————————————————————– This concludes today’s conference call. You may now disconnect.

