NVDA - 2nd thoughts, or maybe 3rd

I’d be curious how others who are invested in Nvidia overcame these objections, or do you just feel that AI will be so big that the rest doesn’t matter?

  1. I looked at the revenue growth by business segment and saw that several segments are growing rapidly and are still relatively small compared to gaming. I think those that are growing fast have a lot of room to run.

  2. I view NVDA as a call option on AI (e.g. machine learning). I think more and more companies may be “forced” to adopt machine learning in order to compete. Companies that use it will outcompete those that don’t, so everyone will need to use it. Since NVDA chips enable the faster processing required for effective machine learning, demand for NVDA chips should rise.

  3. As an add to #2, AI might significantly change the way decisions get made by offloading more and more decisions to computers. Could demand for computing really explode?

I have a 5% position in NVDA.

Chris

6 Likes

There are plenty of negatives about the company, just none of them are what Denny calls “Barbarians at the Gate”.

Contrary to the perception on this board, and by most, NVDA really has little to no real competition. Gasp! {goes the crowd}. The conventional wisdom comes out and says, “they are a chip company…a chip company…and they have no recurring revenue!” Yeah, whatever.

Don’t believe me? Look at NVDA’s margins and ROIC. There are few companies of this size in the entire world with similar margins. Wait…I know two…QCOM and INTC. And yes, it can be said that neither QCOM nor INTC has any real competition either. Gasp! {goes the crowd again} Intel has AMD, QCOM has Samsung and multiple others! Really? On the margins, that is about it. Neither Intel nor QCOM has any competitor that is threatening their market dominance, their ROIC, or their margins.

Similarly, NVDA has no competitors. Like INTC, NVDA has AMD. AMD is really the only competitor to INTC’s x86 chip architecture that runs PCs, and AMD is the only competitor (of any volume) to NVDA’s GPUs. Look at AMD’s margins: 50% or less of the margins that INTC and NVDA have, and yet THEY ARE IN THE SAME BUSINESS! In fact, although it is hard to prove, AMD is said to have the finest chip designers in the world. Who cares.

You buy AMD if you don’t want to pay the Intel tax. As soon as AMD comes out with a chip good enough to steal material marketshare from Intel, Intel drops its prices and steps up its R&D, and AMD cannot keep up with the product cycle and simply cannot provide the whole product (which goes beyond new hardware chips to software, support, packaging, and all the other things that go into high volume “commodity” {not} chips). To make matters worse, INTC controls the x86 architecture, and each new generation changes things. You can run an emulator, but next generation you need a new emulator to emulate what INTC is doing.

AMD tries the same with NVDA, particularly in the gaming market. Even here, AMD cannot keep up with NVDA. NVDA controls the high end, the place where real profits exist, and AMD cannot keep up with each new product cycle. To go further, eSports, as an example, is growing into the largest sport in the world, with more people playing eSports (and growing) than any other sport, and by far, I think. NVDA is a lead sponsor of eSports, and is as branded in eSports and gaming graphics as Intel is with “Intel Inside”.

It is so bad for AMD that even good news is bad news. AMD’s latest GPU chip for gaming is selling out; you cannot find it except on the “black” market. Why? It is the best chip for mining a specific sort of cryptocurrency. Problem though: AMD and everyone else knows that this demand, for this use, is transitory (it will end when the mania for this particular currency ends, or when an ASIC is developed, if it becomes a longer term demand source). But is AMD ramping up production to meet this demand? No. Why? AMD does not want to get stuck with the inventory. In the meantime, there is no supply of these AMD chips to be used in gaming. Thus, even those who want an alternative to NVDA GPUs for gaming are turning to NVDA - growing NVDA’s marketshare and brand. AMD cannot win by trying sometimes. And they have never been able to.

That is but one market for NVDA. NVDA dominates this market like INTC dominates the PC market. More than 70% of NVDA’s current customers are using legacy GPUs that will inevitably be upgraded, particularly as AR, VR, and gaming grow. Even the gaming market has long-term growth for NVDA.

Other markets are AI and autonomous driving. There are several segments to these markets that I won’t go into. Suffice it to say that Nvidia is the leading autonomous driving technology enabler in the world. No one disputes this. NVDA is in every Tesla out there, and Tesla, if it sticks to its plan, is 2 to 3 years ahead (at least) of any competitor. And TSLA has the most partnerships in the industry. NVDA’s position on this (and it makes sense) is that once TSLA shows what is possible, not only will NVDA’s partners (the most numerous and most important) stay with them and want to accelerate their programs, but the entire industry will move towards NVDA and the single proven working autonomous solution in the world.

Sure, you have INTC with Mobileye (it paid $15 billion for MBLY), which is behind and desperate to catch up, and already airing millions of dollars of commercials promoting AI and autonomous driving.

You have Waymo, owned by Google, with Morgan Stanley saying it will have a $70 billion market cap if it IPOs (despite no real customers, no real product, and a current product tens of thousands of dollars too expensive for any mass market). Dubious.

But if correct, what value to place on NVDA’s autonomous driving platform alone?

AI. NVDA has a near monopoly on GPU chips used for training machines in AI. A little less, but roughly equivalent to INTC’s 99% marketshare in server chips (where is AMD!). A few things make NVDA so powerful in training: (1) ASICs and CPUs are not good at training machines. They cannot compete in this function, and no one else makes GPUs that can compete with NVDA in the data center, giving NVDA an incredible competitive position there; (2) CUDA.

CUDA is basically the equivalent (a little different, but close enough) of x86 for INTC. The AI programming universe has standardized on CUDA, just like the PC market has on x86. As such, you cannot just make an equivalent or slightly better per-cost GPU chip and hope to gain marketshare on NVDA. Just as you are either going to buy a Windows computer, or a Chromebook, or a Mac, you’re not going to buy a PC based just on price if it runs an operating system that you don’t want or use.

I am not alone in this; it is well known in the industry. In fact, there are professional boards discussing this very issue of “how did we let a proprietary software system become the standard in our industry!” It is not like the industry wanted to standardize on CUDA. They just did.

There is an alternative to CUDA; it is an open source language. Guess who maintains it? NVDA. NVDA chips run CUDA, but NVDA maintains this alternative. This is what AMD chips run. And from the talk in the industry (again, from many sources, but also from professional sources), this open source language is just fine, it works great. However, it usually takes a year or so to receive updates, and does not have the same power as CUDA, at least not easily.

Here is a random piece on CUDA from a Hewlett Packard researcher - just one example, but read his opinion of CUDA: http://www.nvidia.com/content/cuda/spotlights/gpu-accelerate…

Shocking! HP sells many a CUDA powered server with NVDA GPU chips in them.

I will stop there on the good. There is a heck of a lot more that could be said. As for negatives:

(1) Although GPUs are unchallenged in training, GPUs are not unchallenged in executing AI once trained. Google, as an example, has in-housed its own TPUs to run AI execution, and these ASIC chips run their specific function better and faster than GPUs can. An ASIC, however, is designed for a very specific task, whereas a GPU (like a CPU) can be used for general tasks and more flexibly programmed to do multiple tasks. NVDA is fighting to win this execution market of AI. It is quite material. There is no guarantee that NVDA will win this market.

(2) Autonomous vehicles. There is an open question as to how many dollars NVDA gets per autonomous car. I don’t know. It is either enormous or meager. I hope people out there can help me. The autonomous driving industry is also inevitable, but who knows how it will play out. If it plays out like Elon Musk thinks it will, NVDA will remain the leading vendor in the world.

Can Google compete? (They are almost not competing in the same market.) NVDA, with Musk, is out to create a car that can drive itself 10x, 100x better than a human can (but not perfectly) - always better than humans can drive. Google is building a car that will never, ever crash or make a mistake. Its equipment costs more than $70k per car at this point in time (yes, it will come down with scale, but that is a long way to come down).

INTC is perhaps a stronger competitor with Mobileye. But Tesla dumped Mobileye (or vice versa - who cares; obviously Tesla did not need Mobileye), and second, you are competing against INTC, which has sat on its x86 monopoly for decades, and has had to make more than $30 billion in acquisitions of two companies just to have product in the AI market. The standard line in the brokerage world is that turning INTC is like turning a huge seagoing vessel. It takes a long time, but once you get it turned around…not exactly the recipe for a flexible and hungry competitor, but we shall see.

(3) The edge. NVDA obviously has an advantage on the edge in automobiles. But what about in other products? NVDA is mostly ahead of everyone here as well, and they continue to produce product, but mobile edge products are a very nascent thing. Some say that NVDA’s dominance in the automobile will lead to competitive advantages in other edge products such as IoT (Internet of Things) or in drones, planes, boats, trucks (another place NVDA is a leader - not sure if THE leader), robots, who knows what.

Here, Apple is showing interest in competing. And we still do not know what Apple is doing in autonomous vehicles (other than that everything is currently vaporware, and, as I was reminded to my annoyance again today, Siri still really is not that good).

So not really negatives, but challenges going forward, uncertainties going forward. NVDA has margins and ROIC that you see only from “monopolies” - not from nefarious means, but due to the structure of the market. Whether it is path dependency, product architectures that get standardized on, branding, scale, speed of product cycles, whatever, NVDA has it.

NVDA is still owned and run by its founder (a 6% holding, still not selling - this is his baby), and is one of the best managed businesses that I have ever seen, bar none.

Because of this, I am quite comfortable buying and holding NVDA. It is a fascinating company to follow, a fascinating industry to follow, so I will stay on top of things, but the company itself is a rare company. It really has no real competition, any more than AMD was real competition for INTC. Unlike INTC, however, where it was unimaginable that anyone could upset its x86 monopoly once it was set, NVDA has similar advantages, but that does not mean that better solutions cannot be found and gain material marketshare (but it won’t be AMD).

I am not one of those who sees a few days of a down market and goes buy, buy, buy. Ridiculous. Stock markets move on longer cycles. Look at the 52 week highs and lows: a bit more than 5-10% differentiates the highs from the lows, and these swings do not happen in a linear fashion. So I don’t play that game, but from a business perspective, NVDA is a rare company. I have used the term Gorilla in the Geoffrey Moore sense largely because of CUDA. But as you can see, NVDA has multiple competitive advantages, and multiple markets where it holds a 70% or greater marketshare.

Tinker

81 Likes

“I don’t remember anyone expressing any negatives or worries about the company.”

The negative case is not too hard to imagine. The bulk of compute workloads (AI included) are moving to a few large cloud providers who operate at a scale where they can run a hardware R&D program to design their own ASICs, which are intrinsically better suited to the workload than GPUs. Google has the TPU (Tensor Processing Unit) and Apple is reportedly close to announcing the Apple Neural Engine. In terms of other competition, Intel recently purchased Nervana, which was also designing a custom ASIC. Even if GPUs are still used in the training phase, that’s a smaller part of the AI/ML pie.

Even if that’s not the case, they are selling into terrible markets (cloud providers, which have lots of leverage over suppliers, and automobile components).

On top of that, they are trading at 10x next year’s estimated sales. Priced for perfection.

Final note: cryptocurrencies have no intrinsic need for mining at all. It exists to gamify network participation by introducing artificial difficulty with a large reward in order to increase the number of network participants. If a large retailer or fintech company were to operate its own private blockchain, there’s no reason it would impose artificial constraints on block hashes (which is what requires the GPUs). Anyway, even looking at Bitcoin miners, they’ve all moved on to ASICs, too.
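To make the “artificial constraints on block hashes” concrete, here is a minimal proof-of-work sketch in Python - a CPU toy for illustration only; real miners run this same brute-force search billions of times per second, which is exactly where GPU (and later ASIC) demand comes from:

```python
import hashlib

def mine(block_data: str, difficulty: int):
    """Try nonces until SHA-256(block_data + nonce) has `difficulty` leading hex zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest  # the "winning" nonce and its hash
        nonce += 1

# Each extra required leading zero multiplies the expected search work by 16 -
# that tunable difficulty is the artificial constraint described above.
nonce, digest = mine("example block", 2)
```

A private blockchain run by one trusted operator could simply skip this search entirely, which is the poster’s point: no imposed difficulty, no need for mining hardware.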

4 Likes

“I don’t remember anyone expressing any negatives or worries about the company.”

Here’s a few for you Saul…

  1. Qualcomm comes along and boots Nvidia out of graphics and servers the way it booted them out of mobile
  2. Graphics and Gaming becomes a saturated market and upgrade cycles lengthen
  3. AMD out performs them in Graphics for Gaming and VR markets
  4. Microsoft keeps the WINTEL duopoly going through X86 standards and re-asserts itself in servers
  5. Cyclical chip industry tending towards commoditisation hits Nvidia
  6. AI turns out to be a crock of old nonsense, or even if AI does deliver, chips aren’t the answer (perhaps the solution is at the Big Data end rather than the peripheral end with the processor chips)
  7. ARM system licensees outperform off the shelf purveyors e.g. Apple A10 vs Qualcomm Snapdragon

Ant

12 Likes

“I don’t remember anyone expressing any negatives or worries about the company.”

Here’s a few for you Saul…

Ant, I sure hope the market never ceases to do its proper job of eliminating all needless profit. For any business you can always come up with multiple threats. The investor’s task, a difficult one, is to separate the wheat from the chaff. How credible or likely are the ones you listed? Wasn’t Intel going to eat ARM’s lunch?

Would you repost your list of threats with an indication of the likelihood of any of them succeeding? Since one can take one’s chips off the table at any time – which is hard to do in real estate, for example – the liquidity of the stock market removes the need to worry about distant threats. Saul is good at being nimble.

Denny Schlesinger

1 Like

I haven’t read any of this thread, so apologies to start.

But I’ll give you two things to think of off the bat:

  1. Commoditization. There’s no denying that NVDA is the biggest and most prolific player in the move to AI, but it won’t be that way forever. It’s so profitable that it’s attracting competition, and with sales to OEMs, it’s not about the best chip, but the good enough one.

  2. Customer concentration: One customer accounted for 12% of all revs last year.

Very surface level, but thought I’d throw it out there.

Brian
No position

1 Like

The conventional wisdom comes out and says, “they are a chip company…a chip company…and they have no recurring revenue!”

Another way of looking at recurring revenue: let us say there is a $50B server market, and say Intel owns 80% market share; then Intel has $40B of recurring revenue. Of course, that $40B could move up or down a bit due to economic cycles, product launches, etc. But dismissing it as no recurring revenue, while paying a higher multiple for recurring revenue (which I view as false security in the SaaS model, where it is easy to move away) to software companies, doesn’t make much sense. On the other hand, if you are paying a higher multiple to software companies for their superior margins and lower cap-ex, that is a different story.

Wow! I can’t believe all the help I’ve gotten from the board on this. Yesterday and today I increased my position up to 2.0% from 0.5% (probably won’t go any higher).

thanks to you all.

Saul

Unfortunately, there aren’t a lot of great comparisons for NVDA. Not a lot of companies that do what it does, growing nearly as fast, nearly as profitable, etc.

Intel has been discussed. Intel’s market cap is around 175B. NVDA’s is just over half that, around 95B.

But Intel’s revenue in the March quarter was almost 15B. Net income was 3B. ~20% net margin

Nvidia’s revenue was 2B, and Net Income was 500M. ~25% net margin

So it’s half the price of Intel but has 1/8 of the revenue and 1/6 of the profits. I think it’s safe to say NVDA is expensive. The question, of course, is how big can it get? How much longer can it keep growing at 20%, 30%, 40%? How many chips are there going to be?

Those are difficult questions, but I see companies around the same PS ratio as NVDA (which is around 12 or 13) that are growing much faster and where the runway ahead seems much clearer. Their names are Talend, Shopify, Mulesoft, Hortonworks…you get the picture.
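The ratios in this post can be sanity-checked quickly. A sketch using the rough, rounded figures quoted above (assumptions, not exact filings):

```python
# Rough figures from the post, in $ billions (rounded; not exact filings)
intc_cap, intc_rev_q, intc_ni_q = 175, 15, 3    # Intel: market cap, quarterly revenue, net income
nvda_cap, nvda_rev_q, nvda_ni_q = 95, 2, 0.5    # Nvidia: same

price_ratio = nvda_cap / intc_cap      # ~0.54 -> about half of Intel's price
rev_ratio = intc_rev_q / nvda_rev_q    # 7.5   -> roughly 1/8 the revenue
profit_ratio = intc_ni_q / nvda_ni_q   # 6.0   -> 1/6 the profits

intc_margin = intc_ni_q / intc_rev_q   # 0.20 -> ~20% net margin
nvda_margin = nvda_ni_q / nvda_rev_q   # 0.25 -> ~25% net margin

# Price/sales on annualized quarterly revenue, consistent with the
# "around 12 or 13" PS ratio quoted above
nvda_ps = nvda_cap / (nvda_rev_q * 4)  # 11.875
```

Nothing fancy - it just confirms the half-the-price / eighth-the-revenue / sixth-the-profits framing is internally consistent with those rounded numbers.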

Bear

6 Likes

I see companies around the same PS ratio as NVDA (which is around 12 or 13) that are growing much faster and where the runway ahead seems much clearer. Their names are Talend, Shopify, Mulesoft, Hortonworks…you get the picture.

Before someone corrects me, I realize SHOP’s PS is higher…but it’s growing faster. And Talend’s and HDP’s PS ratios are actually much lower than NVDA’s.

Bear

Bear,

I think when people talk about Intel and NVIDIA, they are comparing where NVDA is now to where INTC was in the early 90s, at the beginning of the PC boom, rather than the fundamentals of where both are currently.

Intel also comes up because it is a competitor. People can debate how much of a threat they are currently and potentially.

There is no doubt NVDA is expensive. Also that they have a positive future.

For me, I am happy with where they are in my portfolio but will be keeping a close eye out for anything that comes up to give me pause. They are currently at 5%, mostly by growing, not by funds allocated. I would like to say I won’t trim just based on valuation, but if they get really crazy expensive, I will have to see.

Kevin

1 Like

Hit submit too soon.

With that being said, I agree there are other companies that look as good or better.
I added a little SHOP yesterday, increasing my position by about 20%.

Kevin

So it’s half the price of Intel but has 1/8 of the revenue and 1/6 of the profits. I think it’s safe to say NVDA is expensive.

This is an oversimplified comparison. Intel is known for its “Fabulous Fabs” which, in addition to being fabulous, are fabulously expensive. If NVDA does not have chip fabs, then the comparison you are making does not fit reality. A favorite comparison is between the Wintel Twins:

http://softwaretimes.com/pics/intc-msft.png

People are a lot cheaper than chip factories.

Who makes NVIdia chips? How are they manufactured?

Nvidia: TSMC Remains Our Primary Manufacturing Partner for 16nm FinFETs and 10nm

https://www.quora.com/Who-makes-NVIdia-chips-How-are-they-ma…

Denny Schlesinger

I think when people are talking about Intel and NVIDIA, they are comparing where NVDA is now to where INTC was in early 90s

I know it was 25 years ago, but Intel’s market cap was around $10B or less. If NVDA’s were $20B or $30B, maybe it would be close to what Intel was then.

NVDA’s is almost 100 billion. That’s an order of magnitude different from Intel in the early 90’s.

Just sayin’,
Bear

1 Like

<<<I think when people are talking about Intel and NVIDIA, they are comparing where NVDA is now to where INTC was in early 90s, at the beginning of the PC boom rather than fundamentals of where both are currently.>>>

Intel is not growing. Its growth rate is in the single digits. That is why Intel paid $15 billion for Mobileye, as an example: desperation, and also defense. INTC dominates the data center - 99% marketshare with its CPUs. The autonomous car is the new data center, but on the edge. It is like a cellular phone or iPad, not just on steroids, but a troop or a battalion of such devices on steroids. It is a mobile data center that is connected wherever it goes.

Things not even discussed: what does it mean if you have data centers everywhere, all the time, driving around, always connected? What will this mean for the distributed computing of the future? What will this mean for information delivery (such as ads in the car to locked-in recipients)? Many collateral aspects of autonomous driving are not even being discussed. Everyone is thinking linearly.

Keeping with linear thought, INTC fears that owning the edge with AI technology (as the autonomous vehicle is the ultimate such device) will also lead to advantages in providing technology to the data center. Thus, INTC not only wants the edge business, it needs it to defend its future data center monopoly. Thus it spent more for a company like Mobileye than you would even have seen in the age of the Internet bubble.

But these are other conversations. I just wanted to put valuations into the equation. If you adjust for multiple vs. growth rate (and I have not done that calculation yet), I would wager NVDA probably looks pretty good in comparison to INTC. INTC maintains the multiple it has with no growth because it has such an extreme lock on its CAP. Different factors, different ways to derive a multiple.

The market rewards growth, and the market rewards CAP, and the market rewards TAM. INTC only has CAP left, whereas once, long ago, it had all in spades.

Tinker

2 Likes

I know it was 25 years ago, but Intel’s market cap was around $10B company or less. If NVDA’s was 20B or 30B maybe it would be close to what Intel was then.

$10B 25 years ago is probably equivalent to $30B today.
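That adjustment is just compound inflation; a quick hedged check (the rates are assumptions: at an assumed 3% average, $10B grows to about $21B over 25 years, so the $30B figure implies closer to 4.5% per year):

```python
def inflate(amount_b: float, annual_rate: float, years: int) -> float:
    """Grow a dollar amount (in $B) at a constant assumed inflation rate, compounded annually."""
    return amount_b * (1 + annual_rate) ** years

low = inflate(10, 0.03, 25)    # ~20.9 ($B) at an assumed 3% average rate
high = inflate(10, 0.045, 25)  # ~30.1 ($B) at an assumed 4.5% rate
```

Either way, the gap with NVDA’s ~$95B market cap stands: even the generous adjustment leaves early-90s Intel at roughly a third of NVDA’s current size.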

I am not sure I have seen it mentioned, but another area that creates demand for NVDA GPUs is crypto-currency mining. It seems that CPUs matter less, and most use high powered GPUs to make it effective. Otherwise, you probably spend more on electricity than you gain in value. Though some of the crypto-currencies may also be disruptive to large data farms and other places that utilize GPUs.

<<<I am not sure I have seen it mentioned but another area that creates demand of NVDA GPUs is for crypto-currency mining. It seems that CPUs matter less and most use high powered GPUs to make it effective. Otherwise, you probably spend more in electricity than you gain in value. Though, some of the crypto-currencies may also be disruptive to large data farms and other places that utilize GPUs.>>>

The battle between ASICs and GPUs can be seen in crypto-currencies. Once a crypto-currency becomes a longer term thing, that gives someone the incentive to invent an application-specific ASIC for it that can displace the GPUs.

But for crypto-currencies that don’t have staying power, there is not enough lead time nor incentive to do so, and thus GPUs remain the only viable means of dealing with the currency.

It must be said, though, that crypto-currencies keep changing, making it harder to produce an application-specific ASIC for the particular currency, thus requiring GPUs.

In another win for GPUs over ASICs, an AI consulting firm raised more than $100 million in venture capital funding today. I won’t go into the details, but this company has a very high pedigree. What they want to do is compete in AI against the big players like Google and Microsoft and Amazon.

But it does not sound like this business will work with ASICs. This is because its primary offering is software delivery to provide AI solutions. And the hardware they use will need to be flexible (like a CPU), not specific to one function (as an ASIC would be). Thus this business will not turn to ASICs, but to GPUs, for its infrastructure - not only for machine learning (as discussed in my prior post on the subject) but also for execution of the AI learning.

Google, as an example, has developed an ASIC, called a TPU, for its data centers. But these TPUs only work for specific functions. Google still uses GPUs throughout its data centers. NVDA’s top of the line new product combines its own tensor cores with its GPUs, so NVDA is not ignoring this.

However, where the need is more general - like what we want out of a CPU - instead of for one specific task (or set of similar specific tasks), you are not going to replace GPUs with an ASIC.

How all this plays out will be seen. But it is also an example to instruct on why an ASIC vs. a GPU, and the complexity of this, I think, will lead to further advantage for NVDA, as long as GPUs are a material part of this AI infrastructure. Mixing and matching can be a nightmare (as we have seen with Intel in the data center - no one mixes and matches Intel chips with AMD chips, and very few enterprises do so with PCs either, even though AMD PC CPUs are mainstream, although a minority of marketshare). There is a lot of complexity that goes into programming and running these things, and mixing vendors has not been best practice in these industries in the past.

Tinker

4 Likes

(probably won’t go any higher)

Probably will, you just won’t have to add to it.

Cheers
Qazulight

Chip fabs were Intel’s not so secret weapon. They cost over $1 billion each, and Intel had lots of trade secrets.
Nvidia will need to find and keep other edges. The cost was a benefit for Intel in the early days, but lots of people know how to make chips efficiently today, and the hugely expensive and challenging shrinking of die sizes seems to be near an end.