No. of Recommendations: 9
Just a thought experiment here, but if AMD, a company totally focused on CPUs and GPUs, cannot make a product competitive with Nvidia in the data center, how is a company like Amazon supposed to do so? Unless, that is, Amazon were to put together a multi-billion-dollar group, segment it off as a start-up, and focus it on building a better solution while keeping pace with Nvidia's product road map.

Is there a logical error in this reasoning? I mean really, can we expect a company like Google or Amazon or Facebook, which builds internal hardware only as a sideline, to keep up with Nvidia's product road map when even AMD is not able to do so?

Quantum computers, by golly, possibly.

Tinker
No. of Recommendations: 5
I cannot imagine FB, Amazon, or Google starting their own chip fab company.

It is certainly a thing they *could* do. But it would end up being far more expensive than just outsourcing that to the likes of Intel, AMD, Nvidia, ARM, etc.

I suppose a lot of it has to do with what their goal is in creating a chip fab unit. Is it to compete in the open market with those companies? Probably not. Is it because they think they can lower their own costs by doing so? Probably. Google builds its own servers and network hardware. But I'm guessing they provide specs to a white-box vendor that builds those things for them. Same with Amazon, and probably FB.

I think getting into hardware production at any level is way too far afield for any of these companies, all of which on the surface appear to have little focus in what they do (but under the covers are in fact very, very focused).

And as you pointed out, if AMD can't even compete with Nvidia, never mind their arch-nemesis Intel, what hope do FB, Google, or Amazon have of doing so when it's not even close to their core focus? If anyone were going to take on Nvidia, it would be Apple. They have already bought various chip companies and over time swapped out 3rd-party products for their own. And they recently announced they're moving their laptops away from Intel to another of their own designs.

--
Paul
No. of Recommendations: 0
The good thing for NVDA with regard to Apple is that they are not tied into each other. Apple doesn't use any NVDA products so far as I know. Not so for the other tech companies, including Google, that buy NVDA chips en masse. Apple also keeps everything so tied into its own ecosystem that it's hard to imagine them producing something that threatens NVDA, which is a company more concentrated on enabling other companies to develop their own ecosystems.
No. of Recommendations: 1
Apple is concerned about one thing, and one thing only: the customer experience. Well, under Jobs that was true. I'm beginning to wonder what the hell is happening now that he's gone. Their software is getting "updated" for the sake of updating it. Their hardware is getting useless things tacked on to it, and useful things removed from it, just for the sake of being different. I'm a long-time Apple fan, and they're really starting to piss me off with some of their design decisions. I can't decide if those decisions are made because Apple is attempting more user lock-in, or they're trying to sway the industry in different directions, or they're just plain flat out of good ideas. But I digress.


Nvidia is also concerned about one thing: creating the best GPUs available. If it so happens they can sell them to those who wish to repurpose a graphics engine into mining crypto-currencies, they're happy to do that. But they'll still make the best graphics cards out there, because that's their focus.

Apple doesn't care about the absolute best unless "best" means the best customer experience. And if that experience doesn't improve enough relative to the cost of using Nvidia, they won't bother. People don't buy Apple to do the things that benefit from such a highly performant GPU. They buy Apple these days as a status symbol. Or a gadget of convenience, like phones, watches, and tablets. Very few people buy desktops from Apple. Most are laptops. And those aren't systems people do graphics-intensive operations on. So if they can keep the costs lower by using a cheaper GPU, they will, as long as the customer experience doesn't suffer that much.

--
Paul
No. of Recommendations: 1
They buy Apple these days as a status symbol. Or a gadget of convenience,

It must be wonderful to so precisely know the motivation of all those millions of Apple users out there.
No. of Recommendations: 2
It must be wonderful to so precisely know the motivation of all those millions of Apple users out there.

I mean, they're not buying Macs for gaming. Sure, some people play games on Macs, but serious gamers are using Intel-based PCs with Nvidia graphics cards. Lots of people use Macs for work, myself included. But it's a convenience thing, not a GPU-driven need. I *could* do my work on an Intel-based PC running Windows, but that would be hugely *inconvenient*. No one is buying an iPhone out of need over an Android; it's convenience. Apple makes it easier to use. No one is buying an Apple Watch out of need. If you *need* a smart watch, it's for a specific thing, and the Apple Watch is not as good at those specific things as the alternatives.

--
Paul
No. of Recommendations: 0
I mostly buy and use Apple stuff: I like the OS better, there's no spamware, longer life expectancy, a local Apple store if I have problems, and great trade-in value. I despise Windows.

The only exception is tablets, which I use only for reading e-books; Android devices are "good enough" there. I have no interest in watches and little time to game. I do have a 27" iMac with the top-of-the-line (at that time) GPU, and it plays an excellent flight simulator (X-Plane) just fine.

No one is buying an iPhone out of need over an Android

That's kind of a dogmatic statement. One example in my family: an elderly non-tech type needs an iPhone because that is the one used by all the family members available to give hands-on help if she has a problem.
No. of Recommendations: 0
Hi Tinker

My thoughts are that the GPU is currently the best general-purpose chip for a variety of specific problems. This was kind of a lucky break for Nvidia; it just happened that GPUs were the best available option for a number of matrix-based operations.
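
To make that concrete, here is a minimal sketch in Python with NumPy (the layer sizes are hypothetical, purely for illustration) of the kind of matrix-based operation that dominates these workloads, a dense matrix multiply:

import numpy as np

batch, d_in, d_out = 256, 1024, 1024                 # hypothetical layer sizes
x = np.random.randn(batch, d_in).astype(np.float32)  # input activations
w = np.random.randn(d_in, d_out).astype(np.float32)  # layer weights

y = x @ w        # one layer's forward pass: ~270 million independent multiply-adds
print(y.shape)   # (256, 1024)

Every one of those multiply-adds can run independently of the others, which is why a chip built to shade millions of pixels in parallel happened to fit so well.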

However, there's plenty of reason to believe that more specific chips (ASICs, FPGAs) can be orders of magnitude better than GPUs for those specific problems.

It's a 'general-purpose versus domain-specific' battle, and general-purpose designs by definition tend to lose those battles.

So the GOOG/AMZN/FBs of the world don't need to care about NVDA's roadmap. They have specific problems to solve (e.g. AI training and inference), and if customers want to solve those problems, they can invest in provisioning custom chips.

The challenge for NVDA is if those datacenter-type problems (AI et al.) consolidate around a few specific workloads that can be solved by dedicated ASICs (e.g. TPUs), which can then be commoditised.



cheers
Greg
No. of Recommendations: 1
Seems to me that facial recognition and voice recognition could be specific enough to allow for an ASIC sort of solution. To date, however, not so much.

As we discussed earlier, the latest Google tensor chip costs roughly 30% more to solve the same task than the latest Volta from Nvidia does. This is of course subject to the specific problem being solved. But despite this rule of thumb, to date no ASIC has outdone an Nvidia GPU except in very specific instances such as bitcoin mining, which one might expect, since bitcoin is a very specific application.
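
As a back-of-the-envelope illustration of what "30% more expensive to solve the same task" means, here is a sketch in Python; every number in it is made up purely for illustration, with only the 30% gap echoing the figure above:

# Hypothetical prices and throughputs -- not real benchmark data.
gpu_cost_per_hour  = 1.00    # Volta-class GPU instance, per hour
tpu_cost_per_hour  = 1.00    # TPU instance, per hour
gpu_tasks_per_hour = 1300    # tasks the GPU completes in an hour
tpu_tasks_per_hour = 1000    # tasks the TPU completes in an hour

gpu_cost_per_task = gpu_cost_per_hour / gpu_tasks_per_hour
tpu_cost_per_task = tpu_cost_per_hour / tpu_tasks_per_hour
print(f"TPU premium per task: {tpu_cost_per_task / gpu_cost_per_task - 1:.0%}")
# prints: TPU premium per task: 30%

The point of framing it as cost per task is that either a lower price or a higher throughput can close the gap, which is exactly the race Nvidia and the ASIC builders are running.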

Certainly could happen, but to date Nvidia is still the superior solution even against the toughest of its competitors.

Tinker