<<<Our new offerings for the AI inference market are also gaining momentum. The recently launched TensorRT programmable inference acceleration platform opened a new market opportunity for us, improving the performance and reducing the cost of AI inferencing by orders of magnitude compared with CPUs. It supports every major deep learning framework, every network architecture and any level of network complexity. More than 1,200 companies are already using our inference platform, including Amazon, Microsoft, Facebook, Google, Alibaba, Baidu, JD.com, [indiscernible], Hikvision and Tencent.>>>
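For anyone curious what "programmable inference acceleration" actually looks like in practice, here is a minimal sketch of compiling a trained model into a TensorRT engine. It assumes the current `tensorrt` Python package and a hypothetical `model.onnx` export from one of the supported frameworks; the exact calls track recent TensorRT releases rather than the version being discussed on the call.

```python
# Minimal sketch: compile a framework-exported ONNX model into a TensorRT engine.
# Assumes the `tensorrt` Python package and a hypothetical file "model.onnx".
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

# Parse the model exported from TensorFlow, PyTorch, etc. via ONNX.
with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("ONNX parse failed")

# Build an optimized inference engine; reduced precision (FP16 here) is a
# big part of the cost/performance gap vs. CPUs that the quote refers to.
config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)
engine_bytes = builder.build_serialized_network(network, config)

# Save the serialized engine for deployment on the inference server.
with open("model.plan", "wb") as f:
    f.write(engine_bytes)
```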

This is an interesting new market opportunity for Nvidia, displacing Intel CPUs. As with what I was discussing with Noz about QCOM and its new server chips vs. Intel, we shall see whether this is a trend that starts taking real market share, or simply a marketing experiment that is touted but never really takes hold. Lots of potential upside here, I think.

Tinker
