
Tuesday, February 20, 2024

NVIDIA, $NVDA ("ENVIDIA") AND THE "OPEN AI" "REVOLUTION"

$NVDA, ITS RECYCLED "#AI REVOLUTION" AND THE DARK SIDE KEPT AWAY FROM THE PUBLIC

$NVDA convinced everyone that we are just at the beginning of the #AI revolution and that its GPUs will sell like hotcakes. Yes, we are at the beginning of the #AI revolution. However, $NVDA isn't going to be part of it as much as they (desperately) want people to believe, and I am about to tell you why.

$NVDA, desperate for revenue growth, needed an "AI REVOLUTION"… in 2015!

For FY 2015, $NVDA reported $5bn in annual revenues, a mere 7% YoY growth. By 8 February 2016, its stock had fallen from $8.24 at the end of 2015 to $6.43, a -22% move against a -12% drop in $QQQ (yes, a forgotten era for #tech #stocks). This is when $NVDA came across an interesting group of data scientists who were trying to launch a non-profit AI project for the benefit of humanity. The name of the project was #OpenAI.

It had been no secret in Silicon Valley that $NVDA GPUs could be used effectively for deep learning models since 2012, when the paper "ImageNet Classification with Deep Convolutional Neural Networks" was published (Ilya Sutskever, one of #OpenAI's co-founders, was one of its three authors) [Pic 1]. However, what was clearly missing was the "revolution" that could make #AI mainstream, and $NVDA saw it coming with #OpenAI and its supporter at the time, our dear @elonmusk.

In the first comment below you can find the link to $NVDA's "AI Revolution" presentation from 2016 (Pic 2), and I'm sure that if I hadn't shared the date, no one would have guessed it is 8 years old. Does everyone now agree with me that the PR campaign we are seeing today is recycled from what was previously attempted?

Sadly, $NVDA didn't start selling a gazillion GPUs to power the #AI revolution in 2016, because at that time Silicon Valley hadn't yet been bitten by the #fomo bug. So another "revolution" was urgently needed to sell GPUs and unload the pile of unsold inventory that had accumulated: #crypto mining was the answer (see post below). $NVDA's growth went bust again as soon as that #fomo bubble, which they of course were pushing to the extreme with non-stop PR campaigns, burst. For FY 2019, $NVDA recorded a -6.5% drop in revenues. This is when #OpenAI came in handy again for $NVDA, thanks to the $1bn investment the not-so-non-profit company had just received from $MSFT. Everyone is familiar with what came after that, so there is no need to expand further here.

With the capacity they have already bought from $NVDA, all cloud operators (and major $NVDA customers) have plenty of capacity to fulfil their own and their cloud customers' requests for years to come:

1 - GPUs are better than CPUs for training deep learning models, but for other types of models and workloads they don't have a significant advantage (research paper in Pic 3); see the micro-benchmark sketch after this list.

2 - GPUs are far more expensive and 3-5 times more energy-consuming than CPUs. As such, $NVDA's (legitimate) clients have been working for years on how to optimise their use (Pic 4).

3 - It's no secret that $NVDA's (legitimate) clients are on their way to becoming $NVDA's competitors, developing chips better optimised for #AI needs (rather than repurposed from gaming) and, most of all, available at much lower costs.
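
To make point 1 concrete, here is a minimal micro-benchmark sketch, assuming PyTorch is installed and, optionally, a CUDA GPU is present. The matrix size and iteration count are illustrative choices of mine, not figures from the cited paper: a large dense matrix multiply is exactly the kind of workload where GPUs shine, while branchier, memory-bound work narrows the gap considerably.

```python
# A minimal CPU-vs-GPU timing sketch (assumption: PyTorch is installed;
# the GPU path runs only if a CUDA device is present). Matrix size and
# iteration count are illustrative.
import time

import torch

def bench(fn, iters=20):
    fn()  # warm-up, so one-off setup costs don't pollute the timing
    if torch.cuda.is_available():
        torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(iters):
        fn()
    if torch.cuda.is_available():
        torch.cuda.synchronize()  # wait for queued GPU kernels to finish
    return (time.perf_counter() - start) / iters

n = 4096
a, b = torch.randn(n, n), torch.randn(n, n)
print(f"CPU matmul: {bench(lambda: a @ b):.4f} s/iter")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    print(f"GPU matmul: {bench(lambda: a_gpu @ b_gpu):.4f} s/iter")
```

Run it on a dense matmul and the GPU wins by a wide margin; swap in a small, irregular workload and the advantage largely evaporates, which is the point the research paper makes.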

GPU data centres are very bad for the environment.

Sorry to break the spell here, but GPUs are the "hippos" of the data centre in terms of energy consumption. To quickly understand why these GPU data centres have limited growth potential (as #crypto mining rigs did…) and why the cloud providers running them will face a significant cost problem on this front too, I strongly recommend the @TechSpot article "The Rise of Power: Are CPUs and GPUs Becoming Too Energy Hungry? - An Unnecessary Price to Pay" (Pic 5 & 6 - in comment).
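
To put the energy point in rough numbers, here is a back-of-envelope sketch. Every figure in it (power draw, electricity price, fleet size) is an assumption of mine for illustration, applying the 3-5x ratio from point 2 above rather than measured data.

```python
# Back-of-envelope electricity cost for a CPU fleet vs an equally sized
# GPU fleet. ALL numbers below are illustrative assumptions, not data.
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.10        # assumed industrial electricity price, $/kWh
CPU_WATTS = 250             # assumed per-unit CPU power draw
GPU_WATTS = 4 * CPU_WATTS   # mid-point of the 3-5x ratio cited above
FLEET_SIZE = 10_000         # assumed number of units in the fleet

def annual_cost_usd(watts_per_unit, units):
    # watts -> kW, times hours in a year, times price, times unit count
    return watts_per_unit / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH * units

print(f"CPU fleet: ${annual_cost_usd(CPU_WATTS, FLEET_SIZE):,.0f} per year")
print(f"GPU fleet: ${annual_cost_usd(GPU_WATTS, FLEET_SIZE):,.0f} per year")
```

Under these assumptions the GPU fleet's power bill lands around $8.8m a year versus $2.2m for the CPU fleet, before cooling overhead is even counted; that multi-million-dollar gap per fleet is the cost problem the article describes.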

I bet that in less than 48 hours, $NVDA will release another set of "blockbuster" results, because what started as "fake it till you make it" has, after everything we have seen so far, pretty obviously become "fake it till you can".