It’s scary when the best chip company in the world rolls out new products.
It’s scary because competitors can’t keep up and fall further behind.
It’s scary because that level of technology sets off a new wave of expertise at other companies, on both the software and hardware sides.
These new products are almost always faster, more efficient, and better than their predecessors, catalyzing a snowball effect that lifts revenue across the industry.
This kind of outstanding performance is what made Nvidia (NVDA) the world’s most valuable chipmaker, and the company has announced an update to its H100 artificial intelligence processor, adding more capabilities to a product that has fueled its dominance in the AI computing market.
The new model, called the H200, will use HBM3e high-bandwidth memory, allowing it to better cope with the large data sets needed for developing and deploying AI.
Amazon’s AWS, Alphabet’s Google (GOOGL) Cloud and Oracle’s (ORCL) Cloud Infrastructure have all committed to using the new chip starting next year.
Winning orders is easy with Nvidia’s outsized brand recognition and the game-changing product on offer.
The current version of the Nvidia processor is already experiencing accelerated demand.
But the product is facing stiffer competition: Advanced Micro Devices (AMD) is bringing its rival MI300 chip to market in the fourth quarter, and Intel Corp. claims that its Gaudi 2 model is faster than the H100.
AMD is another chip company readers should feel comfortable diversifying into if they don’t want to put all their eggs in the Nvidia basket.
AMD’s stock is surging toward its old highs around $125 and, after a strong rally in the second half of the year, should overtake that level soon.
With the new product, Nvidia is trying to keep up with the size of data sets used to create AI models and services.
Adding the enhanced memory will make the H200 much faster at feeding data to AI software.
Large computer makers and cloud service providers are expected to start using the H200 in the second quarter of 2024.
Nvidia got its start making graphics cards for gamers, but its powerful processors have now won a following among data center operators.
That division has gone from being a side business to the company’s biggest moneymaker in less than five years.
Nvidia’s graphics chips helped pioneer an approach called parallel computing, where a massive number of relatively simple calculations are handled at the same time.
That has allowed it to win major orders from data center companies at the expense of traditional processors supplied by Intel.
The growth helped turn Nvidia into the poster child for AI computing earlier this year — and sent its market valuation soaring.
Nvidia is like a freight train that has left the station.
The stock is up 9 straight days as we cruise into its earnings report on November 21st.
It’s hard to see this earnings report being anything short of spectacular, and Nvidia has become famous for forecasting the unthinkable.
It then goes on to surpass that high bar and push the envelope further, so it’s not a bad idea to buy NVDA before the earnings report.
The speed at which Nvidia rolls out products is astounding, and being able to boast the best server chip in enterprise tech adds yet another powerful weapon to its arsenal.
$600 per share is a no-brainer for Nvidia, and that level will be surpassed in 2024.