Although there is no sign of a slowdown in the artificial intelligence (AI) boom, AI stocks have largely been hammered in 2025 due to the Trump administration's tariffs and policies.
AI stocks had, of course, risen considerably over the prior two years, so they entered the year at relatively high valuations. And the threat of an economic recession has legitimately called into question the massive AI investments planned by the major technology companies earlier this year.
Fortunately, on Wednesday, April 9, the administration walked back the most extreme reciprocal tariffs. Last week, several Big Tech CEOs also reiterated their huge AI investment plans for this year while indicating that demand for AI remains incredibly strong.
But along with those bullish comments came some remarks that pose risks for the biggest AI company of them all: Nvidia (Nasdaq: NVDA).
First, the good news. Even amid the enormous market turmoil, the AI revolution still appears to be booming. In fact, two "Magnificent Seven" CEOs confirmed as much this week, even as markets plunged.
First, Alphabet (Nasdaq: GOOG)(Nasdaq: GOOGL) held its Google Cloud Next 2025 event last week, which probably flew under the radar of many investors. Not only were there a number of exciting announcements at the event, particularly regarding Google's new Gemini 2.5 model, but CEO Sundar Pichai also confirmed Alphabet's previously announced plans to spend $75 billion on AI data centers this year. Pichai added that the spending is generating good returns, saying: "The opportunity with AI is as big as it gets."
And it wasn't only Alphabet's executives speaking optimistically about its AI and Google Cloud offerings, but Google Cloud's customers as well. At the event, customer Intuit confirmed it is "doubling down" on its AI efforts, while another large customer, Verizon, described enormous benefits from using Google's AI models.
Meanwhile, it wasn't only the tariff fallout but also the January introduction of the low-cost Chinese model DeepSeek R1 that rattled AI stocks. But on Thursday, Amazon (Nasdaq: AMZN) CEO Andy Jassy sought to put concerns about the need for all this spending to rest in an interview on CNBC. He noted:
People are confused. And we've also seen it with AWS (Amazon Web Services), which is that when you take the cost per unit of something down, it saves customers money on what they're doing, but they don't actually spend less. It actually unleashes them to do much more innovation, and in absolute terms, they spend more.
In his letter to shareholders, also published Thursday, Jassy noted: "Generative AI is going to reinvent virtually every customer experience we know and enable altogether new ones we've only fantasized about." Jassy also reiterated that Amazon is seeing triple-digit growth rates in its AI revenue.
So while there is certainly cause for concern at the macro level, technology insiders still believe generative AI will transform the world, and none of them wants to be left behind. That likely means AI spending will remain resilient, regardless of the economy.
Although the AI revolution remains intact, the dynamics are certainly shifting, especially around AI costs. Those concerns would only be amplified in a recession.
That could make things incrementally more difficult for Nvidia (Nasdaq: NVDA). So far, Nvidia has been synonymous with the AI buildout, and demand for its new Blackwell chip appears incredibly strong.
However, in addition to making bullish comments on AI, Amazon and Google both noted their enormous efforts to bring down the cost of AI. Jassy, in particular, didn't mince words when he said that AI costs had to come down:
AI does not have to be as expensive as it is today, and it won't be in the future. Chips are the biggest culprit. Most AI to date has been built on one chip provider. It's expensive.
It's no secret whom Jassy is talking about: Nvidia. Nvidia's chips can run between $30,000 and $40,000 apiece. So when you hear about 100,000-GPU or even million-GPU clusters, that's why AI buildouts cost so much. And with Nvidia making gross margins of 75% or even more, you could argue Nvidia may be overearning today, if, that is, someone else could produce a competitive chip at a lower cost.
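To put rough numbers on why those clusters are so expensive, here is a back-of-the-envelope sketch. The $30,000 to $40,000 per-chip range and the hypothetical 100,000- and one-million-GPU cluster sizes come from the paragraph above; everything else is illustrative, not actual pricing.

```python
# Back-of-the-envelope cost of GPU clusters at the per-chip prices cited above.
# Illustrative only; real deals involve discounts, networking, power, and more.
chip_price_low, chip_price_high = 30_000, 40_000  # USD per chip, range cited above

for gpus in (100_000, 1_000_000):  # hypothetical cluster sizes mentioned above
    low = gpus * chip_price_low
    high = gpus * chip_price_high
    print(f"{gpus:>9,} GPUs: ${low / 1e9:.0f}B to ${high / 1e9:.0f}B in chips alone")

# 100,000 GPUs: $3B to $4B in chips alone
# 1,000,000 GPUs: $30B to $40B in chips alone
```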
While Nvidia and its CUDA software currently rule the day, all the well-funded cloud giants are certainly trying to change that. Jassy went on to say that Amazon's current-generation Trainium 2 chip offers 30% to 40% better price-performance than Nvidia's current instances, which probably means the H100. And while Nvidia works on ramping up its new Blackwell chip, Amazon is also at work on Trainium 3.
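To make that price-performance claim concrete, here is a minimal sketch. Only the 30% to 40% range comes from Jassy's remarks; the normalized baseline and the interpretation of the figure as "more work per dollar" are assumptions for illustration.

```python
# Illustration of what "30% to 40% better price-performance" could mean,
# interpreted as 30% to 40% more work per dollar versus a baseline instance.
baseline_work_per_dollar = 1.0  # arbitrary normalized H100-instance baseline

for improvement in (0.30, 0.40):  # range cited above for Trainium 2
    trainium_work_per_dollar = baseline_work_per_dollar * (1 + improvement)
    cost_saving = 1 - 1 / (1 + improvement)
    print(f"+{improvement:.0%}: {trainium_work_per_dollar:.2f}x work per dollar, "
          f"or ~{cost_saving:.0%} lower cost for the same workload")

# +30%: 1.30x work per dollar, or ~23% lower cost for the same workload
# +40%: 1.40x work per dollar, or ~29% lower cost for the same workload
```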
In the interview, Jassy noted: "If you sit in meetings with the AWS team right now, they feel like it's their responsibility and their mission to make the cost of AI meaningfully lower than it is today."
Meanwhile, at the Alphabet event, management unveiled what appears to be an incredibly powerful in-house chip called Ironwood. Ironwood is the seventh generation of the company's tensor processing unit (TPU), which Alphabet uses for its own internal AI workloads.
The new chip is designed to run in servers of 256 chips or in massive clusters of 9,216 chips, which Google plans to use not only for its Gemini models but also for cloud customers who want to train their own models. Each Ironwood chip carries six times the memory of the previous-generation TPU. And the performance is massive: each Ironwood chip offers peak inference throughput of 4,614 teraflops, roughly 10 times faster than the fifth-generation TPU and 5 times faster than the sixth generation.
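As a rough illustration of what those figures imply at pod scale, here is a simple scaling calculation. It assumes the 4,614-teraflop figure is per chip and that compute adds up near-linearly across the 9,216-chip cluster, which ignores interconnect and utilization losses.

```python
# Rough aggregate compute of the full Ironwood cluster from the per-chip figure above.
chips_per_cluster = 9_216
teraflops_per_chip = 4_614

total_teraflops = chips_per_cluster * teraflops_per_chip
exaflops = total_teraflops / 1_000_000  # 1 exaflop = 1,000,000 teraflops

print(f"~{exaflops:.1f} exaflops across the full cluster")  # ~42.5 exaflops
```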
Without a doubt, Nvidia is a multi-year step ahead in AI chipmaking, and its CUDA software acts as something of a moat, at least for now. However, Amazon and Google are massively powerful companies that can also produce their own chips at cost, while Nvidia currently makes a gross margin of around 75%. That means Nvidia's chips cost the cloud companies roughly four times what they cost to produce.
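For context on that multiple, here is the simple margin arithmetic behind it, treating the 75% figure as the round number cited above rather than a reported exact margin.

```python
# Markup implied by a given gross margin: price = cost / (1 - margin).
def markup_from_gross_margin(margin: float) -> float:
    """Return the selling price as a multiple of cost for a given gross margin."""
    return 1 / (1 - margin)

print(markup_from_gross_margin(0.75))  # 4.0 -> price is ~4x the cost to produce
```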
Given both companies' missions to cut AI costs and to cut out Nvidia as the middleman, so to speak, Nvidia's revenue growth could eventually slow, or its margins could come down, or both. But that will only happen if Amazon, Google, and the other cloud giants succeed in designing and deploying their own silicon and making it easy for AI developers to use.
Before buying stock in Nvidia, consider this:
The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now... and Nvidia wasn't one of them. The 10 stocks that made the cut could produce monster returns in the coming years.
Consider when Netflix made this list on December 17, 2004... if you had invested $1,000 at the time of our recommendation, you'd have $495,226!* Or when Nvidia made this list on April 15, 2005... if you had invested $1,000 at the time of our recommendation, you'd have $679,900!*
Now, it's worth noting Stock Advisor's total average return is 796%, a market-crushing outperformance compared to 155% for the S&P 500. Don't miss out on the latest top 10 list, available when you join Stock Advisor.
See the 10 stocks »
*Stock Advisor returns as of April 5, 2025
John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool's board of directors. Billy Duberstein and/or his clients have positions in Alphabet and Amazon. The Motley Fool has positions in and recommends Alphabet, Amazon, Intuit, and Nvidia. The Motley Fool recommends Verizon Communications. The Motley Fool has a disclosure policy.
"Don't Worry, AI Investors, the Artificial Intelligence Boom Is Still On, but There Are Growing Dangers for Nvidia" was originally published by The Motley Fool.