Although the artificial intelligence (AI) boom shows no sign of slowing down, AI stocks were hammered in early 2025 due to the Trump administration's tariffs and trade policies.
AI stocks have, of course, risen considerably over the past two years, so they entered the year at relatively high valuations. And the threat of an economic recession legitimately called into question the massive AI investments planned by the major technology companies earlier this year.
Fortunately, on Wednesday, April 9, the administration walked back the most extreme reciprocal tariffs. Last week, several Big Tech CEOs also reiterated their huge AI investment plans for this year while indicating that demand for AI remains incredibly strong.
But alongside these bullish comments, some remarks pose risks for the biggest AI company of them all: Nvidia (NVDA 2.91%).
AI appears to be recession-proof
First, the good news. Even amid the enormous market turmoil, the AI revolution still appears to be booming. In fact, two "Magnificent Seven" CEOs confirmed as much this week, even as markets plunged.
First, Alphabet (GOOG 2.56%) (GOOGL 2.79%) held its Google Cloud Next 2025 event last week, which probably flew under the radar of many investors. The event not only featured a number of exciting announcements, particularly around Google's new Gemini 2.5 model, but CEO Sundar Pichai also confirmed Alphabet's previously announced plans to spend $75 billion on AI data centers this year. Pichai added that the spending is generating good returns, saying: "The opportunity with AI is as big as it gets."
And it wasn't only Alphabet executives speaking optimistically about their AI and Google Cloud offerings; Google Cloud customers did too. At the event, customer Intuit confirmed that it is "doubling down" on its AI efforts, while another large customer, Verizon, described enormous benefits from using Google's AI models.
Meanwhile, it wasn't only the tariff fallout but also the January introduction of the low-cost Chinese model DeepSeek R1 that roiled AI stocks. But on Thursday, Amazon (AMZN 2.01%) CEO Andy Jassy put concerns about whether all this spending is necessary to rest in an interview on CNBC. He noted:
People are confused. We saw this with AWS (Amazon Web Services) too, which is that customers like it when you lower the cost per unit of something. It saves them money on what they're doing, but they don't actually spend less. It actually unleashes them to do a lot more innovation, and in absolute terms, they spend more.
In his letter to shareholders, also published Thursday, Jassy noted that generative AI "will reinvent virtually every customer experience we know, and enable altogether new ones we've only fantasized about." Jassy also reiterated that Amazon is seeing triple-digit growth rates in AI revenue.
So while there is certainly cause for concern at the macro level, technology insiders still believe that generative AI will transform the world, and none of them want to be left behind. That likely means AI spending will remain resilient, regardless of the economy.
Two cloud giants commit to lowering AI costs, taking aim at Nvidia
Although the AI revolution remains intact, dynamics are certainly shifting, especially around AI costs. Those concerns would only be amplified in a recession.
That could gradually make things more difficult for Nvidia (NVDA 2.91%). So far, Nvidia has been synonymous with the AI buildout, and demand for its new Blackwell chip appears incredibly strong.
However, in addition to making bullish comments on AI, Amazon and Google highlighted their enormous efforts to bring down the cost of AI. Jassy, in particular, did not mince words when he noted that AI costs need to come down:
AI does not have to be as expensive as it is today, and it won't be in the future. Chips are the biggest culprit. Most of AI to date has been built on one chip provider. It's expensive.
It's no secret whom Jassy is talking about: Nvidia. Nvidia's chips can run between $30,000 and $40,000 apiece. So when you hear about 100,000-GPU or even million-GPU clusters, that's why AI buildouts cost so much. And with Nvidia earning gross margins of 75% or even higher, you could argue that Nvidia's chips are overpriced today, that is, of course, if someone else could produce a more cost-competitive chip.
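To put those figures in rough context, here is a minimal back-of-envelope sketch. The $35,000 price is simply the midpoint of the $30,000 to $40,000 range cited above, the 100,000-GPU cluster size and the 75% gross margin are the figures mentioned in this article, and none of these are reported company data; the numbers are illustrative only.

```python
# Back-of-envelope illustration using the assumed figures described above
# (illustrative, not company-reported data).

chip_price = 35_000      # assumed midpoint of the $30,000-$40,000 range cited above
cluster_size = 100_000   # a "hundred-thousand GPU" cluster
gross_margin = 0.75      # the roughly 75% gross margin cited above

# Chips alone for a cluster of this size
cluster_chip_cost = chip_price * cluster_size
print(f"Chips alone for a 100,000-GPU cluster: ${cluster_chip_cost / 1e9:.1f} billion")

# At a 75% gross margin, cost of goods is 25% of the selling price,
# so buyers pay roughly 1 / 0.25 = 4x the production cost per chip.
production_cost = chip_price * (1 - gross_margin)
markup = chip_price / production_cost
print(f"Estimated production cost per chip: ${production_cost:,.0f}; markup: ~{markup:.0f}x")
```

Under those assumptions, a 100,000-GPU cluster runs to roughly $3.5 billion in chips alone, which is the kind of math driving the cloud giants' push to cut AI costs.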
While Nvidia and its CUDA software currently rule the day, all of the well-funded cloud giants are certainly trying to change that. Jassy went on to say that Amazon's current-generation Trainium 2 chip offers 30% to 40% better price performance than Nvidia's current instances, which likely means the H100. And while Nvidia ramps up its new Blackwell chip, Amazon is also at work on Trainium 3.
In the interview, Jassy noted: "If you sit in meetings with the AWS team right now, they feel it is their responsibility and their mission to make the cost of AI meaningfully less than it is today."
Meanwhile, at the Alphabet event, management unveiled what appears to be an incredibly powerful in-house chip called Ironwood. Ironwood is the seventh generation of the company's Tensor Processing Unit (TPU), which Alphabet uses for its own internal AI workloads.
The new chip is designed to run in servers of 256 chips or in massive clusters of 9,216 chips, which Google plans to use not only for its own Gemini models but also for cloud customers that want to train their own models. Each Ironwood chip packs six times the memory of the previous-generation TPU. And the performance is massive: each Ironwood chip delivers peak inference throughput of 4,614 teraflops, 10 times faster than the fifth-generation TPU and 5 times faster than the sixth generation.
Nvidia is about to face serious competition
Without a doubt, Nvidia has a multi-year head start in making AI chips, and its CUDA software acts as something of a moat, at least for the moment. However, Amazon and Google are massively powerful companies that can also produce chips at cost, whereas Nvidia currently earns a gross margin of around 75%. That means Nvidia's chips cost the cloud companies roughly four times what they cost to produce, since a 75% gross margin implies the cost of goods is only a quarter of the selling price.
Given both companies' missions to reduce AI costs and cut out Nvidia as a middleman, so to speak, Nvidia's revenue growth could eventually slow, or its margins could decline, or both. But that will happen only if Amazon, Google, and the other cloud giants succeed in designing and deploying their own silicon and making it easy for AI developers to use.