The adage goes: “Your margin is my opportunity,” and it could summarize Meta’s push to build an in-house chip for AI training tasks. Reuters reports that the company recently began a small deployment of the chips after successfully producing a test batch with Taiwan’s TSMC (sorry, Intel). Meta already uses its own chips for inference, that is, tailoring content to specific users after an AI model has already been developed and trained. It wants to use them for training models by 2026.
From the article:
The push to develop in-house chips is part of a long-term plan at Meta to reduce its enormous infrastructure costs as the company places expensive bets on AI tools to drive growth.
Meta, which also owns Instagram and WhatsApp, has forecast total 2025 expenses of $114 billion to $119 billion, including up to $65 billion in capital expenditures driven mainly by AI infrastructure spending.
One of the sources said that Meta’s new training chip is a dedicated accelerator, meaning it is designed to handle only specific AI tasks. This can make it more power-efficient than the general-purpose graphics processing units (GPUs) typically used for AI workloads.
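The inference-versus-training distinction that underpins Meta’s plans can be illustrated with a toy example. This is a minimal, framework-free sketch using a made-up one-parameter linear model, not anything resembling Meta’s actual workloads: inference is a forward pass only, while training adds gradient computation and weight updates, which is the extra work a training accelerator must handle.

```python
# Toy illustration of inference vs. training for a linear model y = w * x.
# All names here are illustrative; real AI workloads use large neural nets.

def predict(w, x):
    """Inference: a forward pass only. No gradients, no weight updates."""
    return w * x

def train_step(w, x, y_true, lr=0.1):
    """Training: forward pass, gradient of the squared error, weight update.
    This backward/update work is what a training chip must also support."""
    y_pred = w * x                     # forward pass
    grad = 2 * (y_pred - y_true) * x   # d/dw of (y_pred - y_true)**2
    return w - lr * grad               # gradient-descent update

w = 0.0
for _ in range(50):
    w = train_step(w, x=1.0, y_true=3.0)  # learn w ≈ 3.0

print(round(predict(w, 2.0), 2))  # inference with the trained weight: 6.0
```

The point of the sketch: serving recommendations (inference) repeats only `predict`, while building a model (training) loops over `train_step`, a heavier and structurally different workload.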
Even if consumer applications of generative AI, such as chatbots, turn out to be a burst bubble, Meta can deploy the technology to improve content recommendations and ad targeting. The vast majority of Meta’s revenue comes from advertising, and even small improvements in targeting capabilities can produce billions in new revenue as advertisers see better results.
Despite some flops and the lackluster results of its Reality Labs division, Meta has managed to build strong hardware teams over the years and has found success with its Ray-Ban AI glasses. However, executives have warned teams internally that their hardware efforts have not yet had the world-changing impact they hope for. Meta’s VR headsets sell in the millions each year. CEO Mark Zuckerberg has long sought to develop his own hardware platforms so the company can reduce its dependence on Apple and Google.
Large tech companies have paid billions of dollars to Nvidia since 2022 to stock up on its highly sought-after GPUs, which have become the industry standard for AI processing. Although the company has competitors, such as AMD, Nvidia has been praised for offering not only the chips themselves but also the CUDA software toolkit for developing AI applications.
Late last year, Nvidia indicated that almost 50 percent of its revenue in one quarter came from just four companies. All of those companies have sought to build their own chips so they can cut out the middleman and reduce costs, and they can afford to wait many years for a return. Investors will only tolerate heavy spending for so long before demanding that Meta show it is paying off. Amazon has its own Inferentia chips, while Google has been developing Tensor Processing Units (TPUs) for years.
Nvidia’s revenue concentration among a few customers who are building their own processors, along with the rise of efficient AI models like China’s DeepSeek, has raised some concerns about whether Nvidia can sustain its growth indefinitely, although CEO Jensen Huang has said he is optimistic that data center providers will spend $1 trillion over the next five years building out infrastructure, which could see his business continue to grow into the 2030s. And, of course, most companies will not be able to develop their own chips.