The pace of change in server technology is accelerating, driven by the hyperscalers but also spreading into the on-premises world. According to experts, several global trends stand out, in particular:
- AI everything: AI mania is everywhere, and without high-powered hardware to run it, it's just vapor. But it's more than a buzzword; it's a very real and measurable trend. AI servers stand out because they are outfitted with high-end processors, GPU accelerators, and often a SmartNIC network controller. All of the major players – Nvidia, Supermicro, Google, Asus, Dell, Intel, HPE – as well as smaller vendors offer purpose-built AI hardware, according to a recent Network World article.
- AI edge server growth: There is also a trend toward deploying AI edge servers. The global AI edge server market is expected to be worth about $26.6 billion by 2034, up from $2.7 billion in 2024, according to a market report. Considerable amounts of data are collected at the edge, and edge servers do the work of weeding out unnecessary data and sending back only what the data center needs for processing (see the sketch after this list). The market is growing quickly as industries such as manufacturing, automotive, healthcare, and retail increasingly deploy IoT devices and require immediate data processing for decision-making and operational efficiency, according to the report.
- Liquid cooling gains ground: Liquid cooling is making its way from the fringes into the mainstream of data center infrastructure. What was once a niche add-on is becoming a standard feature, explains Jeffrey Hewitt, vice president and analyst at Gartner. "Server vendors are working to develop in-chassis plumbing for direct-to-chip cooling in order to support the next generation of AI and GPU processors, which will produce high amounts of heat in their servers," he said.
- New data center layouts: This is not so much a server trend as a data center trend, but data center layouts are changing to accommodate AI server hardware. AI hardware is extremely dense and runs much hotter than typical server systems. Data center operators of all kinds that deploy AI hardware must pay attention to where they place it, explains Naveen Chhabra, principal analyst at Forrester Research.
"You have to identify the areas where you can put that kind of power," he said. "You can't just concentrate power in one particular area of the data center and say this is where I'm going to run all my AI applications. That may not be the most pragmatic architecture."
- Virtualization alternatives gain traction: Broadcom's handling of the VMware acquisition has soured many potential customers, and they are looking elsewhere, explains Hewitt. "I would say some server OEMs have moved to support additional server virtualization options since Broadcom's acquisition of VMware. This latest trend is about supporting other virtualization choices if their customers are looking for them," he said.
- InfiniBand begins to fade: InfiniBand will start to fade as an option for high-speed interconnects in favor of Ethernet, said Chhabra. "The way Ethernet is evolving, the expectation is that in just two to three years it will be able to handle high-speed interconnects. Organizations won't want to maintain two different connectivity stacks when one could do the job," he said.
- Component shortages push people to the cloud: Chhabra says the current shortage of components and data center equipment could push people to the cloud rather than on-premises. "I can tell you that if you want, say, 20 server units with Nvidia GPUs, you will wait at least a year, a year and a half, for them to be shipped to your door. And that forces companies to think: in the interim, can I get this somewhere else? And people are exploring all of those options," he said.
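To make the edge-filtering idea in the AI edge server trend concrete, here is a minimal, hypothetical sketch of the pattern described there: raw IoT readings are screened at the edge and only the data worth processing is forwarded to the data center. Every name in it (SensorReading, is_anomalous, filter_at_edge) is invented for illustration and does not come from any vendor's product or API.

```python
# Hypothetical illustration of edge-side filtering: keep only the readings
# worth sending upstream, and discard routine data locally.
from dataclasses import dataclass


@dataclass
class SensorReading:
    device_id: str
    temperature_c: float


def is_anomalous(reading: SensorReading, threshold_c: float = 80.0) -> bool:
    """Flag readings that exceed a simple temperature threshold."""
    return reading.temperature_c > threshold_c


def filter_at_edge(readings: list[SensorReading]) -> list[SensorReading]:
    """Drop unnecessary data at the edge; return only what the data center needs."""
    return [r for r in readings if is_anomalous(r)]


if __name__ == "__main__":
    raw = [
        SensorReading("press-01", 72.4),
        SensorReading("press-02", 91.3),  # anomaly worth forwarding
        SensorReading("press-03", 68.0),
    ]
    to_forward = filter_at_edge(raw)
    print(f"Forwarding {len(to_forward)} of {len(raw)} readings to the data center")
```

In a real deployment the filtering step would typically be an AI model running on the edge server's accelerators rather than a fixed threshold, but the data-flow shape is the same: screen locally, ship only the useful remainder to the data center.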