By Kenrick Cai and Krystal Hu
SAN FRANCISCO (Reuters) - Alphabet and Nvidia have joined prominent venture capital investors in backing Safe Superintelligence (SSI), a startup co-founded by former OpenAI chief scientist Ilya Sutskever that has quickly risen to become one of the most valuable artificial intelligence startups since its launch.
The funding illustrates renewed interest from major technology and infrastructure providers in making strategic investments in startups developing cutting-edge AI, which requires massive amounts of computing power. Alphabet, which develops its own AI models, earlier in the week announced an agreement for its cloud computing arm to sell SSI access to tensor processing units (TPUs), its in-house AI chips.
SSI, which sources say was recently valued at $32 billion in a round led by Greenoaks, is one of the most prominent startups working on AI model research, thanks to Sutskever's stellar track record of predicting the next big thing in AI development.
Like many of its competitors, it has a huge demand for chips.
Reuters could not determine the exact terms of Alphabet's and Nvidia's investments in SSI. Spokespeople for the three companies declined to comment.
The twin moves by Alphabet's corporate and cloud divisions with high-profile AI labs, in particular SSI and Anthropic, show the tech giant's evolving hardware strategy.
Google initially reserved TPUs for internal use. The agreement to sell SSI large quantities of chips to support its frontier AI research illustrates the company's ongoing strategy of expanding sales to external customers, Darren Mowry, a managing director in charge of Google's partnerships with startups, said in an interview with Reuters this week.
"With these foundation model builders, the gravity is increasing dramatically for us," he said.
AI developers have historically preferred Nvidia's graphics processing units, which hold more than 80% of the AI chip market.
But SSI has so far primarily used TPUs rather than GPUs for its AI research and development, two sources said.
Google offers both Nvidia GPUs and its own TPUs through its cloud service. Its chips are designed to excel at specific AI tasks and can be more efficient than general-purpose GPUs. They have been used to build large-scale AI models by companies such as Apple and Anthropic, an OpenAI competitor that has received billions of dollars in funding from Google and Amazon.
Google and Nvidia also face a challenger in Amazon, which is building its own competing processors called Trainium and Inferentia. Amazon said in 2023 that Anthropic would develop its technology on those chips, and the tech giant announced in December that Anthropic would be the first customer to use a massive supercomputer powered by hundreds of thousands of its own chips.