It would not have taken a billion-parameter large language model (LLM) to predict that the dominant theme of this year's Google Cloud conference would be generative AI. Indeed, it will probably be the dominant theme of the year for most business software developers.
During the event, Google introduced a multitude of updates to its cloud platform to ease working with LLMs and to add generative AI assistants to many of its offerings. Here are six key takeaways from the conference:
Recognizing that AI workloads differ from other workloads, Google presented a range of updates to its cloud infrastructure to support them and to help companies optimize cloud spending. First up: Google made the latest iteration of its proprietary accelerator for AI workloads, the Tensor Processing Unit (TPU) v5p, generally available in its cloud. TPU pods now have support in Google Kubernetes Engine (GKE), along with multi-host serving on GKE.
In addition, as part of an extended partnership with Nvidia, Google is also bringing the A3 Mega virtual machine (VM) to its cloud, powered by Nvidia H100 GPUs.
Other updates include a multitude of optimizations, notably caching, across its storage products. These improvements ship alongside a new resource-management and job-scheduling service for AI workloads, named Dynamic Workload Scheduler.
Pair programming with Google's AI coding tool will no longer be a duet, however. Google has renamed its previously released Duet AI for Developers to Gemini Code Assist, to match the branding of its latest LLM.
Gemini Code Assist gets new features to go with its new name. Based on the Gemini 1.5 Pro model, it provides code completion, code generation, and AI-powered chat. It works in the Google Cloud console and plugs into popular code editors such as Visual Studio Code and JetBrains IDEs, while also supporting a company's codebase wherever it lives: on-premises, in GitHub, GitLab, Bitbucket, or across multiple repositories.
The new improvements and features added to Gemini Code Assist include full codebase awareness, code customization, and enhancements to the tool's partner ecosystem that increase its effectiveness.
To increase the efficiency of code generation, the company is expanding the tool's partner ecosystem by adding partners such as Datadog, Datastax, Elastic, HashiCorp, Neo4j, Pinecone, Redis, SingleStore, Snyk, and Stack Overflow.
For cloud management, the service provider introduced Gemini Cloud Assist, an AI assistant designed to help enterprise teams manage applications and networks on Google Cloud.
Gemini Cloud Assist is accessible via a chat interface in the Google Cloud console and is powered by Google's proprietary large language model, Gemini.
Companies can also use Gemini Cloud Assist to prioritize cost savings, performance, or high availability. Based on natural-language input from any enterprise team, Gemini Cloud Assist identifies areas for improvement and suggests how to achieve those objectives. It can also be integrated directly into the interfaces where enterprise teams manage their various cloud products and workloads.
Beyond managing application life cycles, companies can use Gemini Cloud Assist for AI-based help with a variety of networking tasks, including design, operations, and optimization.
The Gemini-based assistant has also been added to Google Cloud's security operations offerings. It can provide identity and access management (IAM) recommendations and key insights, including insights into confidential computing, which help reduce risk exposure.
To compete with similar offerings from Microsoft and AWS, Google Cloud released a new generative AI tool for creating chatbots, Vertex AI Agent Builder. It is a no-code tool that combines Vertex AI Search and the company's conversational AI product portfolio, and it provides a range of tools for building virtual agents backed by Google's Gemini LLMs.
Its big selling point is its ready-to-use retrieval-augmented generation (RAG) system, Vertex AI Search, which can ground agents faster than traditional RAG techniques. Its built-in RAG APIs can help developers quickly perform grounding checks on model responses.
In addition, developers have the option of grounding model outputs in Google Search to further improve the quality of responses.
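The grounding workflow described above follows the general RAG pattern: retrieve supporting documents, anchor the prompt to them, then check that the answer is actually supported by the retrieved context. A minimal, library-free sketch of that pattern (all names here are illustrative; this is not the Vertex AI Agent Builder API):

```python
# Toy RAG sketch: keyword-overlap retrieval, a grounded prompt, and a
# naive grounding check. Illustrative only, not Vertex AI's actual API.

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(corpus,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Anchor the model to the retrieved context."""
    context = "\n".join(f"- {d}" for d in docs)
    return f"Answer using ONLY this context:\n{context}\n\nQuestion: {query}"

def grounding_score(answer: str, docs: list[str]) -> float:
    """Fraction of answer words that appear in the retrieved context."""
    doc_words = set(" ".join(docs).lower().split())
    a_words = answer.lower().split()
    return sum(w in doc_words for w in a_words) / max(len(a_words), 1)

corpus = [
    "TPU v5p is generally available on Google Cloud.",
    "A3 Mega VMs are powered by Nvidia H100 GPUs.",
    "Cloud Storage adds caching optimizations.",
]
docs = retrieve("Which GPUs power A3 Mega VMs?", corpus)
prompt = build_prompt("Which GPUs power A3 Mega VMs?", docs)
score = grounding_score("A3 Mega VMs are powered by Nvidia H100 GPUs.", docs)
```

A production RAG system replaces the keyword overlap with vector similarity search and the word-overlap score with a model-based entailment check, but the shape of the pipeline is the same.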
Other changes to Vertex AI include updates to existing LLMs and expanded MLOps capabilities.
The LLM updates include a public preview of the Gemini 1.5 Pro model, which supports a context window of 1 million tokens. In addition, Gemini 1.5 Pro in Vertex AI will also be able to process audio streams, including speech and the audio tracks of videos.
The cloud service provider also updated its Imagen 2 family of models with new features, including photo-editing capabilities and the ability to create 4-second videos, or "live images," from text prompts. Other LLM updates include the addition of CodeGemma, a new lightweight model from its proprietary Gemma family.
Updates to MLOps tools include the addition of Vertex AI Prompt Management, which helps enterprise teams experiment with prompts, migrate prompts, and track prompts along with their parameters. Other expanded capabilities include tools such as Rapid Evaluation, for checking model performance while iterating on prompt design.
Google Cloud has added capabilities driven by its proprietary large language model, Gemini, to its database offerings, which include Bigtable, Spanner, Memorystore for Redis, Firestore, Cloud SQL for MySQL, and AlloyDB for PostgreSQL.
The Gemini-centered capabilities include SQL generation, and AI assistance in managing and migrating databases.
To help enterprises better manage their databases, the cloud service provider added a new feature called Database Center, which will allow operators to manage an entire fleet of databases from a single interface.
Google also extended Gemini to its Database Migration Service, which previously had support for Duet AI.
Gemini's improved features will enhance the service, the company said, adding that Gemini can help convert database-resident code, such as stored procedures and functions, to the PostgreSQL dialect.
In addition, Gemini-powered database migration also focuses on explaining the code translation with a side-by-side comparison of dialects, along with detailed explanations of the code and recommendations.
As part of these updates, the cloud service provider added new generative AI features to AlloyDB AI. These include the ability for AI-based applications to query data with natural language, and a new type of database view.
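Natural-language querying means translating a user's question into SQL over a known schema before executing it. A self-contained sketch of that flow, using SQLite and a hypothetical `nl_to_sql()` stand-in for the translation step (in AlloyDB AI an LLM performs that step; this is illustrative only):

```python
# Toy sketch of natural-language querying over a database: a hypothetical
# nl_to_sql() maps a question to SQL over a known schema. In AlloyDB AI an
# LLM does this translation; a rule table stands in here for illustration.
import sqlite3

SCHEMA = "CREATE TABLE orders (id INTEGER, region TEXT, total REAL)"

def nl_to_sql(question: str) -> str:
    """Stand-in for the LLM translation step (hypothetical rules)."""
    rules = {
        "total sales by region":
            "SELECT region, SUM(total) FROM orders "
            "GROUP BY region ORDER BY region",
        "how many orders": "SELECT COUNT(*) FROM orders",
    }
    for phrase, sql in rules.items():
        if phrase in question.lower():
            return sql
    raise ValueError("question not understood")

conn = sqlite3.connect(":memory:")
conn.execute(SCHEMA)
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "EMEA", 100.0), (2, "AMER", 250.0), (3, "EMEA", 50.0)])
rows = conn.execute(nl_to_sql("What are total sales by region?")).fetchall()
```

The hard parts a production system must add are schema awareness (so the model only references real tables and columns) and guardrails that restrict generated SQL to safe, read-only statements.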
At Google Cloud Next '24, Google unveiled three open-source projects for building and running generative AI models.
The newly revealed open-source projects are MaxDiffusion, JetStream, and Optimum-TPU.
The company also introduced new LLMs in MaxText, its project of JAX-built LLMs. The new models in MaxText include Gemma, GPT-3, Llama 2, and Mistral, which are supported on both Google Cloud TPUs and Nvidia GPUs.