Google’s Vertex AI machine learning platform gets generative AI tools
At Google’s annual I/O conference on Wednesday, the internet giant announced that it is adding new tools designed to help build generative AI capabilities to its machine learning operations platform, Vertex AI.
The generative AI tools added to Google Cloud’s Vertex AI include three new foundation models; so-called embeddings APIs for text and images; a tool for reinforcement learning from human feedback; and previously showcased tools such as the Generative AI Studio, Model Garden and the PaLM 2 large language model for text and chat.
(Separately, Google also announced Duet, a new generative AI engine for Google Cloud, designed to help developers code. See “Google’s Duet AI to take on Amazon CodeWhisperer, GitHub Copilot.”)
In March, Google Cloud announced a new service, dubbed Gen App Builder, to help enterprises build AI-powered chat and search applications by basing them on Google’s own foundation models.
At that time, the company also said it would add Model Garden and Generative AI Studio to Vertex AI. While Model Garden is a repository of foundation models from Google and its partners, Generative AI Studio is a low-code suite for tuning, deploying and monitoring foundation models.
Foundation models to help code, edit images
The new foundation models added to Vertex AI, named Codey, Imagen, and Chirp, will help with code generation, image editing, and building apps that let users converse in their native languages, the company said.
Codey, according to the company, can complete code, generate code from natural language prompts, and let developers chat with a bot for help with debugging code, creating documentation, learning new concepts, and other code-related queries.
Codey supports over 20 languages including Python, TypeScript, Java, Go and Google Standard SQL.
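Access is currently gated behind Google’s Trusted Tester program, but the general shape of a Codey call through the Vertex AI Python SDK looks roughly like the sketch below. This is a minimal illustration, not a definitive recipe: it assumes the preview language_models module and a code-bison model are enabled for the project, the project ID is a placeholder, and module paths, model names, and parameters could change by general availability.

```python
# Minimal sketch: generating code with Codey via the Vertex AI Python SDK.
# Assumes the preview language_models module and the "code-bison" model are
# available to your project (Trusted Tester access at the time of writing);
# names and parameters may differ in your SDK version.
import vertexai
from vertexai.preview.language_models import CodeGenerationModel

# Placeholder project ID and region -- replace with your own (assumptions).
vertexai.init(project="my-gcp-project", location="us-central1")

model = CodeGenerationModel.from_pretrained("code-bison@001")

# Describe the code you want in natural language.
response = model.predict(
    prefix="Write a Python function that checks whether a string is a palindrome.",
    temperature=0.2,          # lower values give more deterministic output
    max_output_tokens=256,    # cap the length of the generated code
)
print(response.text)
```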
Imagen, meanwhile, can be used to create and edit images via natural language prompts, the company said, adding that the foundation model can also be used to caption images.
Enterprises can upload images of their own products to create content such as marketing collateral, Google said, noting that the generated images can be iterated on as many times as needed.
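For teams experimenting with Imagen, the sketch below shows what prompt-driven image generation could look like through the Vertex AI Python SDK. It is an assumption-heavy illustration: the vision_models module, the imagegeneration model name, and the generate_images and save methods are based on the preview SDK and may not match what ships at general availability, and the project ID and prompt are placeholders.

```python
# Rough sketch of prompt-driven image generation with Imagen.
# Assumption: the preview vision_models module exposes an ImageGenerationModel
# with generate_images(); exact class, model, and method names may differ.
import vertexai
from vertexai.preview.vision_models import ImageGenerationModel

vertexai.init(project="my-gcp-project", location="us-central1")  # placeholders

model = ImageGenerationModel.from_pretrained("imagegeneration@002")

# Generate a single marketing-style image from a text prompt.
images = model.generate_images(
    prompt="A studio photo of a reusable water bottle on a white background",
    number_of_images=1,
)

# Save the first generated image to a local file.
images[0].save(location="water_bottle.png")
```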
The third new model, Chirp, is aimed at helping enterprises speak to their customers in their native languages, the company said. The model supports 100 languages over chat and voice.
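Chirp is a speech model, so it is surfaced through Google’s Speech-to-Text V2 API rather than a text-prompt interface. The sketch below is a rough illustration based on the V2 Python client, assuming the chirp model is available to the project in the us-central1 region; the endpoint, default recognizer path, field names, project ID, and audio file are assumptions that may differ for a given setup.

```python
# Sketch: transcribing audio with the Chirp model via Speech-to-Text V2.
# Assumptions: chirp is enabled for your project in us-central1, and the
# default recognizer ("_") with inline config is used; details may vary.
from google.api_core.client_options import ClientOptions
from google.cloud.speech_v2 import SpeechClient
from google.cloud.speech_v2.types import cloud_speech

PROJECT_ID = "my-gcp-project"  # placeholder

# Chirp is served from regional endpoints, so point the client at the region.
client = SpeechClient(
    client_options=ClientOptions(api_endpoint="us-central1-speech.googleapis.com")
)

config = cloud_speech.RecognitionConfig(
    auto_decoding_config=cloud_speech.AutoDetectDecodingConfig(),
    language_codes=["en-US"],
    model="chirp",
)

with open("customer_call.wav", "rb") as f:  # placeholder audio file
    audio_bytes = f.read()

request = cloud_speech.RecognizeRequest(
    recognizer=f"projects/{PROJECT_ID}/locations/us-central1/recognizers/_",
    config=config,
    content=audio_bytes,
)

response = client.recognize(request=request)
for result in response.results:
    print(result.alternatives[0].transcript)
```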
Embeddings APIs find new data relationships
As part of the updates to Vertex AI, the company is adding embeddings APIs for text and images to help enterprise developers convert text and images into numerical vectors to be processed by large language models.
The numerical vectors map semantic relationships within the data, making it easier for the models to process, the company said.
Developers can use these APIs to build question-and-answer chatbots grounded in an enterprise’s data, create text classifiers, and power semantic search, as well as to improve clustering, anomaly detection, and sentiment analysis.
In machine learning, clustering is the process of grouping unlabeled data.
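To make the idea concrete, the sketch below shows how a developer might use the text embeddings API for a tiny semantic search: embed a handful of documents and a query, then rank the documents by cosine similarity. It assumes the preview textembedding-gecko model is enabled for the project; the project ID, documents, and query are placeholders.

```python
# Minimal sketch: semantic search with the Vertex AI text embeddings API.
# Assumes the preview "textembedding-gecko" model is available; the project
# ID and sample texts below are placeholders.
import numpy as np
import vertexai
from vertexai.preview.language_models import TextEmbeddingModel

vertexai.init(project="my-gcp-project", location="us-central1")
model = TextEmbeddingModel.from_pretrained("textembedding-gecko@001")

docs = [
    "Our return policy allows refunds within 30 days of purchase.",
    "The quarterly earnings call is scheduled for Thursday.",
]
query = "How long do customers have to return a product?"

# Each input string is converted into a numerical vector (list of floats).
doc_vectors = [np.array(e.values) for e in model.get_embeddings(docs)]
query_vector = np.array(model.get_embeddings([query])[0].values)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: higher means the texts are semantically closer."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Rank documents by similarity to the query -- the core of semantic search.
scores = [cosine(query_vector, v) for v in doc_vectors]
best = int(np.argmax(scores))
print(f"Best match: {docs[best]!r} (score={scores[best]:.3f})")
```

The same vectors can feed clustering or anomaly detection, since texts with similar meanings land close together in the embedding space.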
The embeddings APIs, according to Omdia’s chief analyst Bradley Shimmin, will encourage the adoption of large language models as enterprises vectorize their data to expose it to these models.
Human feedback helps tune models
Google has also added a model-tuning tool based on human feedback to Vertex AI.
The tool, dubbed reinforcement learning from human feedback (RLHF), will help enterprises maintain model performance and safety over time, the company said, adding that the tuning tool can be used to root out bias and toxicity.
“This is particularly useful in industries where accuracy is crucial, such as healthcare, or where customer satisfaction is critical, such as finance and e-commerce,” the company said in a statement.
The embeddings APIs and the RLHF tool, according to Forrester analyst Rowan Curran, will be important for enterprises in developing generative AI-powered applications.
“For many companies who want to get started with generative AI and build their own capabilities, a major limiting factor is internal skills. Google Cloud and other vendors are trying to help abstract away some of that complexity challenge by providing tools to enhance the capabilities of the foundation models to make them enterprise-application-ready,” Curran said.
The new foundation models, embeddings APIs, and reinforcement learning from human feedback tool are currently available through Google’s Trusted Tester program, Google said, without giving a timeline for general availability.
Vertex AI updates similar to Amazon Bedrock, Microsoft OpenAI APIs
The updates to Google’s Vertex AI will pit the public cloud service provider against rivals such as AWS, IBM and Microsoft, analysts said, adding that the updates are similar to offerings such as Amazon Bedrock, Microsoft’s Azure OpenAI Service, and IBM’s Watsonx.
“Google Cloud’s capabilities for offering these three foundation models, along with the additional capabilities, is broadly similar to both Amazon Bedrock and Azure’s OpenAI Services,” Curran said, adding that the distinct difference was Google was offering the capabilities through a graphical interface and data science notebooks.
The new generative AI features from Google, according to Shimmin, highlight the rivalry that Microsoft ignited with the release of ChatGPT.
The vendor with the best infrastructure for supporting enterprises and addressing global privacy, trust, and security concerns will lead the AI race, Shimmin said.
Copyright © 2023 IDG Communications, Inc.