Amazon is diving into generative AI.
The company last week announced the launch of Bedrock, its own Amazon Web Services (AWS) cloud service that allows customers to build and scale generative AI applications using Foundation Models (FMs).
“Think of it as a cloud-based and configurable alternative to OpenAI’s ChatGPT and DALL-E 2 aimed at businesses and developers,” Will Shanklin of Engadget explained. The service makes FMs from AI21 Labs, Anthropic, Stability AI, and Amazon accessible via an API.
On the same day as the announcement, CEO Andy Jassy, who took over for Jeff Bezos in 2021, released his 2022 letter to shareholders. The letter presented a vision for Amazon’s future, and artificial intelligence (AI) and machine learning (ML), including generative AI, were a notable highlight.
Amazon’s heavy investment in Large Language Models (LLMs) and generative AI is “core to setting Amazon up to invent in every area of our business for many decades to come,” Jassy said in the letter.
Amazon has been using machine learning for 25 years, he noted, but Generative AI “promises to significantly accelerate ML adoption.”
“Generative AI is based on very Large Language Models (trained on up to hundreds of billions of parameters, and growing), across expansive datasets, and has radically general and broad recall and learning capabilities,” Jassy wrote. “We have been working on our own LLMs for a while now, believe it will transform and improve virtually every customer experience, and will continue to invest substantially in these models across all of our consumer, seller, brand, and creator experiences.
“Additionally, as we’ve done for years in AWS, we’re democratizing this technology so companies of all sizes can leverage Generative AI. AWS is offering the most price-performant machine learning chips in Trainium and Inferentia so small and large companies can afford to train and run their LLMs in production. We enable companies to choose from various LLMs and build applications with all of the AWS security, privacy and other features that customers are accustomed to using. And, we’re delivering applications like AWS’s CodeWhisperer, which revolutionizes developer productivity by generating code suggestions in real time.”
In a blog post announcing Amazon Bedrock, Swami Sivasubramanian, VP of Database, Analytics and ML at AWS, explained how far generative AI has advanced in just a few years.
“Like all AI, generative AI is powered by ML models—very large models that are pre-trained on vast amounts of data and commonly referred to as Foundation Models (FMs),” Sivasubramanian wrote. “Recent advancements in ML (specifically the invention of the transformer-based neural network architecture) have led to the rise of models that contain billions of parameters or variables. To give a sense for the change in scale, the largest pre-trained model in 2019 was 330M parameters. Now, the largest models are more than 500B parameters — a 1,600x increase in size in just a few years.”
Amazon Bedrock will offer access to a range of FMs for text and images — including Amazon’s Titan FMs, two new LLMs that Amazon also announced — through an AWS managed service. With Bedrock’s serverless experience, customers can privately customize FMs with their own data and integrate and deploy them into their applications using AWS tools, without managing any infrastructure.
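For developers, the practical shape of that managed service is an API call. Below is a minimal sketch of what invoking one of Bedrock’s hosted models might look like through the AWS SDK for Python (boto3); the client name, the Titan model ID, and the request and response fields are illustrative assumptions based on how AWS typically exposes managed services, not details confirmed in the announcement.

```python
import json

import boto3

# Hypothetical sketch: call a hosted foundation model through Bedrock's
# serverless API. The client name, model ID, and payload shape shown here
# are assumptions for illustration, not confirmed details from the launch.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

request_body = {
    "inputText": "Summarize our Q1 customer feedback in three bullet points.",
    "textGenerationConfig": {"maxTokenCount": 256, "temperature": 0.5},
}

response = client.invoke_model(
    modelId="amazon.titan-text-express-v1",  # assumed ID for a Titan text FM
    body=json.dumps(request_body),
    contentType="application/json",
    accept="application/json",
)

# The response body is a streaming blob containing the model's JSON output.
result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])
```

The point of the serverless pitch is that everything outside this snippet — provisioning, scaling, and hosting the model — is handled by AWS, so customers only manage their prompts, data, and application code.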