While there are some big names in the tech world concerned about a possible existential threat from artificial intelligence (AI), Matt Wood, VP of product at AWS, is not one of them.
Wood has long been a standard bearer for machine learning (ML) at AWS and is a fixture at the company’s events. For the past 13 years, he has been one of the leading voices at AWS on AI/ML, speaking about Amazon’s technology, research and service advancements at nearly every AWS re:Invent.
AWS was working on AI long before the current round of generative AI hype, with its SageMaker product suite leading the charge for the past six years. Make no mistake though: AWS has entered the generative AI era just like everyone else. On April 13, Amazon announced Bedrock, a suite of generative AI tools that can help organizations build, train, refine and deploy large language models (LLMs).
There is no doubt that there is great power behind generative AI. It can be a disruptive force for both business and society. That great power has led some experts to warn that AI poses an “existential threat” to humanity. But in an interview with VentureBeat, Wood handily dismissed those fears by succinctly explaining how AI really works and what AWS does with it.
“What we have here is a mathematical trick that can present, generate and synthesize information in ways that help people make better decisions and work more efficiently,” said Wood.
The transformative power of generative AI
Rather than posing an existential threat, Wood emphasized the powerful potential AI has to help businesses of all sizes. It’s a strength borne out by the large number of AWS customers already using the company’s AI/ML services.
“Today we have over 100,000 customers using AWS for their ML efforts and many of them have standardized on SageMaker to build, train and deploy their own models,” said Wood.
Generative AI takes AI/ML to another level and has generated a lot of enthusiasm and interest among AWS users. With the advent of transformer models, Wood said it’s now possible to take highly complicated natural language inputs and map them to complicated outputs for a variety of tasks, such as text generation, summarization and image creation.
“I haven’t seen this level of engagement and enthusiasm from customers, probably since the very, very early days of cloud computing,” said Wood.
In addition to the ability to generate text and images, Wood sees many business use cases for generative AI. At the root of all LLMs are numerical vector embeddings. He explained that embeddings allow an organization to use the numerical representations of information to drive better experiences across a number of use cases, including search and personalization.
“You can use those numerical representations to do things like semantic scoring and ranking,” Wood said. “So if you have a search engine or some kind of internal method that needs to collect and rank a range of things, LLMs can really make a difference in terms of how you summarize or personalize something.”
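The semantic scoring and ranking Wood describes can be sketched with a toy example. The vectors below are hypothetical stand-ins for the embeddings an LLM embedding model would produce (real embeddings have hundreds or thousands of dimensions); the ranking logic — cosine similarity between a query vector and each document vector — is the standard technique:

```python
import math

# Hypothetical toy embeddings standing in for vectors produced by an
# LLM embedding model; real embeddings are much higher-dimensional.
DOCS = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.2, 0.8, 0.1],
    "return an item": [0.85, 0.15, 0.05],
}

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def rank(query_vec, docs):
    # Score every document against the query and sort best-first.
    scored = [(cosine(query_vec, vec), name) for name, vec in docs.items()]
    return [name for _, name in sorted(scored, reverse=True)]

# A query vector that lands near the "refund" region of the space.
query = [0.88, 0.12, 0.02]
print(rank(query, DOCS))  # refund-related documents rank first
```

Because similarity is computed on meaning-bearing vectors rather than keyword overlap, "return an item" ranks near "refund policy" even though the two strings share no words — which is what makes embeddings useful for search and personalization.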
Bedrock is the AWS foundation for generative AI
The Amazon Bedrock service is an effort to make it easier for AWS users to take advantage of the power of multiple LLMs.
“We don’t believe there will be one model to rule them all,” Wood said. “So we wanted to be able to offer model selection.”
In addition to providing model selection, Amazon Bedrock can also be used alongside LangChain, which allows organizations to use multiple LLMs at the same time. Wood said that with LangChain, users have the ability to chain and sequence prompts across multiple different models. For example, an organization might want to use Titan for one task, Anthropic for another, and AI21 for yet another. Organizations can also use custom models based on specialized data.
“We’re definitely seeing [users] breaking large tasks into smaller tasks and then routing those smaller tasks into specialized models, and that seems like a very fruitful way to build more complex systems,” Wood said.
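The routing pattern Wood describes — decompose a large task, send each piece to a specialized model — can be sketched in a few lines. The model functions below are hypothetical stand-ins, not real Bedrock or LangChain calls; in practice each would invoke a different LLM:

```python
# Hypothetical stand-in for a model tuned for summarization.
def summarize_model(text: str) -> str:
    return "summary of: " + text

# Hypothetical stand-in for a model tuned for sentiment classification.
def classify_model(text: str) -> str:
    return "positive" if "great" in text.lower() else "neutral"

# The router: each task type maps to its specialized model.
ROUTES = {
    "summarize": summarize_model,
    "classify": classify_model,
}

def run_pipeline(tasks):
    # Route each (task_type, payload) pair to its specialized model
    # and collect the outputs in order.
    return [ROUTES[task_type](payload) for task_type, payload in tasks]

results = run_pipeline([
    ("summarize", "Quarterly revenue grew 12% on cloud demand."),
    ("classify", "The new dashboard is great."),
])
print(results)
```

A framework like LangChain adds prompt templating, memory and sequencing on top of this basic dispatch idea, but the core design choice is the same: small, specialized calls composed into a larger system rather than one model handling everything.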
As organizations move to generative AI, Wood noted that a key challenge is ensuring companies are approaching the technology in a way that allows them to actually innovate.
“Any big shift is 50% technology and 50% culture, so I really encourage clients to think hard about both the tech piece that’s getting a lot of attention right now, but also a lot of the cultural pieces about how you drive invention using technology,” he said.