AI21 Labs, the Tel Aviv-based NLP company behind the Wordtune editor, has announced the launch of a plug-and-play generative AI engine to help enterprises extract value from their data assets.
Dubbed Contextual Answers, the offering comes as a dedicated API that can be embedded directly into digital assets to apply large language model (LLM) technology to selected organizational data. It can enable employees or customers to get the information they need through a conversational experience, without going through different teams or software systems.
“This is the first time that this kind of technology is offered as a solution that works out of the box and does not require major effort and resources. We built the entire solution as a plug-and-play capability and optimized every component, enabling our clients to achieve the best results in the industry without investing the time of AI, NLP or data science practitioners,” said Tal Delbari, representing the API team at AI21 Labs.
Meeting the demand for business-specific generative AI
Since the emergence of ChatGPT, businesses large and small have been looking for ways to implement LLMs into their data stack and provide internal teams and customers with a faster and more seamless way to communicate with accurate, useful information. The usual approach here is to refine existing models and make them work in specific business scenarios, but that approach requires significant engineering – something not every company can afford.
With the new Contextual Answers API, AI21 Labs offers a solution that can bring this kind of generative AI use case to life without building it from scratch.
“Companies can get started by uploading documents to our Studio using the web GUI or by uploading the documents through our API and SDK. After loading the files, they can ask questions and get answers through the API. Our API is easy to use and any developer, even without being an NLP or AI expert, can get started,” Delbari told VentureBeat.
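The two-step flow Delbari describes — upload documents, then ask questions over them — can be sketched as follows. This is a hypothetical illustration: the endpoint paths, field names and payload shapes below are assumptions for clarity, not AI21's documented API contract, which is specified in AI21 Studio's own documentation.

```python
# Hypothetical sketch of the upload-then-ask flow described above.
# The base URL, paths and field names are illustrative assumptions.
API_BASE = "https://api.ai21.com/studio/v1"  # assumed base URL


def build_upload_request(file_path: str, labels=None) -> dict:
    """Describe the HTTP request that would add a document to the library."""
    return {
        "method": "POST",
        "url": f"{API_BASE}/library/files",   # assumed upload endpoint
        "files": {"file": file_path},
        "data": {"labels": labels or []},
    }


def build_answer_request(question: str) -> dict:
    """Describe the request that asks a free-form question over the library."""
    return {
        "method": "POST",
        "url": f"{API_BASE}/library/answer",  # assumed question endpoint
        "json": {"question": question},
    }


# A developer would send these with any HTTP client (e.g. requests):
upload = build_upload_request("employee_handbook.pdf", labels=["hr"])
ask = build_answer_request("How many vacation days do new hires get?")
```

Once the documents are indexed, each question is a single POST; no model fine-tuning or prompt engineering is required on the caller's side, which is the "plug-and-play" claim.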
Once the AI engine is up and running, business customers or internal employees can write any question in free form, whether it’s for internal support, reviewing policies, or searching for information in large documents or manuals. The model takes that question and provides a concise answer from the context within the uploaded knowledge base. It works for both structured and unstructured information.
“The model is specifically optimized to adapt to internal jargon, acronyms, project names, etc. As long as the documents contain the information, the model can learn it and its meaning automatically. In addition, it will not mix the organizational knowledge with external knowledge, jargon or information it has learned from the Internet, keeping it grounded and truthful with regard to the organizational data and internal language,” explains Delbari.
Data access controls and security
Since the AI engine supports unlimited uploads of internal company data, Delbari said the company has taken the access and security of that information into account.
For access control and role-based content separation, he says, the model can be restricted to a specific file, a set of files, a specific folder, or tags and metadata. Meanwhile, for data security and confidentiality, he claims that the company’s AI21 Studio provides a secure, SOC 2-certified environment.
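The restriction options Delbari lists — a specific file, a set of files, a folder, or tags — could be expressed as optional filters on each question. The sketch below is hypothetical: the parameter names (`file_ids`, `labels`, `path`) are illustrative assumptions, not AI21's documented fields.

```python
# Hypothetical sketch of role-based scoping for a question payload,
# mirroring the restriction options described above. Field names are
# illustrative assumptions, not AI21's documented parameters.
def build_scoped_question(question, file_ids=None, labels=None, path=None):
    """Build a question payload restricted to a slice of the document library."""
    payload = {"question": question}
    if file_ids:
        payload["file_ids"] = file_ids  # restrict to specific documents
    if labels:
        payload["labels"] = labels      # restrict by tag/metadata
    if path:
        payload["path"] = path          # restrict to one folder
    return payload


# e.g. an HR-facing assistant that only ever sees HR policy documents:
hr_query = build_scoped_question(
    "What is the parental leave policy?",
    labels=["hr"],
    path="/policies/hr",
)
```

Scoping at the query layer like this lets one shared document library serve several audiences, with each role's assistant seeing only the slice it is entitled to.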
“We promise a segregated and protected environment that is already trusted by companies from various industries, including banks and pharmaceutical companies,” he said. He also noted that the AI engine can be used through AWS SageMaker JumpStart and Amazon Bedrock, allowing companies to leverage the core capabilities of the product on their virtual private clouds (VPCs).
In the future, the company plans to embed the feature in its Wordtune writing platform, allowing users to quickly extract selected information from uploaded documents.
Leading data ecosystem players Databricks and Snowflake have worked on similar projects. The former recently announced LakehouseIQ, which uses LLM technology to answer specific questions about Lakehouse data with context, while the latter launched Document AI, a purpose-built, multimodal LLM that can extract insights from unstructured documents.