This article is intended to help with the initial steps of setting up and familiarizing yourself with Knowledge AI as a concept. More advanced concepts and best practices will be shared in future articles.

Setting up your LLM

As Knowledge AI requires generative AI capabilities outside of Cognigy, the most important component you need to provide is a large language model (LLM) that is accessible via API. Cognigy supports a wide range of providers and both ready-made and custom models, such as OpenAI, Azure, Anthropic, AWS, Google, and Aleph Alpha.

The full list of models supported by Cognigy can be found in the Cognigy.AI documentation.

You can either select a standard model or enter a custom model name, which is useful when newer models or versions become available.

Generally, you need two types of models to use Knowledge AI: an embeddings model to vectorize and prepare your knowledge data, and a generative model that then searches, extracts, and formats the results based on those embeddings.
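
To make this division of labor concrete, here is a minimal sketch using the OpenAI Node.js SDK as one example provider. The model names are illustrative, and this is not how Cognigy calls your LLM internally; Cognigy handles these calls for you once the model is configured.

import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function main() {
  // Embeddings model: vectorize a knowledge chunk so it can be searched.
  const embedding = await client.embeddings.create({
    model: "text-embedding-3-small", // illustrative model name
    input: "Our support hotline is available 24/7.",
  });
  console.log(embedding.data[0].embedding.length); // vector dimensions

  // Generative model: formulate an answer from the retrieved chunks.
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini", // illustrative model name
    messages: [
      { role: "system", content: "Answer using only the provided context." },
      {
        role: "user",
        content:
          "Context: Our support hotline is available 24/7.\n\nQuestion: When is support available?",
      },
    ],
  });
  console.log(completion.choices[0].message.content);
}

main();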

Setting up your Knowledge Store

Now that you have your LLM set up, you can begin preparing the knowledge data in a Knowledge Store; this is the data that will be used to generate responses.

Here you have many options to customize and configure your Knowledge Store. For a step-by-step guide on how to format your knowledge data and set up your Knowledge Store, see the respective documentation.
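
Note that you do not need to chunk your documents yourself: Cognigy splits supported file types into knowledge chunks during import. Purely to illustrate what this preparation step does conceptually, here is a minimal paragraph-based chunking sketch (the character limit is an arbitrary assumption):

// Cognigy chunks supported file types for you on import; this sketch only
// illustrates the idea of splitting a document into searchable chunks.
function chunkByParagraph(text: string, maxChars = 1000): string[] {
  const paragraphs = text.split(/\n\s*\n/); // split on blank lines
  const chunks: string[] = [];
  let current = "";
  for (const paragraph of paragraphs) {
    const candidate = current ? current + "\n\n" + paragraph : paragraph;
    if (candidate.length > maxChars && current) {
      chunks.push(current.trim()); // close the current chunk
      current = paragraph;
    } else {
      current = candidate;
    }
  }
  if (current) chunks.push(current.trim());
  return chunks;
}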

Setting up your flow

With the two main components, the LLM and the Knowledge Store, configured and set up, you can now integrate this functionality into your flows. As with other flow mechanisms, Knowledge AI does not simply run in the background; it needs to be triggered or executed at specific steps in the flow.

You can trigger and query the Knowledge Store by using the Search Extract Output node provided by Cognigy.

Most bots consist of both manual and generative content: only certain parts of the flow are handled by generative AI, while more critical areas, such as greetings, escalations, and handovers, are defined as a regular flow with intents.

With such a hybrid approach, you can set up a simple flow as seen below, which uses a Lookup node to decide when to use manually written content and when to let generative AI handle the input.

[Image: example flow in which a Lookup node routes between manually written content and Knowledge AI]
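
The Lookup node itself is configured in the flow editor rather than in code. As a rough sketch of the decision it encodes (illustrative only; the intent names are assumptions, and input is the object Cognigy provides at runtime):

// Illustrative routing logic, roughly what the Lookup node decides.
// "input" is provided by Cognigy at runtime; the intent names are assumptions.
const manualIntents = ["Greeting", "Escalation", "Handover"];

if (input.intent && manualIntents.includes(input.intent)) {
  // A critical intent matched: continue on the manually authored branch.
} else {
  // No confident intent match: route the input to the Search Extract Output node.
}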

Using the Search Extract Output Node

For more information on all parameters and options of the Search Extract Output node, you can visit the documentation.
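
One way to work with the node's output is to post-process it in a subsequent Code node. The node lets you configure where its result is stored; the storage path and field names below are assumptions made for illustration, so check the node's settings and the documentation for the actual location in your setup.

// Runs inside a Cognigy Code node, where "context" and "actions" are provided.
// The path "context.knowledgeResult" and the "answer" field are assumptions;
// they depend on where you configured the node to store its result.
const result = context.knowledgeResult;

if (result?.answer) {
  actions.output(result.answer); // return the generated answer to the user
} else {
  actions.output("I couldn't find that in our knowledge base.");
}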

Testing and iterating your setup

After setting up your Knowledge Store and integrating this functionality into your flow, you should test whether everything works as intended. Using the Interaction Panel, you can navigate to the respective part of the flow and ask about information you imported into the Knowledge Store, similar to how you would test a regular NLU bot.

 

