All articles on how to create and orchestrate LLM-powered AI Agents:
- Getting Started with AI Agents
- Prerequisites: Set up your AI Agent's brain
- Create your AI Agent's persona
- Give your AI Agent a Job
- Make knowledge available to your AI Agent
- Give your AI Agent access to memory
- Deploy and use your AI Agent
- Improve your AI Agent’s skills using Tool Actions
- Enable your AI Agent to understand images
- Talk to your AI Agent via voice or phone
- Debugging your AI Agent
To deploy and use AI Agents, you connect them to the outside world through Cognigy Endpoints, which act as gateways. These Endpoints enable AI Agents to handle text-based conversations, multimodal interactions, and voice communication, including contact center phone lines. Cognigy supports a wide range of channels, including Facebook (Meta) Messenger, Slack, Amazon Alexa, Microsoft Teams, WhatsApp, Cognigy Voice Gateway, Genesys, and more.
In this guide, we’ll show you how to create and configure an Endpoint for your AI Agent, using a Webchat Endpoint as the example.
To enable your AI Agent to communicate via a Webchat widget, follow these steps:
- Deploy Endpoint. In the sidebar, go to Deploy > Endpoints and click New Endpoint.
- Set Flow. Assign a name to the Endpoint, link it to the Flow you created, and click Save.
- Test Endpoint. In the top-right corner, select Open Demo Webchat to interact with your AI Agent. If you prefer to test the deployed Endpoint from code, see the sketch after this list.
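As an alternative to the Demo Webchat, you can exercise the deployed Endpoint programmatically. The following minimal sketch uses Cognigy's open-source @cognigy/socket-client package (the client the Webchat widget builds on); the Endpoint base URL, URL token, and user/session IDs are placeholders you would replace with the values from your Endpoint's configuration page.

```typescript
// Minimal sketch: send a test message to a deployed Webchat Endpoint and
// log the AI Agent's replies. The URL, token, and IDs below are placeholders.
import { SocketClient } from "@cognigy/socket-client";

async function main() {
  const client = new SocketClient(
    "https://endpoint-trial.cognigy.ai/", // Endpoint base URL (placeholder)
    "YOUR-ENDPOINT-URL-TOKEN",            // URL token from the Endpoint Editor (placeholder)
    { userId: "test-user", sessionId: `test-${Date.now()}` }
  );

  // Print every message the AI Agent sends back.
  client.on("output", (output) => console.log("AI Agent:", output.text));

  await client.connect();
  client.sendMessage("Hello!");
}

main().catch(console.error);
```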
The Webchat Endpoint Editor offers various configuration and integration options to add the webchat widget to your website.
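The exact embed snippet for your widget version is generated in the Endpoint Editor itself; as a rough illustration, the sketch below assumes the open-source Cognigy Webchat bundle (webchat.js) is already loaded on the page, which exposes a global initWebchat() function, and that the endpoint URL placeholder is replaced with the one shown in your Endpoint Editor.

```typescript
// Minimal sketch: initialize the Cognigy Webchat widget on a web page.
// Assumes webchat.js has already been loaded via a <script> tag, exposing
// a global initWebchat() function that resolves to the webchat instance.
declare function initWebchat(
  endpointUrl: string,
  settings?: Record<string, unknown>
): Promise<{ open: () => void }>;

initWebchat("https://endpoint-trial.cognigy.ai/YOUR-ENDPOINT-URL-TOKEN").then(
  (webchat) => {
    // Open the chat window as soon as the widget is ready.
    webchat.open();
  }
);
```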