A user study found that 9 out of 10 participants use messaging apps such as WhatsApp regularly (Tyntec, 2019). For this reason, one may consider delivering a Cognigy Conversational AI through a messenger channel such as Telegram as well.
Create an Endpoint Transformer Function
The Cognigy output needs to be converted into a valid Telegram message. This tutorial therefore uses two Cognigy project resources: a Flow and a Webhook Endpoint.
Please add a simple SAY node to your flow and insert a welcome message:
Hi, 👋 I am your personal Telegram assistant. How can I help you?
This message is now converted in the Webhook Endpoint. To do so, navigate to the Endpoints section and click on the recently created Webhook Endpoint. Clicking on the Transformer Functions expansion panel opens the detail view in which the transformer is defined.
- Open the Transformer Functions section and enable:
- Abort on Error in Transformer
- Enable Input Transformer
- Enable Output Transformer
- Enable Execution Finished Transformer
- Copy the source code from the Transformer Code File and paste it into the code field in Cognigy.AI
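The core of the output transformer is a conversion from the Cognigy output into the payload shape Telegram's `sendMessage` API expects (`chat_id` and `text`). The following is a minimal sketch of such a conversion helper; the function name `toTelegramMessage` and the simplified Cognigy output shape are assumptions for illustration, not the exact code from the Transformer Code File:

```javascript
// Minimal sketch (helper name hypothetical): converts a Cognigy output
// object into a Telegram sendMessage payload. Telegram's sendMessage
// endpoint requires at least chat_id and text.
function toTelegramMessage(cognigyOutput, chatId) {
  return {
    chat_id: chatId,
    // The SAY node's text is forwarded as the message body
    text: cognigyOutput.text
  };
}

// Example: the welcome message from the SAY node above, sent to a
// (hypothetical) Telegram chat id taken from the inbound update.
const payload = toTelegramMessage(
  { text: "Hi, 👋 I am your personal Telegram assistant. How can I help you?" },
  123456789
);
console.log(JSON.stringify(payload));
```

Inside the actual output transformer, this conversion runs for every Cognigy output before it is posted to the Telegram Bot API.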
Set the Cognigy Endpoint in Telegram
Since the virtual agent is now able to handle Telegram messages, a so-called Telegram Bot needs to be configured and, last but not least, connected to the Cognigy.AI Webhook Endpoint URL.
- Create a new Application
- Register a Telegram Bot (Bot Father)
- Set the Cognigy.AI Webhook Endpoint URL
Before the AI can be tested, the configured Cognigy Endpoint needs to be connected to Telegram. In order to do so, the following API request needs to be sent:
- url: <Cognigy Webhook Endpoint URL>
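Telegram's Bot API exposes this as the `setWebhook` method, called on `https://api.telegram.org/bot<token>/setWebhook`. A small sketch of how the request URL can be assembled (the helper name and placeholders are illustrative; substitute the real bot token from BotFather and the Cognigy Webhook Endpoint URL):

```javascript
// Builds the Telegram setWebhook request URL. The api.telegram.org path
// is Telegram's real Bot API; botToken and endpointUrl are placeholders
// to be filled in with the values from BotFather and Cognigy.AI.
function buildSetWebhookUrl(botToken, endpointUrl) {
  // The endpoint URL must be query-encoded to survive as a parameter
  return `https://api.telegram.org/bot${botToken}/setWebhook?url=${encodeURIComponent(endpointUrl)}`;
}

console.log(buildSetWebhookUrl("<BOT_TOKEN>", "<Cognigy Webhook Endpoint URL>"));
```

Sending a GET or POST request to the resulting URL (for example from a browser or an HTTP client) registers the Cognigy Endpoint as the bot's webhook.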
Test the Connection
Nothing more is needed to publish our AI through Telegram. Now, write a message to your Telegram bot and wait for the Cognigy.AI response.
If everything went well, you should see something like this:
Take a look at the complete transformer function here: