A bot can initiate a web chat conversation in different ways. One of them is using a Get Started button, which can also have a welcome message above it.
Another option is to open the normal web chat with some messages from the bot already present, which may include quick replies suggesting what the user can do next. These messages can be sent from the flow using Say nodes. A potential issue, however, is that these messages start a new billable conversation even though the user has not sent any messages yet. As a result, the number of billable conversations can increase considerably. In addition, the stats in OData will contain many "empty" sessions.
This can be avoided by using the method described in this tutorial.
Note: This method requires the Inject & Notify API, so it can only be used with endpoint types that support it.
Overview
A billable conversation starts with the first message from the web chat received by the flow. By default, this is a text message with the text "GET_STARTED". To avoid triggering the flow, this message can be intercepted in the endpoint transformer's handleInput method: if the method returns null, the flow is not triggered and no new session is started.
The transformer should then respond with some welcome messages. It cannot do so directly, because output methods such as handleOutput and handleExecutionFinished are not executed when the flow is not triggered. To work around this, the transformer can use the Notify API to send the messages.
However, endpoint transformers cannot call their own or other endpoints directly, including inject and notify calls. This is a platform limitation added for safety reasons: endpoints triggering each other could end up in an infinite loop. The limitation can be worked around by using a Cognigy Function as a proxy.
To summarize: the GET_STARTED message is intercepted by the input transformer, which triggers a function; the function then notifies the endpoint with the welcome payload. The payload is forwarded to the web chat without triggering the flow, resulting in a message that looks as if it came from the bot.
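Outside the Cognigy runtime, the interception decision can be sketched as a plain function (interceptGetStarted and sendWelcome are illustrative names for this sketch, not part of the transformer API):

```javascript
// Sketch of the interception decision. Returning null stands in for the
// transformer's "do not trigger the flow" contract; sendWelcome stands in
// for the HTTP call that triggers the proxy function.
function interceptGetStarted(payload, sendWelcome) {
  if (payload.text === 'GET_STARTED') {
    // Fire the welcome messages via the proxy and swallow the start message
    sendWelcome(payload.userId, payload.sessionId);
    return null;
  }
  // Any other message passes through and starts a billable conversation
  return payload;
}

// The start message is swallowed; a real user message passes through unchanged
const welcomeCalls = [];
const record = (userId, sessionId) => welcomeCalls.push({ userId, sessionId });
const swallowed = interceptGetStarted({ userId: 'u1', sessionId: 's1', text: 'GET_STARTED' }, record);
const forwarded = interceptGetStarted({ userId: 'u1', sessionId: 's1', text: 'Hello' }, record);
console.log(swallowed === null, forwarded.text, welcomeCalls.length); // true 'Hello' 1
```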
Webchat endpoint transformer
Paste the code below into your Webchat endpoint transformer and make sure that Enable Input Transformer is turned on. Replace the first three constants with values that work for your environment.
const COGNIGY_FUNCTION_ID = 'your-function-id'
const API_KEY = 'your-api-key'
const API_BASE_URL = 'https://api-trial.cognigy.ai' // Replace with your environment API base URL

createSocketTransformer({
  handleInput: async ({ payload, endpoint }) => {
    // This is executed upon receiving a message from the web chat
    const { userId, sessionId, text, data } = payload
    if (text === 'GET_STARTED') {
      // Received the default message sent by the web chat to start a conversation
      try {
        const urlToken = payload['URLToken']
        await httpRequest({
          method: "POST",
          uri: `${API_BASE_URL}/new/v2.0/functions/${COGNIGY_FUNCTION_ID}/trigger`,
          headers: { 'X-API-Key': API_KEY },
          body: { parameters: { urlToken, userId, sessionId } },
          json: true
        });
      } catch (error) {
        console.error(error);
      }
      return null // When a null is returned, the flow is not triggered
    }
    return { userId, sessionId, text, data }
  }
})
Function that injects a welcome message into the webchat endpoint
Create a new Cognigy Function using the code below.
// The payload that will be injected into the endpoint and forwarded to the web chat.
// This is the same payload as sent by a Say node with Output Type = Text with Quick Replies.
const welcomePayload = {
  _cognigy: {
    _default: {
      _quickReplies: {
        type: "quick_replies",
        text: "Welcome to the demo of a Welcome message that doesn't increase billable conversations count!",
        quickReplies: [
          { contentType: "postback", title: "Tell me about it", payload: "Tell me about it" },
          { contentType: "postback", title: "How does this work?", payload: "How does this work?" }
        ]
      }
    },
    _webchat: {
      message: {
        text: "Welcome to the demo of a Welcome message that doesn't increase billable conversations count!",
        quick_replies: [
          { content_type: "text", payload: "Tell me about it", title: "Tell me about it" },
          { content_type: "text", payload: "How does this work?", title: "How does this work?" }
        ]
      }
    }
  }
}

export default async ({ parameters, api }: IFunctionExecutionArguments) => {
  const { urlToken, userId, sessionId } = parameters
  // Send the welcome payload to the user via the Notify API; this does not trigger the flow
  await api.httpRequest({
    method: "post",
    url: `https://endpoint-trial.cognigy.ai/notify/${urlToken}`, // Replace with your environment endpoint base URL
    headers: {},
    data: { userId, sessionId, text: null, data: welcomePayload }
  })
}
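Since the same welcome message is declared twice (once under _default, once under _webchat), the two copies can drift apart when edited. A small sanity check, assuming the welcomePayload shape shown above, can confirm they still match (channelsMatch is an illustrative helper, not a Cognigy API):

```javascript
// Hypothetical pre-deployment check: both channel variants of the welcome
// payload should show the same text and quick-reply titles to the user.
function channelsMatch(welcomePayload) {
  const def = welcomePayload._cognigy._default._quickReplies;
  const web = welcomePayload._cognigy._webchat.message;
  return (
    def.text === web.text &&
    def.quickReplies.length === web.quick_replies.length &&
    def.quickReplies.every((qr, i) => qr.title === web.quick_replies[i].title)
  );
}

// Abbreviated payload in the shape used by the function above
const payload = {
  _cognigy: {
    _default: {
      _quickReplies: {
        type: "quick_replies",
        text: "Welcome!",
        quickReplies: [{ contentType: "postback", title: "Tell me about it", payload: "Tell me about it" }]
      }
    },
    _webchat: {
      message: {
        text: "Welcome!",
        quick_replies: [{ content_type: "text", payload: "Tell me about it", title: "Tell me about it" }]
      }
    }
  }
};
console.log(channelsMatch(payload)); // true
```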
Demo
You can see it in action in this demo webchat page. The flow is only triggered after the user sends the first message, either by typing it or by clicking on a quick reply.