The most basic capability a mobile app integration should provide is sending and receiving chat messages -- how this information is ultimately displayed to the user depends on the respective use case. This article aims to explain and demystify the different approaches to handling a chat conversation in Cognigy.AI.

As described in the Overview article, there are three different ways of sending and receiving chat messages:

  1. Socket
  2. REST
  3. Webhook

While, in the end, each of them simply provides the connection between an app and the virtual agent, they partly differ in how this is done.

The Differences

A look into the linked documentation shows that all of them are listed as so-called Endpoints. This means that they can be used like the Webchat or Facebook Messenger Endpoint; the only difference is that there is no specific software on the other side of the connection. What, for example, the Webchat does with an incoming chat message -- namely, displaying it on the screen -- is not yet defined when using the endpoints mentioned above.

However, before diving into use-case-related topics, one needs to understand when it makes sense to use Socket instead of REST, REST instead of Webhook, or Webhook instead of Socket.

Socket

Once the connection is established, it stays open until it is explicitly closed.

Pro:
+ The connection needs to be established only once.
+ Messages are "automatically" sent to the respective recipient.
+ Latency is very low, since messages are simply forwarded without executing a new HTTP request.

Contra:
- Messages are polled until they are received. That could worsen the performance of the mobile app.
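The push behavior described above can be sketched with a small event-handler pattern. The following Python sketch is purely illustrative -- the `SocketChatClient` class and its transport are hypothetical, not the official Cognigy client; only the `["output", {...}]` event shape is taken from the backend example later in this article.

```python
import json

# Minimal sketch of a socket-style client: one persistent connection over
# which the server pushes "output" events to registered handlers.
class SocketChatClient:
    def __init__(self):
        self.handlers = {}

    def on(self, event, handler):
        # Register a callback; a real socket library would invoke it
        # whenever the server pushes a matching event.
        self.handlers[event] = handler

    def _dispatch(self, raw):
        # A real transport would call this for every frame received over
        # the open connection -- no new HTTP request per message.
        event, payload = json.loads(raw)
        if event in self.handlers:
            self.handlers[event](payload)

client = SocketChatClient()
received = []
client.on("output", lambda msg: received.append(msg["data"]["text"]))

# Simulate a frame pushed by the server over the open connection.
client._dispatch(json.dumps(["output", {"type": "output", "data": {"text": "Hi!"}}]))
print(received)  # ['Hi!']
```

Because the connection stays open, the app only registers its handlers once and then simply reacts to whatever the virtual agent pushes.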

 

REST

The connection is re-established every time a message is sent by the user or virtual agent.

Pro:
+ There is no message polling and thus stable app performance.
+ Very secure, since the HTTP request can use authentication.

Contra:
- Every message executes another HTTP request. This could increase latency.
- REST expects an input followed by an output. Thus, for every input message, there should be an output.
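One REST turn can be sketched as building a request body and sending it as a fresh HTTP request. The field names below (`userId`, `sessionId`, `text`) are illustrative assumptions -- check the REST Endpoint reference for the exact contract.

```python
import json

# Sketch of one REST turn: every message is a new, self-contained request.
def build_rest_request(user_id, session_id, text):
    return {
        "userId": user_id,        # stable id of the app user
        "sessionId": session_id,  # groups messages into one conversation
        "text": text,             # the user's input for this turn
    }

body = build_rest_request("user-42", "session-1", "I want to order a pizza")
# An app would now POST json.dumps(body) to the REST Endpoint URL
# (with authentication headers) and read the virtual agent's reply from
# the HTTP response -- one output per input, as noted above.
print(json.dumps(body))
```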

 

Webhook

A Webhook (sometimes called a web callback or HTTP push API) provides information to a specified callback URL as soon as it is available.

Pro:
+ More dynamic than REST: after one input message, there could be two outputs; on the other hand, there could be two inputs for one output.
+ The response does not need to be sent to Cognigy.AI but can be sent to another service.
+ It can be used as a middleware solution to preprocess messages.

Contra:
- It has the same disadvantages as REST.
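The middleware idea can be sketched as a pure handler: the callback URL receives the agent's output, preprocesses it, and forwards it to another service. The payload shape below is a simplified, illustrative assumption based on the backend example later in this article; `forward` stands in for whatever service the app passes messages on to.

```python
# Sketch of a webhook receiver acting as middleware.
def handle_webhook(payload, forward):
    # Preprocessing step: e.g. trim whitespace before passing the text on.
    text = payload.get("data", {}).get("text") or ""
    forward({"text": text.strip(), "source": payload.get("source", "bot")})

sent = []
handle_webhook(
    {"type": "output", "data": {"text": "  Hello!  "}, "source": "bot"},
    sent.append,
)
print(sent)  # [{'text': 'Hello!', 'source': 'bot'}]
```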

Which of these three solutions should be used for one's own mobile app depends on prior experience and on the software solutions currently used in the app's code.

Virtual Agent Example Message

Now, one should be ready to see a basic example of a message sent by the virtual agent. As may be known, the virtual agent uses the AI Channel by default to provide an easy-to-use building process.

The following example shows a quick reply message

quickReplyMessageInteractionPanel.png

where the backend information looks like this:

[
  "output",
  {
    "type": "output",
    "data": {
      "text": null,
      "data": {
        "_cognigy": {
          "_default": {
            "message": {
              "text": "How can I help you today?",
              "quick_replies": [
                {
                  "content_type": "text",
                  "payload": "I want to order a pizza",
                  "title": "Order Pizza"
                },
                {
                  "content_type": "text",
                  "payload": "Please show me your locations.",
                  "title": "Show Restaurants"
                }
              ]
            }
          },
          "_fallbackText": "How can I help you today? You can, for example, order a pizza."
        }
      },
      "traceId": "endpoint-realtimeClient-794cd9fb-a475-48db-a56f-252fb91a9255",
      "disableSensitiveLogging": false,
      "source": "bot"
    }
  }
]

All messages are sent in the so-called JSON format to enable data exchange between applications. For now, only a few parts of the preview above are interesting:

 

"type":"output"

Every chat message sent by Cognigy.AI has the output type.

 

{
  "text": null,
  "data": {...}
}

Furthermore, a message consists of its text and data information. Since a quick reply needs to be sent with a specific structure, the text is null; the data, however, includes the text and the quick replies.
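This text-versus-data decision can be sketched as a small helper. The nested path follows the backend example above; the function name is illustrative.

```python
# Sketch: prefer the plain "text" field, otherwise read the text from the
# structured "data" part of the message (as in the quick reply example).
def extract_text(message):
    if message.get("text") is not None:
        return message["text"]  # normal text message
    # Quick replies: the text lives inside data._cognigy._default.message
    return message["data"]["_cognigy"]["_default"]["message"]["text"]

plain = {"text": "How can I help you today?", "data": None}
quick = {
    "text": None,
    "data": {"_cognigy": {"_default": {"message": {
        "text": "How can I help you today?",
        "quick_replies": [],
    }}}},
}

print(extract_text(plain))  # How can I help you today?
print(extract_text(quick))  # How can I help you today?
```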

A normal text message would look like this:

{
  "text": "How can I help you today?",
  "data": null
}

 

{
  "_cognigy": {
    "_default": {
      "message": {
        "text": "How can I help you today?",
        "quick_replies": [...]
      }
    }
  }
}

This part contains the basic message structure that is used for all Cognigy.AI endpoints. Since this example uses the AI Channel, it says _default. If one used the Webchat instead, it would say _webchat, and so on.
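Reading the channel-specific key with a fallback to _default can be sketched like this; the function name and the fallback behavior are illustrative assumptions.

```python
# Sketch: read the endpoint-specific message (e.g. "_webchat"), falling
# back to "_default" when that key is absent.
def channel_message(cognigy_block, channel="_default"):
    block = cognigy_block.get(channel) or cognigy_block["_default"]
    return block["message"]

cognigy = {"_default": {"message": {"text": "Hi", "quick_replies": []}}}
print(channel_message(cognigy, "_webchat")["text"])  # Hi
```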

 

"quick_replies":[
{
"content_type":"text",
"payload":"I want to order a pizza",
"title":"Order Pizza"
},
...
]

This list stores the defined quick replies, which can be displayed as buttons in the mobile app later on.
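Turning this list into button models for a mobile UI can be sketched as follows: the title is shown on the button, and the payload is sent back as the user's input when the button is tapped. The helper name is illustrative.

```python
# Sketch: map quick_replies entries to (label, payload-to-send) pairs.
def to_buttons(quick_replies):
    return [
        (qr["title"], qr["payload"])
        for qr in quick_replies
        if qr["content_type"] == "text"
    ]

buttons = to_buttons([
    {"content_type": "text", "payload": "I want to order a pizza", "title": "Order Pizza"},
    {"content_type": "text", "payload": "Please show me your locations.", "title": "Show Restaurants"},
])
print(buttons[0])  # ('Order Pizza', 'I want to order a pizza')
```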

 

"_fallbackText":"How can I help you today? You can, for example, order a pizza."

If there is no screen to display the structured message, the fallback text can be used to output a plain text version to the user. This is used for voice-based apps, for example.
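The fallback rule can be sketched as one rendering decision: a voice-based app without a screen reads _fallbackText aloud instead of rendering the structured message. The structure matches the backend example above; the function name and the `has_screen` flag are illustrative.

```python
# Sketch: choose between the structured message and its fallback text.
def renderable_text(cognigy, has_screen):
    # cognigy: the "_cognigy" object from the message data
    message = cognigy["_default"]["message"]
    if has_screen:
        return message["text"], message.get("quick_replies", [])
    # No screen: speak the plain-text fallback, drop the buttons.
    return cognigy.get("_fallbackText", message["text"]), []

cognigy = {
    "_default": {"message": {
        "text": "How can I help you today?",
        "quick_replies": [{"title": "Order Pizza"}],
    }},
    "_fallbackText": "How can I help you today? You can, for example, order a pizza.",
}
text, replies = renderable_text(cognigy, has_screen=False)
print(text)
```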


 

The next article presents three so-called clients that can be used to process messages in an app. Each client exposes various functions to provide an easy development experience.

