This article was originally posted on Medium.

Amazon Connect is a cloud contact center that can be up and running in minutes, on both telephone and chat channels. It has a pay-as-you-go pricing model that many businesses are starting to find attractive.

A powerful feature of Connect is that it supports Amazon Lex out-of-the-box for natural language understanding (NLU), able to understand both spoken input on the telephone, and typed input in chat. With Amazon Connect being so easy to access, all of a sudden, the power of Conversational AI can be readily employed in contact centers of all shapes and sizes.

Full integration code is available

A GitHub Gist is available with all the code needed for this integration. To complete a running test you will need access to a Cognigy.AI platform version 3.6 or greater, your own Amazon Connect instance, a Lex bot to test, and an AWS Lambda function you’ll create.

But please read the full post before you jump into the code! The easiest order of deployment is described near the end of this post.

The integration with Connect

The following graphic shows the overall architecture.


As illustrated above, a small AWS Lambda function is used, since Lambda is the only back-end integration mechanism supported by Lex. This Lambda is essentially a ‘shim’ of just a few lines of code, which translates the Lambda event from Lex into a request to a Cognigy.AI REST Endpoint:


There is a little more to it than the few lines above, as you can see in the Gist that accompanies this post, but the code illustrates the core behavior of the Lambda function. Not much to it.

Cognigy.AI’s Endpoint Transformers are used on the REST Endpoint to translate the Lex-format NLU result into the expected Cognigy REST format.

An NLU Transformer (a new feature in Cognigy.AI v3.6) is also used to achieve some increased convenience with NLU handling in the Cognigy.AI Flows. More on this below.

NLU Options

Both Lex and Cognigy.AI can carry out NLU. For this tutorial, Lex NLU is used for processing. For an organization invested in the Amazon Connect ecosystem, this could be a common choice. This means that intents and slots will be extracted by a fully-formed Lex NLU model, before being passed into the Cognigy.AI REST Endpoint. The raw text string of what the user typed or spoke is also included.

Another option for NLU would be to use the raw text string supplied from Lex, and process that within Cognigy.AI to derive intents and slots. This could utilize the in-built Cognigy NLU, or another NLU engine available in the platform.

Passing Lex NLU results into Cognigy

The Lambda code snippet above shows that the full Lambda event is passed into Cognigy.AI, which is not expecting such a format.

In fact, Cognigy.AI does not typically expect NLU information to come in via a channel Endpoint.

The format problem can be solved with an Input Transformer configured within a REST Endpoint, which simply takes the Lex Lambda event, and processes it into data structures that we then pass into our conversational Flows.

The core of the Input Transformer function is quite simple:


See the Gist for full details.


The critical part above is that the full Lex event, held in the variable lex, is passed through into the platform in data. This makes it accessible in a Flow and, critically, also available in an NLU Transformer as data.lex.

An Execution Finished Transformer is also configured to turn aggregated Flow outputs (mainly the response to speak) into a format that Lex will understand.
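Sketched as a plain function (the output shape is an assumption; the real handler in the Gist differs in detail), that aggregation amounts to gathering the text the Flow produced and wrapping it in a Lex fulfilment response:

```javascript
// Aggregate the text outputs of a Flow execution into a single Lex
// fulfilment response. In Cognigy.AI this logic runs inside the REST
// Endpoint's Execution Finished Transformer.
function toLexResponse(outputs, sessionAttributes = {}) {
  // Join all text outputs produced during the Flow execution.
  const content = outputs.map((o) => o.text).filter(Boolean).join(' ');
  return {
    sessionAttributes,
    dialogAction: {
      type: 'Close',
      fulfillmentState: 'Fulfilled',
      message: { contentType: 'PlainText', content }
    }
  };
}
```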

Note that CustomerNumber (the calling phone number) and ContactId (a UUID for the specific call) as seen above should be passed through from Connect. The code above defaults these to random UUID values if not supplied.

Using Lex intents in place of Cognigy intents

The remaining challenge is using the Lex-detected intents inside Cognigy Flows.

The goal was a solution that capitalizes on some of the convenient features of Cognigy NLU, namely Default Replies attached to intents, so that no Flow logic is needed to handle simple user queries with simple ‘one-shot’ answers, often referred to as FAQs.

This is where the new NLU Transformer feature was employed. Above, the Lex result was stored as ‘data.lex’, and this can now be used in an NLU Transformer.

The REST Endpoint is configured to use Cognigy NLU, but then ‘transform’ the result to essentially cause the Lex intent result to overwrite the Cognigy intent result, with the following NLU Transformer code:
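The full Transformer is in the Gist; as an illustration of its core overwrite-and-merge step (with assumed shapes for both NLU results; the Lex V1 event carries the winning intent in currentIntent.name and its slots in currentIntent.slots), it amounts to something like:

```javascript
// Overwrite the Cognigy intent with the Lex-detected one, and merge the
// slots from both engines. Shapes are assumptions for illustration.
function overwriteWithLex(cognigyNluResult, lexEvent) {
  const lexIntent = lexEvent.currentIntent || {};
  return {
    ...cognigyNluResult,
    // Let the Lex-detected intent replace the Cognigy-detected one.
    intent: lexIntent.name || cognigyNluResult.intent,
    // Merge Lex slots in with the Cognigy slots, so both are available.
    slots: { ...cognigyNluResult.slots, ...(lexIntent.slots || {}) }
  };
}
```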


With this in place, any Cognigy intents with names matching Lex intents will emit their Default Replies as configured in the Cognigy.AI GUI. This is a simple, powerful way to answer basic questions from users (FAQs), without having to build an explicit flow for each.

One may also notice above that Lex slots are ‘merged’ in with the Cognigy slots, so both are available. This allows Flows to take advantage of Cognigy’s powerful and automatic functionality of Slot and Keyphrase extraction, for use in all decision Flows, as well as Lex-detected slots.

Final Steps

As a recap, the steps to utilize Cognigy.AI from Amazon Connect, in the most convenient sequence to deploy, are:

  1. In your Cognigy.AI virtual agent, create a new NLU Connector of type ‘Cognigy’, insert the NLU Transformer code from the Gist, and be sure to enable the ‘Post NLU Transformer’ with the toggle above the code
  2. Create a REST Endpoint, using the NLU Connector created above, insert the Endpoint Transformer code, and enable the ‘Input’ and ‘Execution Finished’ Transformers with the toggles
  3. Create an AWS Lambda function, using the code from the Gist, and alter the doPOST() URL to be the ‘Endpoint URL’ from the REST Endpoint configured above
  4. Create or use a Lex bot; each intent should use the above Lambda function for fulfillment
  5. Ensure Connect can access your Lex bot. Head to the Amazon Connect service console, select your instance, and navigate to ‘Contact Flows’. There should be a section called Lex where you can add your bot.
  6. Now finally: Access the Lex bot from your Connect ‘Contact Flow’. See below.

The final step is to integrate the Lex bot into a Connect Flow. In the Connect ‘Contact Flow’ editor, add a ‘Get customer input’ block. It will end up looking something like this:


Where ‘LexTest’ will be replaced by your Lex bot’s name, and the region may be different. Configure the block as shown below. Note that the very first prompt played by the bot is supplied in this block in Connect; also select your Lex bot in place of ‘LexTest’.


To pass some useful session attributes through from Connect, via Lex, to Cognigy.AI, also add these session attributes to the block exactly as shown:
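The exact entries appear in the screenshot; since the Lambda and Input Transformer expect CustomerNumber and ContactId, a plausible mapping using Connect’s standard contact attributes (verify against the Gist and screenshot) would be:

```
CustomerNumber = $.CustomerEndpoint.Address   (the caller's phone number)
ContactId      = $.ContactId                  (the UUID of this contact)
```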

