The Cognigy.AI Foundation Training provides the know-how you need to successfully implement your own Conversational AI projects with Cognigy.AI. It is designed for participants with no or limited prior bot-building knowledge and gives insights into Cognigy.AI's capabilities and design best practices for Conversational AI.
The Foundation Training covers the basic concepts of Cognigy.AI, building Flows and logic with various Nodes and using the Cognigy NLU.
If you want to learn more about advanced bot building, check out our Cognigy.AI Flow Developer Training that is also available in the Help Center.
To get the most out of the training, watch the videos once and then follow along with the trainer to get hands-on experience.
Trainer: Derek Roberti
Playtime: 120 minutes
Last update: March 2021
Version: Cognigy.AI v4.1
Chapter 1: Introduction & Conceptual Overview
Get to know concepts and terms of Cognigy.AI
Most of this training will be hands-on, meaning we'll cover topics in the context of the product rather than spending time discussing system components more abstractly. The assumption here is that you already understand the value of conversational automation and know the why of chatbots and voicebots. At this point, you're more interested in the how. But some conceptual understanding is valuable, and you may want to refer back to this video as you go through the other training materials to see how your understanding has improved.
This overview will briefly cover the key building blocks of automated conversations and give you some concepts to look out for as we make our way through the rest of the training.
Here they are: we'll talk about Endpoints, Intents, Slots, Flows, and Extensions.
Endpoints are where conversations begin. Conversations can start over voice or chat channels. For voice channels, Cognigy works with major telephony and contact center applications to allow users to call a contact center and have their interaction handled solely through an automated Agent. For chat channels, Cognigy connects to a variety of Endpoints from a Webchat on your website, to SMS, to Facebook Messenger and internal channels like Teams, Slack, and Skype for Business.
When Cognigy receives an input from a user, it's processed by our industry-leading language AI: the Natural Language Understanding engine, which takes a free-form user input and determines the user's intention.
The NLU has two components: Slots and Intents. Let's take Slots first. Slots are categories of words that add meaning to a user utterance. For example, if a user says, "I would like to order a veggie pizza with extra cheese" or "I ordered a pizza with my account Derek@cognigy.com an hour ago", Cognigy can identify that pizza is a "Food", cheese is a "Topping", Derek@cognigy.com is an "e-mail", and an hour ago is a "Duration".
Slots is a term of art in NLU discussions, but just think of a Slot as a category. If I say pizza is a Slot match for Food, I mean that pizza is a match for the food category. This is important because the NLU may know that the user wants to order something, but it needs to figure out what specifically the user wants: "a pizza with extra cheese". Some Slot types will be domain-specific, like pizza and all the possible toppings it could have. Other Slots are more conceptual or pattern-based, like dates, emails, durations, or temperatures. Dates, for example, can be expressed as tomorrow, Friday, the twenty-third, or next week, and they all mean the same thing. These conceptual or pattern-based Slots are built into Cognigy. The domain-specific ones may be unique to your business, whether you're an airline, an auto parts manufacturer, a pharmaceutical company, or a pizza restaurant. You have product names and business processes that are particular to you.
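To make the two kinds of Slots concrete, here is a minimal sketch in Python of how domain-specific and pattern-based Slot matching could work. The slot names, word lists, and patterns are illustrative assumptions, not Cognigy's actual NLU or configuration format.

```python
import re

# Hypothetical domain-specific slots: word lists you would define
# for your own business, like a pizza restaurant's menu.
DOMAIN_SLOTS = {
    "Food": {"pizza", "salad", "pasta"},
    "Topping": {"cheese", "mushrooms", "olives"},
}

# Hypothetical pattern-based slots, in the spirit of the built-in
# conceptual slots (emails, durations, dates, ...).
PATTERN_SLOTS = {
    "Email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "Duration": re.compile(r"\b(?:an?|\d+)\s+(?:hour|minute|day|week)s?\b"),
}

def find_slots(utterance):
    """Return (slot_name, matched_text) pairs found in the utterance."""
    matches = []
    lowered = utterance.lower()
    for slot, values in DOMAIN_SLOTS.items():
        for value in values:
            if value in lowered:
                matches.append((slot, value))
    for slot, pattern in PATTERN_SLOTS.items():
        for m in pattern.finditer(lowered):
            matches.append((slot, m.group(0)))
    return matches
```

Running this on the pizza example from above picks out "pizza" as a Food, "cheese" as a Topping, the email address, and "an hour" as a Duration, which is the gist of what the real NLU does with far more sophistication.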
Another ingredient, and arguably the main ingredient, of NLU is Intents. An Intent reflects what the user means: the main idea of their sentence. If the user says, "I would like to order a veggie pizza with extra cheese", the main idea is that they want to "order a pizza". If the user is asking why their order hasn't arrived yet, the main idea is that they're "making a complaint". For many, Intent detection is at the heart of conversational automation. When a user calls your contact center, they ask a question and you can't predict what they might say. Detecting a user's Intent, even when they're expressing themselves in ways you might not have anticipated, is key to fulfilling the user's request. In short, if you don't understand the user's question, you can't give a good answer.

Detecting the user's Intent and identifying Slot matches helps Cognigy guide the user to a helpful response. We respond to users in Flows. You might hear terms like fulfillment or dialog management as other ways of describing Flows. At heart, the idea is: now that we know what the user wants, how do we fulfill their request? Responding to the user may involve simply providing information. So if the user says, "Do you have vegetarian pizza?", you may just respond: "Yes, we have both vegetarian and vegan pizza options." Sometimes the response may include a follow-up question. If the user says, "What are your restaurant hours?", I might ask, "Which location are you asking about?" Conversational automation is most powerful when it's accessing data from other systems or helping users perform actions. So if a user says, "I want to order a veggie pizza", the bot understands their Intent and places an order in an order management system. If the user says, "Can you add a bottle of water to my order?", the system can find the user's order and add an additional item to it.
Cognigy ships with a wide range of pre-built connectors to common enterprise systems and automation platforms. We also make it straightforward to build integrations to systems we haven't heard of yet, like a homegrown system for managing pizza orders. We call these connectors Extensions. To recap, if my friend Laura wants to order a pizza, she starts off in a Channel like a Webchat or a phone call. Cognigy understands her Intent to place an order and it understands the components of her order, like the "type of pizza" and its "toppings". Cognigy then formulates an answer and adds dynamic information like the total price for her order and sends a response. In conversational automation terms, Laura starts the conversation on an Endpoint. Cognigy figures out her Intent and the relevant Slot matches. In the Flow, Cognigy manages the response and uses Extensions to retrieve data from systems of record.
The goal of this Foundation Training is to provide a hands-on walkthrough of each of these topic areas. We won't cover Extensions in the Foundation course, but we'll cover Endpoints, Intents, Slots, and Flows in some detail over the coming videos. By the end of this course, you should be able to build a basic Flow that can identify Intents and Slot matches and be published to an Endpoint. I'm looking forward to sharing the Cognigy.AI platform with you.
Chapter 2: Talking to users with Say Nodes
Learn about the Flow Editor and create rich-media messages
- HTML Hyperlink Generator
- Sample Video Embed Code:
<iframe width="227" height="119" src="https://www.youtube.com/embed/bMy3ZpNApcY" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
- Image Dimensions for Gallery: 909x476 or anything with the 1.91 to 1 aspect ratio.
- HTML Link with Style:
View our <a href="https://www.cognigy.com/" target="_blank" style="color:yellow">Terms and Conditions</a>
- Practice with free images https://www.pexels.com
- Say Nodes in the documentation
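As a quick sanity check on the Gallery image dimensions listed above, a couple of lines of Python confirm that 909 by 476 is within a hair of the 1.91:1 ratio, and can compute a matching height for any other width you want to use:

```python
def aspect_ratio(width, height):
    """Width-to-height ratio of an image."""
    return width / height

def gallery_height(width, ratio=1.91):
    """Height that gives an image of this width the ~1.91:1 Gallery ratio."""
    return round(width / ratio)

# 909 x 476 works out to about 1.9097:1, close enough to 1.91:1.
print(aspect_ratio(909, 476))
print(gallery_height(909))
```

So if you resize an image to, say, 1200 pixels wide, `gallery_height(1200)` tells you the height to crop it to before uploading it to your website.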
I booked my flight tickets through Checkline and I need to get my flight status. Imagine I'm in line at the airport with my bag in one hand and my phone in the other. "Hello and welcome to Checkline. What flight do you need to check on?" "I need to check my flight status from SFO to LAX." "What airline are you flying on?" "United." "I'll check flights on United Airlines departing San Francisco Airport and arriving at LAX. Flight UA1069 is scheduled to arrive at 09:52." "Great, thanks." Now imagine I'm sitting in the airport lounge with my laptop open. I'll ask the same question, and on the web, I get more information in a visual format.
We create conversations like this one in Flows. Let's take a look at the Flow Editor so you can start learning the building blocks of Flow development. I log into Cognigy and create a new Agent by clicking Create Agent. I'll name my Agent "Aviation" and click Create to create the Agent. Now I see the Agent Dashboard, which is basically empty at this point. To create a Flow, go to the left bar and, under Build, choose Flows.
Next, click New Flow and name it "Main". Here's a shortcut: once you've been working on your Flow, you'll find it listed on the Agent Dashboard under Recently Changed. This takes you directly to your Flow, open to the Flow Chart view. You can see the Flow starts off empty, with the Start Node at the top and the End Node at the bottom. When a conversation starts, the input arrives at the Start Node and progresses through a set of logical decision points before a response is sent back to the user.
Cognigy has a powerful AI-driven Natural Language Understanding engine that can understand what a user is trying to accomplish and learn over time. But for now, we're focusing on implementing a set of business rules as they're applied in conversation. I'll add a Node to the chart by clicking the plus sign and I'll create my first Node, a Say Node. A Say Node outputs something to the user. I'll click on this Say Node and Cognigy presents the Edit Node panel. I just want to output something simple. So I'll enter "Hello World" in the Text field and I'll click Save Node.
To test out my Flow, I can open the interaction panel by clicking the bubble icon in the top right corner. There's a lot to say about the interaction panel, and we'll cover it in greater detail later. But for now, you can just open it and say hi, and it says "Hello World". In the Flow Chart, you can see a green line that highlights the path Cognigy took to determine how to respond. Now, I'll add another Say Node beneath the first one. I'll make this one say "Good to meet you". If I test it out in my interaction panel now, it will execute the Say Nodes one at a time. What if I want to say "Good to meet you" first? I can click and drag the bottom Say Node onto the plus sign above the top Say Node. Note that when I start to drag, the plus signs turn into circles. Drag your mouse pointer into the circle and release. You can test the bot now, and it will say "Good to meet you" before it says "Hello World".
As your Flow Chart gets bigger, you'll need to navigate around the screen, and Cognigy makes it easy to find your way around. You can click and drag to move the Flow Chart, or you can just scroll up and down or left and right using your trackpad. You can zoom using pinch-to-zoom, or you can use the zoom controls at the bottom of the screen to zoom in and out and to re-center. You can also copy and paste Nodes by using right-click, control-click, or command-click. I can copy a Node and then paste it on one of the plus signs, and I can delete it as well.
We looked at how to output a simple message, but what if we want to create some variation? I'll type in a few options like "Welcome World" and "Greetings World". Now, when I interact with the bot, I'll get a random response from one of the three I've entered. The purpose of this feature is to introduce some natural variability into your user interaction so that it feels a little less robotic. Here's a trick to re-enter an input into the interaction panel: just click on the bubble and the message is sent again. Now I'll enable Linear: instead of randomly choosing an answer, Cognigy will go through the list from top to bottom and then stick with the bottom output. If I enable Loop, Cognigy will go from the top of the list to the bottom and then back to the top again.
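The three output-selection modes just described (random, linear, loop) can be modeled in a few lines of Python. This is only an illustrative sketch of the behavior, not Cognigy's implementation:

```python
import random

def pick_output(outputs, mode, turn):
    """Choose one of several Say Node outputs.

    mode 'random' picks any entry; 'linear' walks top to bottom and then
    sticks with the last entry; 'loop' walks top to bottom and wraps
    around. 'turn' counts how many times this Node has fired (0-based).
    """
    if mode == "random":
        return random.choice(outputs)
    if mode == "linear":
        return outputs[min(turn, len(outputs) - 1)]
    if mode == "loop":
        return outputs[turn % len(outputs)]
    raise ValueError(f"unknown mode: {mode}")

greetings = ["Hello World", "Welcome World", "Greetings World"]
# Linear: third and later turns keep returning the bottom output.
print(pick_output(greetings, "linear", 5))   # → Greetings World
# Loop: the fourth turn wraps back to the top of the list.
print(pick_output(greetings, "loop", 3))     # → Hello World
```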
As you saw in the airline demo, Cognigy can output more than just text. In the Webchat example, we output visual elements like Quick Replies and images. Let's learn how to do these, starting with a Quick Reply. A Quick Reply appears as a button in the Webchat with some text on it. Users can click the button to send a message to the bot instead of typing. Imagine you were using your phone to interact with a chatbot; sometimes clicking a button with your finger is faster than typing a response. I'll open my Say Node and change the output type dropdown from Text to Text with Quick Replies. I see a Text field as before, but I also see an Add Quick Reply button. In the Text field, I'll type "How can I help you today?" and give the user some options. I'll click Add Quick Reply twice because I want to create two options. If I click on an empty Quick Reply, I see a set of fields that Cognigy needs to render the Quick Reply. The two required options are Button Title and Postback Value; let's start with those. The Button Title will be "Check flight status" and the Postback Value can be the same. For my second button, I'll enter "New reservation" in both the Button Title and Postback Value fields. I'll save, and then we can test it out. If I click Check flight status, it shows an input with the same value: clicking the Quick Reply is the same as if the user had typed in the text themselves. Sometimes you might want the button to have a short name but the input to be longer. Let's see how that works with Check flight status. I'll edit my Node and change the Postback Value to "I need to check on the status of my flight". Save that, and let's test again. Now, when I click Check flight status, the Postback Value appears. There are different reasons why you might want to do this, for example: to show the user the actual input sent by clicking the button, or to provide a sentence that will trigger a specific Intent.
But what if this interaction were happening by phone? You might want the buttons to appear in a Webchat, but a different message to be spoken on the phone. There are two ways to address this. First, I can add Fallback Text: if I'm interacting with Cognigy on a channel that doesn't support Text with Quick Replies, Cognigy can output the Fallback Text.
Let's try that.
I can help you check your flight status or book a new reservation. Now, in the interaction panel, you'll see an orange bubble that shows the Fallback Text. This won't appear if the user is interacting in the Cognigy Webchat, but it will be used on any channel that doesn't support Quick Replies, like SMS or voice. I'll make a call to show you how it sounds on voice.
I can help you check your flight status or book a new reservation.
Another way I can deal with this is the new channel-specific output. If I open the Say Node, I can click the plus next to Channels and select Voice Gateway, for example. For the text, I can add, "Thanks for calling. I can help you check your flight status or book a new reservation". Now, if the call is coming in by phone, Cognigy will output this message. If it's coming in through our Webchat or Facebook Messenger, it will use the Quick Replies. And for any other channel that doesn't support Quick Replies, it will use the Fallback Text. You'll see these different channel outputs in the interaction panel: the Main output, the Fallback output, and the Voice Gateway output.

Let's look at another output option, a Gallery. Galleries include one or more images with optional titles and buttons beneath them. When I choose Gallery as my output type, I get an Add Card button. I'll click it, and I get a template I need to fill out. I'll have my title say "Welcome to Aviation airlines" and my subtitle will say "The world's highest-flying airline". Where it says Add Image, I'll paste a URL. Now, if I save, I see this nice image with the title and subtitle. You might wonder, how can I upload my own image? Cognigy doesn't replace your content management or digital asset management system as a repository for digital assets like images, videos, PDFs, etc. Instead, you can add a link to a file from your website and it'll be rendered in the Webchat. Now, I'll choose a different image. First I'll show you in a web browser, and now I'll show you in the interaction panel. You can see the Webchat crops the bottom part of the image, so all we see is the tail of the plane. Why is that? The image in the Webchat needs to match the aspect ratio of the Webchat bubble; otherwise, your browser will center the image rather than focusing on what's important to you. If I crop the image a bit, or ask my graphic designer to do that for me,
I can have the image exactly how I want it, with the nose of the plane at the bottom of the picture and the reflection of the skyline in the water. The dimensions I used for this image are 909 by 476; you can try anything with a 1.91:1 aspect ratio and see how it looks. Now, we can do more than just show an image: we can add buttons by clicking the Add Button button. It will add an empty button, and I'll click on it to make it do something. I'll make the button say "Terms and Conditions". For the Postback Value, I can choose Send Postback, and it will work like a Quick Reply. But I can also choose to open a URL and add a link. Let's try that.
Now, when I test it out, my card has a nice button underneath it. When I click on it, my browser opens a new tab with the URL I specified in my button. I can keep adding buttons, or I can add a new card. To do that, I'll expand the Edit Node panel so that it takes up most of the screen. I'll click Add Card to add a new card and populate it with some values. Now I have two cards. Let's see how it looks. I see the first card on the left and get a teaser of the second card. If I click the teaser, I see the second card. If I viewed the Webchat on my phone, I could just swipe left and right to see these different options. Now, let's take a quick look at Text with Buttons. These work just like the buttons in the Gallery, but they have a text header instead of an image. Let's take a look in the interaction panel. I can also add a video. Let's see what that looks like. I get a nice preview in the Editor, and when I run it in the interaction panel, you'll see basic controls for the video. This works great when the video is small, like this one; I just add a link to the file on my server. But what if you want to add something longer that's best served from a streaming service like Vimeo or YouTube? To do this, I'll go to YouTube and copy the embed code. Just find your favorite Cognigy video and choose Share and then Embed. I'll copy that into my Say Node using a regular Text output type. Note that I'm changing the width and height parameters so that it fits nicely into my chat bubble. When I preview it in the interaction panel, I just see the embed code itself. Cognigy does this intentionally to prevent any unwanted code from executing within the Cognigy user interface. But when you view the chat in a web browser or on your phone, the video renders as expected. How do you preview your chatbot in the Cognigy Webchat? You do this by creating an Endpoint. I'll talk much more about Endpoints in another video, but for now, just go to Deploy > Endpoints in the left bar.
Click New Endpoint and call it "Web". For the Flow, choose the Flow we've been working on, and for the Endpoint Type, make sure Webchat is selected. Now save and click Open Webchat. If you say hi, you'll see your YouTube video. We've talked about writing content in Say Nodes, but what if you want to output something dynamic? We'll cover this topic in depth in later videos, but for now, I'll introduce the topic of Tokens. I'll go back to my Say Node, and in my Text field, I'll click on the AI icon. You'll see a long list of Tokens that will output dynamic content. In later videos, we'll talk about how to create your own Tokens. I want the user to be able to input a monetary value, and I'll confirm it for them. In my Text field, I'll write "I'll show you flights under ... dollars", inserting the first currency Slot Token where the amount should go. To test, I'll say, show me flights to Las Vegas under two hundred dollars. The Token is replaced by the match from the user's input. There's a lot more you can do with Tokens, and you can test them out to see the different values that are available. The last feature of Say Nodes I want to cover is adding links and styles to your outputs using HTML. Why would you need to do this? Well, we saw one example of embedding HTML earlier when we added a YouTube video. You may also want to do things like adding hyperlinks to your text. Even though you can do this in buttons, sometimes it makes more sense to present traditional links like you would see on a website. The easiest way to get your desired HTML is to use a hyperlink generator. There are many of these available if you search the web, but I'll use this one. I'll enter the URL I want to link to, like our Terms and Conditions, and I'll enter the target. This is important because you often want to open the link in a new tab rather than the current one, so I'll select _blank. For the text, I'll enter Terms and Conditions. Now, I just need to copy my code by clicking the Copy button.
Back in Cognigy, I'll update my Say Node. When I interact with my chatbot, I see the HTML markup in the interaction panel, but I see the full link in the Webchat. Sometimes the colors of the Webchat widget are a little too close to the colors on your website. This can make things like links blend in too much with the bubble color to be easily visible. To address this, you can add a style attribute. You might want to check with your website team to determine the right style to apply. For me, I'll just make my link yellow so it stands out. To do that, I'll add style="color:yellow" to my HTML tag. Be sure you add it before the closing greater-than sign.
I'll save and preview again.
Now it stands out against the background. There is a tremendous amount of customization you can do to the look and feel of the Webchat, but this is just a quick workaround if your links are blending into the background and you don't know much about styling web pages. In our last topic for this video, I want to go over some general features that apply to all Nodes, including Say Nodes. If I look at my Flow now, I have a Say Node and I see a small preview of what it says. But someone else looking at this Flow may not understand what the Node does without opening it up and looking at it. Cognigy lets you add comments and labels to Nodes to make their purpose clearer. I'll open my Say Node and look at the Styling section. I have options for a Label and a Comment. Right now the Label says "Say", indicating that it's a Say Node. But I want to make it clear that this is a welcome message, so I'll put "Welcome Message" in the Label field and a more detailed comment in the Comment field. When I click save, I see the Node is now called "Welcome Message", and it has a little icon indicating that a comment is attached. When I click on the icon, I can view and edit the comment. Believe it or not, there's even more to say about Say Nodes, but for now, you have all you need to get started. We will cover more topics related to Say Nodes in our Flow Developer series.
Chapter 3: Asking questions with Question Nodes
Collect information from users with various question types and validate the input
- Basic Regex for Social Security Number:
- Question Nodes in the documentation
- Watch Cognigy Session Episode "Question Nodes" for a technical deep dive
We've learned how to use Say Nodes to output something to a user. Now let's learn how to ask questions using Question Nodes. A Question Node outputs something to the user and then waits for a response that matches the validation rules configured in the Node. Once a valid response has been received, the Flow continues its execution with the Node following the Question Node. Let's look at an example. I'll say hi to start the Flow. Yes, I'm checking bags. Note the green caret next to the Question Node: this indicates the current entry point for the Flow. That means when the user answers the question, instead of starting with the Start Node, Cognigy will start with this Question Node. This is one of the key features of a Question Node: it's a bookmark in a Flow that tells Cognigy where it needs to start when the next user input comes in. Entry points are an important concept in Cognigy, and we'll learn more about them in other lessons. I'm a heavy packer, so I'll say three. When I enter three, Cognigy outputs the Say Node, then outputs the Question Node, then waits for the next user input. I'm flying to San Francisco, or SFO, and Cognigy celebrates my decision. Now, I'll start over and provide some different answers. When it asks, "Will you check any bags?", I'll say three. Cognigy says, "Not sure I understood this correctly, will you check any bags?" If I say yes, the conversation will continue. Similarly, when it asks how many bags I'll check, I'll say "I don't know", and the system will again tell me that it didn't understand me. I'll say three and continue the conversation. This may seem limiting, but it's by design.
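The ask-then-validate behavior described above can be sketched as a small loop. Everything here (the function names and the way user inputs are fed in as a list) is an illustrative assumption, not Cognigy internals:

```python
def run_question(ask, validate, reprompt, answers, repeat_question=True):
    """Toy model of a Question Node.

    'ask' is the question text, 'validate' the answer-type check
    (yes/no, number, date, ...), 'reprompt' the re-prompt message, and
    'answers' stands in for successive user inputs. Returns the first
    valid answer plus everything the bot said along the way.
    """
    transcript = [ask]
    for answer in answers:
        if validate(answer):
            return answer, transcript
        transcript.append(reprompt)
        if repeat_question:
            transcript.append(ask)
    return None, transcript

# A yes/no question answered first with "three", then with "yes":
is_yes_no = lambda a: a.strip().lower() in {"yes", "no"}
answer, said = run_question(
    "Will you check any bags?", is_yes_no,
    "Not sure I understood this correctly.", ["three", "yes"])
print(answer)  # → yes
```

The "bookmark" aspect of a Question Node is what makes this loop possible in a real conversation: the Flow's entry point stays on the question until a valid answer arrives.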
We also have ways to handle inputs that don't have to match a validation rule. But in this case, I've configured the Question Nodes to look for specific answer types. Let's start from scratch and look at the different configuration options for Question Nodes. I'll create a new Question Node and click on it to open the Edit Node panel. At the top, you see the Question Node type, which I'll leave as Yes/No for now. The next few fields are the same fields you'll see in a Say Node: the Output, the Output Type, and the Options work exactly the same way as in a Say Node. Let's take a look at the re-prompt settings. These tell Cognigy what to do if the user says something that doesn't match the Question Type. So if it's a yes/no question and the user says maybe, we can prompt them to give a more precise answer. I'll add "Please answer yes or no" as my re-prompt message. I also have the option to repeat the question after the re-prompt message. There's also a field for a re-prompt condition; we'll talk about Conditions in the Flow Developer Training. For now, let's see it in action. I'll say hi, then I'll say maybe. Note that it sends the re-prompt message and then asks the question again. If I turn off Repeat Question, it will just give me the re-prompt message. Now, let's output the answer to the user. I'll create a Say Node, enter "You said", and then choose the Last Question Result as my Token.
When I test it out, it says "You said true" instead of "You said yes". That's because Cognigy stores the result of the yes/no question internally as true or false. I'll leave the Say Node there so we can see examples of the other Question Types. What if the user doesn't say yes or no? They might say sure, that's right, or yep. Let's try it again, and I'll say sure. Cognigy recognizes "sure" as an affirmative answer and continues with the Flow. I'll try again and say, that's right. Cognigy doesn't recognize this as an affirmative answer, and it asks me for a yes or no. If I want to give Cognigy a list of additional affirmative or negative answers, I can: go to the left bar and find Manage, then Settings. You'll see the option to add additional Positive Confirmation Words. I'll add "that's right" to the list. I'll test my Flow again and say, that's right. Now it recognizes this as an affirmative response. Now, let's change the Question Type to a Date question. I'll change my text to "What is the date?" For my Say Node, I'll choose the first date Slot instead of Last Question Result. I'll save, and let's see what happens. Note that it asks me the date and also presents a date button. Cognigy shows the date picker if the user is chatting through our out-of-the-box Webchat client.
The Say Node responds with Cognigy's internal representation of the answer. Let's look at the different date inputs Cognigy will recognize. I'll start the Flow and say next Tuesday. That works! Now we'll say two weeks from now, and that works too. How about a Number Question Type? I'll make the text "Pick a number" and, in my Say Node, choose the first number Slot. That's straightforward. How about a Money Question Type? I'll change my text to "How much money?" and my Say Node to use the first currency Slot. When it asks how much money, I'll say "4 stacks of $600". Note that it returns 600: it sees the 4 as a general number and the 600 as the currency value. I want to show three more Question Types that are unique. The first is the Slot Question Type, which will match any Slot name you've entered in the question. We haven't learned about Lexicons yet, but when I created this Agent in Cognigy, I uploaded airport codes to a Lexicon. I'll enter the Slot name "airport_code" and change the question to "Which airport?" In my Say Node, I'll just output the Last Question Result. I'll test it out by saying hi, then SFO. You'll learn more about Lexicons and Slot names in later chapters, but it's good to remember this Question Type is available. It can be useful in cases like airport codes, where you're looking for a specific answer from a large list of possible answers.
The second I want to show is the Regex Question Type. Regex stands for regular expression, a standard for describing patterns in text. For example, you might have a regular expression for a phone number or an email address. Cognigy checks whether it gets the expected pattern match and treats that as the answer to the question. Let's imagine we need the user's Social Security number. The first thing I'll do is go online and search for a Social Security number regex. In the Regular Expression field, I'll enter it. Note that I start and end the regular expression with forward slashes. If you enter something that's not a valid regular expression, you'll get an error message. And for the text, I'll say, "What's your social?" If I test it out and say "12345", I get my re-prompt answer. If I say "123-45-6789", I get a confirmation. Lastly, there is the Text Question Type. This will accept anything the user enters, so if I ask "How are you?" and answer "good", I will hit my Say Node regardless of what I enter. We know how to ask questions now, but let's look at what to do with the answers. By default, the answer is stored in the Cognigy Input. That's why I've been able to use those Tokens in my Say Node, like Last Question Result or the first date Slot. Let's look under Advanced to see additional options. If I select Store Detailed Results, Cognigy won't just store the answer: it will also store the value, question, timestamp, and answer in the Cognigy Input.
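The Social Security number check above can be reproduced with Python's re module. The exact regex used in the video isn't shown in these notes, so this minimal pattern is an assumption: it only checks the 3-2-4 digit shape, while real-world SSN validation also excludes reserved ranges like 000 and 666.

```python
import re

# Minimal 3-2-4 shape check; in the Cognigy Regular Expression field
# the same pattern would be wrapped in forward slashes.
SSN_PATTERN = re.compile(r"^\d{3}-\d{2}-\d{4}$")

def is_valid_ssn(text):
    """True if the input looks like an SSN, e.g. 123-45-6789."""
    return bool(SSN_PATTERN.match(text.strip()))

print(is_valid_ssn("123-45-6789"))  # → True  (confirmation)
print(is_valid_ssn("12345"))        # → False (would trigger the re-prompt)
```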
I can also enable "Skip if answer in input". Let's take a look at how this could be useful. I'll add another question at the top, make it a Number Question, and have it say "How many tickets?" If I go through the Flow, I'll first say 3 and then say SFO. That makes sense. Now let me do it again, and I'll say "3 tickets to SFO". Note that it skipped the airport question because it already had the answer in my input to the prior question. That's what "Skip if answer in input" does for us. So far, I've said that the answers are stored in the Cognigy Input. We'll learn more about the Cognigy Input in later chapters. One limitation of storing answers in the Cognigy Input is that they go away when the user inputs something else. I'll copy my Say Node under my first question to show what I mean. Notice that the Say Nodes output the Last Question Result from the Cognigy Input, so prior to responding to each question, it shows the answer from the previous question.
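The "Skip if answer in input" behavior can be modeled like this; the function and slot names are illustrative assumptions, not Cognigy's API:

```python
def collect_answers(questions, extracted_slots):
    """Ask each question unless its slot was already extracted from an
    earlier input, which is what "Skip if answer in input" does.

    'questions' is a list of (slot_name, prompt) pairs;
    'extracted_slots' holds slot values already found in the user's input.
    Returns the prompts that still need asking and the known answers.
    """
    asked = []
    answers = dict(extracted_slots)
    for slot, prompt in questions:
        if slot in answers:
            continue  # already answered, skip the question
        asked.append(prompt)
        answers[slot] = None  # the real Flow would wait for input here
    return asked, answers

questions = [("number", "How many tickets?"),
             ("airport_code", "Which airport?")]
# "3 tickets to SFO" yields both slots at once, so nothing is asked:
asked, _ = collect_answers(questions, {"number": 3, "airport_code": "SFO"})
print(asked)  # → []
```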
But what if I reach the end of my Flow and want to know how many tickets? That information isn't available anymore because the Cognigy Input has changed. We can persist answers through the conversation by writing them to the Cognigy Context. In my Question Node, if I go to Context, I can choose Store Result in Context. That presents a new field, Context Key to use. Think of this as an identifier for the answer that I can reference later. So for this question, I might make the key airport. That way I can refer to it in the Cognigy Context later in the conversation, maybe when I'm looking up the user's flight status in an external database. We'll talk more about the Cognigy Context and how to access it in future lessons.
The last thing I want to share about Question Nodes is the option to store values in profile storage. As I mentioned before, the Cognigy Input persists just until the user enters a new input. The Cognigy Context exists throughout the conversation. The Cognigy Profile persists throughout the user's history, across multiple conversations. If I write the result to the Contact Profile, I see a new dropdown, "Profile Key to use". This lets me assign the user's input to the user's profile. This is helpful, for example, if you're asking for the user's name and you want to use that information in future conversations. As with the Cognigy Context, we haven't yet talked about how to access profile values, but we will. In the following chapters, we'll learn how to apply logic that allows the system to respond differently depending on the user's answer.
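A rough way to picture the three storage scopes is as three dictionaries with different lifetimes. This Python sketch is a simplified mental model only, not how Cognigy implements storage:

```python
# Simplified model of Cognigy's three storage scopes (illustration only).
profile = {}   # persists across conversations
context = {}   # persists for the duration of one conversation
ci = {}        # "Cognigy Input": replaced on every new user input

def handle_input(text, context_key=None, profile_key=None):
    global ci
    ci = {"lastQuestionResult": text}   # previous input data is gone now
    if context_key:
        context[context_key] = text     # survives until the conversation ends
    if profile_key:
        profile[profile_key] = text     # survives into future conversations

handle_input("SFO", context_key="airport")
handle_input("Derek", profile_key="firstname")

print(ci)       # only the latest answer remains
print(context)  # still knows the airport
print(profile)  # remembered across conversations
```

Note how the first answer, SFO, is gone from the input after the second turn but still lives in the context.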
Chapter 4: Making decisions with If Nodes
Build logic into your Virtual Agent and guide users through processes
Towards the end of this training, we check to see if the First Name exists in the Cognigy profile. We use the “First Name” token and the “exists” operator to determine if the first name exists.
This behavior has changed as of Cognigy 4.1.2. In 4.1.2 or later, use the “First Name” token, the “not =” operator and "" as the value.
- Not available for this chapter
We may want to respond to the user in different ways, depending on what we know about them or what they tell us. One way we can do this in Cognigy is through an If Node. If Nodes provide basic branching logic. If you aren't familiar with the concept, If Nodes test for a Condition and either follow a Then path or an Else path. So, for example: if it's morning, then make coffee, else drink water. If it's raining, then bring an umbrella, else wear sunglasses. Sometimes there's a Then without an Else: if your battery is low, then charge your phone, else you don't need to do anything. In Cognigy, I can add an If Node to my Flow and Cognigy automatically creates the If Node, a Then branch, and an Else branch. You'll see that the If Node has a default Condition of one equals one. You'll always replace that with your own Condition. We'll spend all of this session talking about the Rule Editor, which is what you see in the Edit Node panel when you click on an If Node. The Rule Editor is the tool we use to create Conditions. A Condition is the thing we're testing for, like if it's morning, if it's raining, or if your battery is low. You'll see at the bottom of the Rule Editor that there's the option to use the Advanced Editor. We'll talk about the Advanced Editor in a later course when we learn about CognigyScript.
The Rule Editor gives you the power to check for Conditions using Tokens. But before we get there, let's test out some basic operators to help us see how the Condition is evaluated. I'll add a Say Node after the Then and Else Nodes. If I start the chat, I see the Then branch was hit. That's because the Condition is one equals one, and we know that one does in fact equal one. I'll change the Condition to one equals two. And when I test that out, the Else branch gets triggered, because one doesn't equal two.
If I change the Condition to one is less than two, the Then branch gets triggered. The same is true if I change it to one not equals two.
Let's try another one. I can test if one word contains another. So, for example, I can test if Cognigy contains "Cog". And it does. Note that contains is case-sensitive, so if I test if Cognigy contains a lowercase "cog", it triggers the Else path. Let's look at some practical examples. To start the Flow, I'll add a Question Node and make it a Yes/No Question.
Is it raining outside?
In my If Condition, I want to test if the user answered yes or no. I'll choose the Last Question Result Token and I'll choose "is yes" from the dropdown list. I'll give it a try: if I say yes, I go down the Then branch; if I say no, I go down the Else branch. Let's use another Question Type, like a Number. I'll ask, how many bags did you want to check? For my check, I'll see if the first Numbers Slot is greater than zero. If I answer three, it triggers the Then branch. If I answer zero, it follows the Else branch. What if I want to greet someone by their name if it exists? I'll remove my Question Node. In my If Node, I'll choose First Name as my Token and I'll choose exists as my operator. Let's test it out. Cognigy doesn't know my first name, so why don't we ask? On the Else branch, I'll add a Question Node. I'll make it a Text Question Type and I'll ask, what is your first name? Under profile storage, I'll choose "Write Result to Contact Profile" and I'll choose firstname as my Profile Key. On the Then branch, I actually want to greet the user by their name. I will add a Say Node and I'll say "Hello First Name". Let's test it again. Now that Cognigy knows my first name, it can greet me by name. Remember that profile values are persisted across conversations. So next time I talk to the bot, it will still remember my name. That covers the basic topics for If Nodes. If Nodes become more powerful in combination with CognigyScript, and we'll talk about that in the Flow Developer course.
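The Rule Editor conditions we just tried can be mimicked in plain Python (a sketch, not CognigyScript) to show how each operator sends the conversation down the Then or Else branch:

```python
def branch(condition: bool) -> str:
    """Follow the Then path when the Condition holds, otherwise the Else path."""
    return "Then" if condition else "Else"

print(branch(1 == 1))              # Then: the default Condition
print(branch(1 == 2))              # Else
print(branch(1 < 2))               # Then: less-than operator
print(branch(1 != 2))              # Then: not-equals operator
print(branch("Cog" in "Cognigy"))  # Then: contains is case-sensitive...
print(branch("cog" in "Cognigy"))  # Else: ...so lowercase does not match

# An "exists"-style check against stored profile values:
profile = {}
print(branch(bool(profile.get("firstname"))))  # Else: no first name stored yet
```

The last check mirrors the First Name example: when nothing is stored yet, the Else branch asks for the name.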
Chapter 5: Understanding users with Intents and Slots
Leverage the power of AI and process human input via Natural Language Understanding
- Watch Cognigy Sessions Episode "Cognigy NLU" for a technical deep dive
- See the attached document – Airlines.csv – for a lexicon of airline names you can upload to Cognigy. Open the document in Excel or a similar application to make sure you understand the structure of the csv file so that you can create your own files in the future.
- Example sentences for the Flight Status intent:
Is my flight on time
Is my flight late
Is my flight running late
How late is my flight
Is my flight leaving on time
Is my flight departing on schedule
Is my flight delayed
- Example sentences for the Lost Baggage intent:
is my baggage delayed
is my baggage late
i lost my baggage
can't find my bags
where are my bags
- Example sentences for the Baggage Carousel intent:
can't find my bags
where are my bags
baggage carousel number
where can i find my bags
which carousel are my bags at
- Large intent dataset on GitHub
- Cognigy NLU documentation
Cognigy uses Natural Language Understanding, or NLU, to analyze user inputs and map them to an Intent that the bot can fulfill. For example, if I want to upgrade my seat, I might say I want an aisle seat, can I get a window seat, or are there upgrades available to first class? We can't predict all of the ways a user might express their desire for an upgrade. We need some way to teach Cognigy enough variations on what a user could say so that Cognigy can understand all of the unpredictable ways users actually express themselves. We do this by creating a machine learning model that's intelligent enough to interpret what our users say and map it to an Intent the bot can handle. Cognigy makes all of this straightforward to set up, and you don't need to be a data scientist to achieve a high level of accuracy in your Intent detection. One way to enrich your NLU is by teaching it special key phrases that are unique to your brand or to your knowledge domain. For example, for an airline, important key phrases could be arrival, departure, and upgrade. For a product company, important key phrases could be your product names. For a technical support bot, important phrases could be computer parts like screen, power supply, or drive. Cognigy organizes these key phrases into Lexicons, which let you define and categorize these domain-specific terms. Why do we need NLU and what are its alternatives? Could we make bots that address customer needs and create a good user experience without any machine learning models at all? Well, in some cases, giving users predefined choices could be a more efficient approach to serving them rather than making them provide natural language inputs.
For example, if I'm checking on my flight status and the bot needs to confirm that the flight is UA 7371, it might be easier to just present a yes button and a no button. Particularly on mobile devices, tapping is faster than typing. Or take booking movie tickets: it might be easier for users to click a movie title, a showtime, and the number of tickets rather than typing out I want three tickets to see a specific movie at a specific time. But imagine the same interaction through a voice channel. On the phone, I don't always have visual choices in front of me, so it may be much more efficient to speak my request rather than going through a questionnaire-style interaction. So when and how you use NLU will be determined by the nature of the task the user is trying to accomplish and the context in which they'll be interacting with the bot. NLU models are often optimized for specific languages. When you create a new Agent in Cognigy, one of the first things you do is choose an NLU language. Cognigy.AI's onboard NLU is pre-trained with curated data from over a hundred languages to support Intent recognition and key phrase detection. Any other natural or artificial language is supported based on language-agnostic NLU algorithms. For 20 of the most common languages, Cognigy.AI provides pre-built entities that allow automatic processing of inputs like dates, currencies, and others specific to the language that's defined in the Flow. The Generic Language option extends the Cognigy NLU's compatibility to over 100 languages by employing an advanced multilingual machine learning model. The Generic Language option allows you to do Intent mapping in any language, which, along with key phrase detection and our powerful rule-based Intent engine, enables you to build Flows in any language.
Let's make this more practical by looking at how to build Lexicons and Intents in Cognigy.AI. We'll start with Lexicons. When you create a new Agent in Cognigy, you can choose one of our pre-built Lexicons on the Premade Skills page of the wizard. You can choose things like Cities, Airport Codes, Holidays, and more. You'll find those Lexicons, and have the ability to create your own, if you go to the left bar on your Agent dashboard and select Build > Lexicons. If I look at my Airports Lexicon, I'll see a list of every airport code, and I can sort and search them. If I search for SFO, for example, I'm taken directly to that entry. You'll notice that each airport code is associated with a Slot. A Slot is a technical term used when discussing natural language processing that refers to types of attributes. So cities could be a Slot, as could airport codes or flight statuses. Think of Slots as just a category name for a group of words. You'll see in Cognigy that we sometimes name things in ways that exclude spaces, so the airport code Slot's name has an underscore rather than a space.
You may also see times where two words are combined into one with the second word capitalized. Most of the time this is not necessary, but there are circumstances where having a single word makes it easier to work with CognigyScript, for example, so you'll often see that as a naming convention. I'll create a new Lexicon by going to Build > Lexicons and clicking New Lexicon. Sticking with our aviation theme, I'll call it Airlines. To start populating my Lexicon, I can type in a new key phrase and press enter. For my Slot name, I'll call it airline. I can also add synonyms, which could be all the different names a user may use for an airline, like United or UA. If Cognigy sees that a synonym is used in a user input, it will treat that as a match for the key phrase. A user who says I have a flight on United will get a match for the full key phrase United Airlines. Another example could be adding city names as synonyms for airport codes. So for the airport code SFO, I might add San Francisco as a synonym. That way, if the user says I have a flight departing San Francisco, the bot will see that as a reference to the airport code SFO. After I type each name, I press enter to submit the new word. To delete a word, I can click the X next to it. I could build my Lexicon one word at a time in this UI, or I can upload a Lexicon from a CSV file. This can make things easier when you want to copy and paste a large number of key phrases, for example.
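Synonym matching can be pictured as a lookup from any synonym back to its key phrase. This Python sketch is not how Cognigy implements it; it just illustrates the idea using the United Airlines and SFO examples above:

```python
# Map each synonym to its Lexicon key phrase (illustrative data only).
synonyms = {
    "united": "United Airlines",
    "san francisco": "SFO",
}

def find_keyphrases(user_input: str) -> list:
    """Return the key phrases whose synonyms appear in the input."""
    text = user_input.lower()
    return [phrase for syn, phrase in synonyms.items() if syn in text]

print(find_keyphrases("I have a flight on United departing San Francisco"))
```

A mention of either the key phrase itself or any of its synonyms counts as a match for the full key phrase.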
Let's build a Lexicon and upload it. First I'll find a list of airlines by just searching the web.
I will copy and paste this list into Excel and filter out the rows I don't want.
In column B, I need to put the Slot name; I'll make it airline. And I'll save it as a CSV.
And now I'll upload it to Cognigy.
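As an alternative to Excel, the same CSV could be generated in code. This Python sketch assumes a simple key-phrase/Slot/synonym column layout; check the Cognigy documentation for the exact upload format before relying on it:

```python
import csv

# Illustrative subset of the airline list gathered from the web.
airlines = ["United Airlines", "Lufthansa", "Delta Air Lines"]

# Assumed layout: column A = key phrase, column B = Slot name,
# column C = synonyms (left empty here). Verify against the Cognigy docs.
with open("airlines.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for name in airlines:
        writer.writerow([name, "airline", ""])

# Read it back to confirm the structure.
with open("airlines.csv") as f:
    rows = list(csv.reader(f))
print(rows)
```

Generating the file in code is handy when the list comes from another system rather than a copy-paste from the web.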
So in just a few minutes, I have a new Lexicon with 175 different airline names. I could have added synonyms to column C in my spreadsheet, or I can enter a few here in the user interface. If you want to add multiple tags or synonyms to your spreadsheet, or if you want to include data with your key phrases, you can do so. Search for CSV in the Cognigy documentation to find the detailed format for Lexicon uploads. With this long list, I can page through the results and control the number of rows per page. To delete an entry, you have two options: you can delete a single entry by clicking the down arrow on the far right side of the row and clicking delete, or you can click the checkbox on the far left of the row and then click delete at the bottom of the screen. Cognigy also has some built-in Slot types. They are similar to the built-in Lexicons but are more rule-driven rather than having a specific vocabulary. Cognigy automatically recognizes dates, numbers, durations, temperatures, ages, emails, and percentages. You might recall we saw some of these as built-in Tokens in our Say Nodes and If Nodes. I'll talk about how to use your Lexicons in a few minutes when we build Intents and test them out. Before we do that, I want to raise a point about organizing your Lexicons. In the example I've shown, I had separate Lexicons for airport codes and airlines, but I could have combined those into a single Lexicon. Because I can categorize key phrases using Slots, I can have any number of Slot types in a single Lexicon. Lexicons aren't inherently tied to a single topic. They're just a convenient way of organizing our lists of key phrases. You can even use the same Slot in different Lexicons if you want to. So what should your approach be? A single Lexicon with all of your key phrases, or multiple Lexicons organized topically? In general, you should organize your Lexicons topically. The reason to do this is that you may want to export and import your Lexicons into another project, and having multiple topic-oriented Lexicons lets you export just the category you want. If you are editing all of your Lexicons in Excel, it also lets different subject matter experts manage their own lists of terms without having to merge them back into a single Lexicon. That said, you might have one Lexicon that covers all of the various minor topic areas that have just a few key phrases. In terms of Slot names, it's helpful to have a standard naming convention you're working with. I recommend Slot names that start with a category word, if it makes sense, have no spaces, and describe one row of the Lexicon. For example, I would generally prefer airport_code and airport_name to Airport Codes and Names of Airports. How you organize and name Lexicons, Slots, and key phrases is up to you, and it's helpful to give it some thought in advance so that everyone on your team has the same rules in mind. Key phrases are one ingredient of Cognigy NLU. Let's turn our attention to the other ingredient: Intents. An Intent captures the main idea of a user's objective when they type or speak something to your bot. So if the user says, is my flight on time, or is my flight running late, they need the Flight Status Intent.
Or if they say I want to go to San Francisco on the 28th, or I need to go from SFO to DUS, they need the Book a Flight Intent. In Cognigy, we create Intents using the Intent Editor. To find the Intent Editor, go to your Flow and click NLU at the top of the screen. Let's start by creating a flight status Intent. I'll click Create Intent and name it Flight Status.
Now, I need to give Cognigy some examples of how users might ask about their flight status. Things like is my flight on time? How late is my flight? Is my flight delayed?
I'll save and click Build Model. We need to build the model in order for Cognigy to take a free-form user input and match it to an Intent. We can test this out in the interaction panel. I'll open the interaction panel by clicking the bubble in the top right corner. Now I'll do something new: I'll click on Settings here and turn on Expert Mode. Expert Mode gives you feedback on things like Intent matches and Slot matches so you can see how the bot is interpreting a user's input. The Expert Mode messages only appear here in the interaction panel and won't show when an actual user is on the Webchat on your website or on any other channel. Now I'll go back to the Chat to talk to my bot, and I'll ask, I need to check if my flight is on time. Cognigy gives us the feedback that the Flight Status Intent was matched, and it gives us the Intent score. The Intent score describes Cognigy's level of confidence in the Intent match. Let me try something else, like how late is my flight?
That got an Intent score of one because it exactly matches an example sentence.
I want to enable Cognigy to use the Lexicons I created, so I'll go to Attach Lexicon and enable the Airports and Airlines Lexicons. I need to build the model again for these changes to be applied.
What if I say " Is my United flight to SFO on time", you'll see that I get two key phrase matches for both the airline and the airport.
Let's create a second Intent called Lost Baggage and add some example sentences like is my baggage delayed or where are my bags. I'll save and build the model.
And I'll test. We still get the right Intent match, even though there is some overlap in the example sentences. Flight Status has the phrase "is my flight late" and Lost Baggage has the phrase "is my baggage late". But Cognigy is smart enough to know the better match. I'll create one last Intent called Baggage Carousel.
And I'll save and train. To test, I'll say I went to the carousel, but my bags aren't there.
One thing you will see is that there are many common English words that are also airport codes. We can ignore that for now, as it won't affect the Intent detection. More importantly, however, we got the Baggage Carousel Intent, but this really sounds like it should match Lost Baggage. The passenger knew the right carousel but still couldn't find their bags. So we need to look at our Intents and figure out what went wrong. Cognigy provides indicators that can guide you to fix misunderstood Intents. You'll see red, green, and yellow lights that characterize the quality of the example sentence or the Intent. Note that red or yellow lights aren't necessarily bad, but they do give you good starting points when you aren't getting the accuracy you expect. If I look at Baggage Carousel and Lost Baggage, I see that they both have red lights next to them. Baggage Carousel and Lost Baggage both indicate that there are strong overlaps with each other. There are different ways we can solve this. First, if I look at my Lost Baggage Intent, I don't see the word carousel in my example sentences, so I could add a sentence here that includes that word. I can also look at my red light example sentences, and I see that where are my bags and can't find my bags are present in both Intents. This could be bad and indicate that we need to distinguish the example sentences, or it could be OK, because I can't find my bags could potentially mean two different things in different contexts. We can control this with States in Cognigy, a topic we won't touch on in this session. I'm going to do two things: I'll remove can't find my bags and where are my bags from my Baggage Carousel example sentences, and for the Lost Baggage Intent, I will add a mention of the carousel.
Now, I'll build the model and test.
Now I get the Lost Baggage Intent, which is what I expect.
The point of this exercise is to give some guidance about your example sentences. When you're creating your sentences, think about what makes a sentence unique for this Intent versus other Intents. What content distinguishes these example sentences as fitting for this Intent. Note that example sentences don't have to be sentences, they can be phrases or sentence fragments as long as they are useful data that can help Cognigy build an accurate model.
I want to show one other way you can improve your Intent detection. If I enter this example sentence.
I get the right Intent match, but the Intent score is fairly low, less than 50 percent. How can I address this? I could add this example sentence, but there are many variations of it. I can't have one example sentence for every combination of airport codes. I can address this with annotation. I'll add the example sentence, check the status of my flight from SFO to ORD, and then I'll highlight SFO and click the plus sign next to it. This lets me tell Cognigy that SFO is a match for my Lexicon Slot airport_code. I will do the same thing with ORD.
I can do this with airlines as well.
Now, I'll build the model and test another example sentence.
I get a very high Intent match, even though LAX and DUS aren't anywhere in my example sentences.
I've been editing all of these Intents in the Intent Editor, but I can also upload and download them as I did with Lexicons. I'll upload a larger set of Intents now so we can see what it's like to work with a larger data set. I'll turn off my Airport and Airline Lexicons, as I won't need them for this new set of Intents.
One thing you'll note is that the Intents have been categorized by naming convention. So the alarm-related Intents all have alarm at the beginning, and the audio volume Intents start with audio volume.
This is a good approach as the number of Intents you manage increases. In Cognigy, you can also use hierarchical Intents to manage large numbers of Intents. As an example, I'll create an alarm Intent. Now, I'll drag all my alarm Intents under that Intent. I just drag the Intent name onto the alarm Intent and it creates a hierarchy.
I'll do that for audio as well.
As you can see, this creates a nice visual categorization that can make it easier to find Intents. I can create new child Intents by going to any Intent and choosing create child Intent. Organization is one reason to have Intent hierarchies, improved Intent detection is another.
Let's look at an example, I'll say cancel this Friday's ballet class.
I get an 87 percent Intent match, but it's actually for the wrong Intent. It identifies Calendar Set as the Intent rather than Calendar Remove. Let's organize it hierarchically and see what results we get. This time, for the calendar parent Intent, I will turn on Inherit example sentences from child Intents and save. Then I'll build the model.
Now, when I repeat my statement "cancel this Friday's ballet class", I get the expected Intent match calendar remove.
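As a rough mental model, the hierarchical match can be pictured as a two-step lookup: first find the best parent category, then the best child Intent within it. This Python sketch uses naive word-overlap scoring purely for illustration; the real Cognigy NLU is a trained machine learning model, and the example sentences here are made up:

```python
# Toy two-step hierarchical matcher (illustration only).
intents = {
    "calendar": {
        "calendar.set":    ["add ballet class to my calendar friday"],
        "calendar.remove": ["cancel my ballet class", "remove the meeting"],
    },
    "audio": {
        "audio.volume_up": ["turn the volume up", "louder please"],
    },
}

def score(utterance, examples):
    """Naive score: best word overlap between the utterance and any example."""
    words = set(utterance.lower().split())
    return max(len(words & set(e.lower().split())) for e in examples)

def match(utterance):
    # Step 1: parent level, using the inherited child example sentences.
    parent = max(intents, key=lambda p: score(
        utterance, [e for exs in intents[p].values() for e in exs]))
    # Step 2: granular match among that parent's children only.
    return max(intents[parent], key=lambda c: score(utterance, intents[parent][c]))

print(match("cancel this friday's ballet class"))
```

Because the parent decision is made first, unrelated categories like audio are ruled out before the finer-grained comparison happens.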
How did creating hierarchical Intents take me from a misunderstood Intent to a correct Intent match? With hierarchical Intents, Cognigy takes a somewhat different approach to Intent matching. When I enable Inherit example sentences from child Intents, Cognigy treats all of the example sentences from the child Intents as part of the parent Intent. Cognigy first tries to find an Intent match from all of the highest-level Intents. Once it gets a match at the parent level, it then goes to the child level to do a more granular analysis. Think of it as a two-step process: Cognigy NLU first tries to find a category-level match. Once it finds the right category, it does an Intent match just on the Intents that are part of that category. Sometimes Cognigy finds an Intent match that is close but doesn't quite meet the confidence threshold. In those cases, we have the ability to ask users to confirm that we have the right Intent match. Let's look at our cooking recipe Intent as an example. If I go down to the Advanced section, I'll see the option to add a confirmation sentence. When a user sends an utterance to Cognigy, the NLU will attempt to find an Intent match and attach a score to it, as we've seen. If the score doesn't quite reach the confidence threshold but does meet the reconfirmation threshold, Cognigy will output the confirmation sentence, typically a question. If the user responds to the confirmation sentence affirmatively, Cognigy will remember the confirmation throughout the rest of the conversation. If a certain number of users confirm the sentence, the sentence will be added to the list of example sentences trained for this Intent. Let's see a concrete example of this. I will add the confirmation sentence "You have a recipe question?" to my cooking recipe Intent, and I'll build the model. To demonstrate this feature, I need to make some changes to the default thresholds Cognigy has for confidence and reconfirmation.
To do this, I go to Manage and then Settings in the left bar. Under Thresholds, I see a couple of sliders, and I'll make some adjustments to them. I'll move the confidence threshold to 90 percent and the reconfirmation threshold to 80 percent.
And then I'll save my changes. I'll ask, how do I make five-alarm chili? It now asks a reconfirmation question.
This is because I have an Intent match greater than 80 percent but less than 90 percent. I'll answer yes, and now I have a match for the cooking recipe Intent. This feature helps you address those borderline cases where we have a close but imperfect Intent match. An important note here: if you haven't defined a reconfirmation sentence for your machine learning Intent, the reconfirmation threshold is used to determine an Intent match. So the reconfirmation threshold is used in two different ways depending on the existence of a confirmation sentence. With a confirmation sentence, the reconfirmation threshold triggers a reconfirmation sentence. Without a confirmation sentence, the reconfirmation threshold serves as the confidence threshold. We've talked a lot now about Intent detection, but how do we respond to users once we know their Intent? You have two options: handling responses in Flows and handling them in Default Replies. You handle Intents in Flows when you have some logic you want to execute. So, for example, you may want to ask a follow-up question or retrieve data from an external system. But if you just have a simple response to a simple question, you can put it in a Default Reply. As an example, let's look at the general praise Intent. This Intent lets the bot receive a compliment. Let's respond by saying thank you. I'll go down to Default Reply and I'll see something that looks very similar to a Say Node. I'll just say thank you and save.
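The two roles of the reconfirmation threshold described above can be sketched as a small decision function. The 0.90 and 0.80 values mirror the slider settings used in the demo; the function itself is an illustration, not Cognigy's actual implementation:

```python
CONFIDENCE_THRESHOLD = 0.90
RECONFIRMATION_THRESHOLD = 0.80

def decide(score, confirmation_sentence=None):
    """Simplified sketch of how an Intent score might be handled."""
    if score >= CONFIDENCE_THRESHOLD:
        return "match"
    if score >= RECONFIRMATION_THRESHOLD:
        if confirmation_sentence:
            # Borderline score: ask the user, e.g. "You have a recipe question?"
            return "ask_confirmation"
        # Without a confirmation sentence, the reconfirmation threshold
        # acts as the effective confidence threshold.
        return "match"
    return "no_match"

print(decide(0.95))
print(decide(0.85, confirmation_sentence="You have a recipe question?"))
print(decide(0.85))
print(decide(0.50))
```

An 85 percent score asks for confirmation when a confirmation sentence exists, but counts as a direct match when it doesn't.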
Now, if I tell the bot, you are great, it says thank you, which comes from my Default Reply.
One last note about the Intent Editor: you'll see the Build Model button has a dropdown arrow next to it that lets you do a Quick Build. This option builds a machine learning model for your Intents that doesn't contain specifications for States and Intent Conditions, and is thus up to ten times faster at building your model. We could have used this option throughout this session, as we aren't using States or Intent Conditions. There are a number of topics we don't cover in the Foundation course that relate to Intents. Those include rule-based Intents, Intent entry points, attached Flows, maintaining Intents in different Flows, triggering Intents automatically, and using external NLU. We'll cover those in other courses, and you can learn more about them in the documentation. Before we leave the topic of Intents, I want to talk about how to monitor and improve Intent performance. The reconfirmation threshold feature helps Cognigy learn on its own, but how can you take part in the NLU's continued education? There are two tools for doing this in Cognigy: the Intent Trainer and Cognigy's analytics. You can find the Intent Trainer in the left bar under the Tweak category.
The Intent Trainer presents the collected user inputs and gives you the option to update your example sentences based on how users are actually interacting with your bot. For example, I can see that two people have asked "How do I make five alarm chili?" It triggered the correct Intent, but I can consider adding it to the cooking recipe Intent to make sure there's still a match even if other example sentences change over time.
I can click Add to Intent and choose the Flow and the Intent I want to associate this example sentence with. I can also make adjustments to the example sentence. Sometimes the Intent Trainer can serve as inspiration, but I may not want to add the user's input verbatim to my example sentences. I can change the text and even add annotations, create new key phrases, or create new synonyms. When I save my change, Cognigy queues it up and lets me keep working through the list. I can continue one by one, adding, skipping, or ignoring user inputs. If I skip an utterance, it moves to the skipped records list, but it will show up here again if a user enters the same input. If I click ignore, it moves the input to the ignored records list, and it won't show up on this list anymore. When I click apply, Cognigy saves my changes and I can continue to work through my list. I can use the helpful filters at the top of the list to target specific Flows or Intents, or to move from my unhandled list to my handled, ignored, or skipped lists. One final feature worth mentioning: you can click on the vertical ellipsis to import or export trainer records. This can be helpful if you have a non-production environment that you use for reviewing training data and training your model. It lets you do the more data-intensive work of training your model in an environment that isn't accessed by your end users.
The other approach to analyzing your Intent performance is to view the raw records in Excel. To do this, search for OData in the Cognigy documentation. This will let you view every user input and filter, for example, for records that don't have Intent matches. And that's all for now regarding Intents in Cognigy. We've covered a lot of topics, and I hope you'll look for more in the Cognigy documentation and Help Center.
Chapter 6: Creating conversations with Flows, Intents and Slots
Combine the learnings of all previous chapters into one smart & powerful Virtual Agent.
You'll find an example CSV file at the bottom of this article if you want to follow along.
- United Airlines Carry-on Baggage Rules
- Example sentences for the Baggage Rules - Carry On Intent:
Can I take my bag on board
What are the carry on rules
What are the limits for carry-ons
Carry-on limitations
Size limits for carry on bags
- Example sentences for the Baggage Rules - Checked Bag intent:
Can I check a bag
What does it cost to check a bag
Bag check fees
Cost for checking a bag
Cost extra to check a bag?
- Baggage (see also the attached file baggage_type.csv):
Hand bag, Hobo bag, Tote bag, Duffle bag, Messenger bag, Backpack, Satchel, Doctors bag, Laptop Bag, Bucket bag, Bowler bag, Wristlet, Pouch, Clutch, Beach Bag, Shoulder bag, Miniaudiere, Shopping bag, Grocery bag, Drawstring bag, Make up bags, Foldover Bags, Phone bag, Camera case bags, Baguette bags, Barrel Bag, Basket Bag, Fanny Pack, Kelly Bag, Lunch Bags, Golf Bag, Ski Bag
- Baggage images: Measuring a bag, handbag
- Carry-On Bag Description:
Your carry-on bag should fit in the overhead bin. The maximum dimensions for a carry-on bag are 9 inches x 14 inches x 22 inches (22 centimeters x 35 centimeters x 56 centimeters). This includes handles and wheels.
Your personal item should fit underneath the seat in front of you. The maximum dimensions for your personal item are 9 inches x 10 inches x 17 inches (22 centimeters x 25 centimeters x 43 centimeters).
- HTML Hyperlink Generator
- Link to United Airlines Baggage Calculator:
<a href="https://www.united.com/en/us/baggage-calculator/any-flights" target="_blank">Baggage Calculator</a>
- Luggage image: Conveyor belt
Let's bring it all together by building out a conversation about the baggage a traveler wants to take on a flight. A traveler wants to know if they can bring a bag onboard or if they need to check the bag. If they do need to check the bag, they want to know if there are any fees involved. To build this out, we'll introduce a new concept, the LookUp Node. I'll talk about LookUp Nodes later in this video. How do we get started? A good way to approach building out conversations is to actually write out the dialog. You can start by writing what is often called the "happy path", meaning how the conversation looks if everything goes right and the user and the bot understand each other clearly. For this scenario, it will go something like this. The user will say, "Can I bring my bag onboard, or do I have to check it?" The bot will say you can bring one bag and one personal item on the plane, your carry-on bag should fit in the overhead bin, and then a lot of details about what a carry-on bag and a personal item should look like. The user might respond, "It's a golf bag. Do I have to pay extra if I need to check it?" The bot can ask, "Are you flying first class?" and the user will respond yes or no. The bot can ask how heavy the bag is, and based on the weight that the user specifies, it can either be free, there could be extra fees involved, or they might not be able to bring the bag on the flight after all. In real life, we might make this more of a personalized interaction, finding out the user's individual bag allowance for their flight, knowing their status in our frequent flyer program, and giving them the option to pay any additional baggage fees from within the chat itself. But if this is an anonymous user and we don't know if they've even booked a flight yet, this is a potential interaction we could have that can provide better service than calling our contact center and waiting on hold.
It's helpful to write all of this out so you can start with the dialog in mind before you build your Flows. We'll ask a variety of questions and provide intelligent responses in the Flow we'll build in this session. Imagine that we've written out the happy path. Now we can identify the Intents and Lexicons we will need for our Flow. We have two Intents here; we'll call the first one Baggage Rules-Carry On and the second one Baggage Rules-Checked Bag. Note that I'm including the category name in the Intent name to help organize my Intents in the Intent Editor. I don't necessarily need any new Lexicons here, but I'll continue to use my airport codes Lexicon in case users say something like "Can I bring my suit bag on my flight to the US?" or "Can I take my skis onboard for my trip to DEN?" It's also good to think about the Channels I want to share this on. Will it be Webchat, SMS, or Voice? For now, I'll assume we want this just for Webchat, but I want to keep other Channels in mind as I'm building out the Flow in case I want to expand my Channels in the future. That's all I need to think about for now, so I'll start building my Flow. I'll go to NLU at the top to open the Intent Editor. My first Intent will be the Carry On Intent, and I'll create some example sentences. I'll save and build. For my Carry On Intent, I just need to give a response to the user. You may recall that I can handle basic question-and-answer scenarios in a Default Reply. This means I don't have to add to my Flow to respond. I'll add my response, save, and build. Now let's test it in the Interaction Panel. I'll ask, can I take my golf bag onboard? Note that nothing happens. I have Expert Mode on, which means I would see if there were an Intent match. How can I improve the Intent? Well, I could simply add the example sentence to my Intent, and that would give me a 100 percent match. I could add a number of variations for different kinds of bags like suit bags, purses, and ski bags.
Or I could create a Lexicon with all of the potential bag types. It's a little more work, but I might find this Lexicon useful as I implement other dialogs, so I'll go ahead and create it. I'll go to Build and then Lexicons and create a new Lexicon called Baggage. I searched online and found a list that I'll use as a starting point.
I'll put these into a CSV file, add my Bag Type Slot name, and upload.
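As a rough sketch of how such a file could be assembled (the column layout here is an assumption for illustration only; check Cognigy's documentation for the exact CSV format it expects on Lexicon import), a few lines of Python can generate it:

```python
import csv

# A few of the bag types from the list above (truncated for brevity).
bag_types = ["Hand bag", "Tote bag", "Duffle bag", "Backpack", "Golf Bag", "Ski Bag"]

# Hypothetical layout: one keyphrase per row, tagged with the Slot name.
# Cognigy's real import template may use different columns.
with open("baggage_type.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["keyphrase", "slot"])
    for bag in bag_types:
        writer.writerow([bag, "Bag Type"])
```

Generating the file with a script makes it easy to regenerate later when you add synonyms or new bag types.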
Now I'll add a couple of synonyms: camera bag for camera case bag and computer bag for laptop bag. Now we'll go back to my Flow and click on the NLU tab. To use the Lexicon I just created, I need to select it under attached Lexicons. I'll also make sure my Airports and Airlines Lexicons are enabled. Now I'll update my Intents. For the first example sentence, I'll annotate it: I'll select bag, click the plus sign, and then select my Bag Type Slot. While I'm at it, I will also add an example with airport codes. Let's build the model and test again. And I get the Intent match I expect. I'll try with another bag type. This is good: I'm getting a high Intent score and Cognigy is finding my Slot matches. We also see a false positive: Cognigy sees the word "Can" and matches it to an airport code. That doesn't affect the Intent match at all, so we can ignore it.
We're giving some useful information to the user, but it's a lot of text presented all at once. It could also benefit from some images to help the user visualize what a compliant bag might look like. So instead of just plain text, let me add some images. I will change my Default Reply to a Gallery, and I'll add two cards, one for carry-ons and one for personal items. I'll add my image, title, and description for each. I'll save, and let's take a look. I like how this breaks up the information and provides some visuals for the user, but I want to provide a little more context when I present these options. It could be that a Default Reply isn't going to provide the user experience I want, so I'll handle this in a Flow instead. That will let me provide multiple outputs to the user and a complete contextual response. In my Default Reply, I'll click reset to clear the Default Reply, and then I'll save. I'll switch over to my Chart view and see what I can create here. When I was just using Default Replies, Cognigy would detect my Intent, respond, and not bother with the Flow. But now that we're working in a Flow, we need to make sure to only give a response to the user who triggers the Carry On Intent. I could start off my Flow with an If statement. Let's try that. I'll add an If Node and make the condition be Intent equals Baggage Rules-Carry On. If I get an Intent match, I'll output some messages to the user. I'll create two Say Nodes. In my first Say Node, I'll give the user a general response, and then I'll provide some details. Now I'll add my nice visualizations and my detailed information. Let's test that. I'll ask, can I bring my ski bag? I get both Say Nodes now, and I'm pretty happy. One small change could be to add a little pause between Say Nodes. To do this, I'll add a Sleep Node, which just puts a pause in the conversation. I'll add the Sleep Node after my first Say Node.
Now we'll enter a number of milliseconds for the pause, a millisecond is a thousandth of a second, so just imagine the number of seconds you want and multiply it by a thousand. I'll add a three-second pause, so I'll enter three thousand here.
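The conversion above is simple enough to capture in a throwaway helper (this is just a sketch of the arithmetic, not part of Cognigy):

```python
def seconds_to_ms(seconds: float) -> int:
    """Convert a pause length in seconds to the milliseconds a Sleep Node expects."""
    return int(seconds * 1000)

print(seconds_to_ms(3))    # a three-second pause -> 3000
print(seconds_to_ms(0.5))  # half a second -> 500
```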
Let's see how it works.
Now you'll see there is a little bit of a pause, giving the user a chance to start reading the first bubble before Cognigy presents the next ones. Ok! I like that. Let's take a step back as we think about our next Intent. I've created an If statement here that handles the Carry On Intent. I could handle my other Intent on my Else branch and check the Intent again. But this Flow could have dozens of Intents, and my Flow will be really big and confusing if I have to have dozens of If Nodes. An alternative to an If Node is a LookUp Node. LookUp Nodes are good when there are more than two options, which makes them ideal for handling Flows with multiple Intents. I'll add one after my Start Node. What you'll see is the main LookUp Node with three Nodes underneath it. The LookUp Node itself specifies a value. I'll click on it so you can see. In the type field, I have Intent selected, which means that Cognigy is going to check the Intent that's been identified. Each of the Case Nodes underneath the LookUp Node will address a particular value for the Intent. I'll click on the first Case Node and choose the Intent I want to handle. I'll choose "Baggage Rules-Carry On" and save. Now I'll drag all of those Say Nodes under my Case Node, and I'll delete my If Node since we don't need it anymore. When I test "can I bring my backpack on board", I get the same result, which is what I want. Now I want to handle my checked bag Intent. I'll click on the second Case Node and add it there. What do I do if I want to handle additional Intents here? I can just click the plus sign next to the LookUp Node to add new Case Nodes, and I can keep adding as many as I want. Note that the first Node is called Default. Under the Default Node, I can put any message or logic that I want if I don't get one of the Intent matches. So I'll add a Say Node here that can handle the case where we don't get a match. For now, I'll just say, sorry, I didn't get that.
You can ask me about carry-on bags or checked bags. Now let's test it out one more time.
Can I bring my backpack onboard? And if I say, is my flight late, I'll get the default message.
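In code terms, the move from chained If Nodes to a LookUp Node is the move from an if/elif chain to a dispatch table keyed by Intent name. A minimal sketch (the handler functions and their response texts are invented for illustration):

```python
# Each Case Node becomes an entry in a dispatch table; the Default Node
# becomes the fallback handler. Handler names and texts are hypothetical.
def handle_carry_on() -> str:
    return "You can bring one carry-on bag and one personal item."

def handle_checked_bag() -> str:
    return "I can help determine your baggage fees."

def handle_default() -> str:
    return "Sorry, I didn't get that. You can ask me about carry-on bags or checked bags."

cases = {
    "Baggage Rules-Carry On": handle_carry_on,
    "Baggage Rules-Checked Bag": handle_checked_bag,
}

def lookup(intent: str) -> str:
    # Any intent without its own Case Node falls through to Default.
    return cases.get(intent, handle_default)()

print(lookup("Baggage Rules-Carry On"))
print(lookup("Flight Status"))  # no Case Node, so the Default reply fires
```

Adding another Intent is just adding one entry to the table, which is exactly why a LookUp Node scales better than nested If Nodes.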
Alright, now let's address the Checked Baggage Intent. For this one, I'll give different responses depending on whether or not they're in first class and how heavy their bag is. So I'll add a Question Node and it will say "I can help determine your baggage fees, are you flying first class?" I'll keep it as a yes/no question and save. I'll add an If statement to determine whether they said yes, and I'll respond if so. I'll use my Last Question Result Token and select "is yes" as my condition. On my Then branch, I'll let the user know the bag policy for first-class travelers. If they aren't flying first class, I want to know the weight of their bags, so I'll add a Number Question. For my responses, there are three different categories: 50 pounds or less, between 50 pounds and 70 pounds, and more than 70 pounds. I'll add a series of Ifs to address these. In my first If, I'll see if the result is greater than 70. Then I'll add a Say Node and let the user know that the airline does not accept bags over 70 pounds / 32 kilograms. Now I'll add another If to see if the bag is 50 pounds or greater. And now I'll add my remaining Say Nodes to complete the dialog. If the bag is less than 50 pounds, I'll give the user a link to view more information. You may recall from the session on Say Nodes that I can do this either with buttons or with HTML links. Since I haven't presented buttons for any of my other answers, I'll just embed an HTML link. I'll go to the website that helps me create links, get my link, and add it to my Say Node. Hopefully, this makes sense: I first check if the bag is over 70 pounds; if not, I check whether it's 50 pounds or more; and if not, it must be less than 50. Let's test it out. Then I'll say no. Then I'll say 10 pounds, and here I get an unexpected result: the bot doesn't understand "10 pounds" as a Slot match for a number. What if I say 20 pounds? I get the same error.
Maybe it needs metric notation. I'll try 10 kg and it works. Why is this? Does Cognigy make you use the metric system? It's actually not that. I looked at the detailed Cognigy Context, something we'll look at in depth in the Flow Developer course, and found that Cognigy sees 10 pounds as a Money Slot. It's a match for the British currency rather than a number.
So how do I get around this? There are several approaches I could take. I could go to Manage and then Settings and disable the Money Slot. That works if I won't need to use the Money Slot in this Flow, but just in case I might want to in the future, I need another approach. I could present Quick Replies to the user: I can make three Quick Reply buttons and let the user click on an option. For my Postback value, I could send a number that matches each range. This is a nice solution, but it only works if the user clicks the buttons. If this were a voice scenario, for example, I wouldn't see those options. The simplest and most comprehensive approach would be to change the Question type to a Text question. This will let the user type in anything and will use Cognigy's built-in Slot matching to check the result. I'll go to each of my Ifs and change the Last Question Result to the First Number Slot. So if the user says twenty pounds, the First Number Slot will be 20. Let's test it out one more time. What have we lost by changing this from a number to text? Well, since the user can enter anything now, we have to account for them not entering a number. To keep things simple, I'll check to see if we have a number, and if not, I'll just give them a link to the website. I'll add a new If under my Question. For the condition, I'll choose First Number Slot and then I'll choose exists. This means that if the user entered a number, we continue; otherwise, we'll just give them the link. I'll add a Say Node off of my Else Node with the link information. Now I want to drag the next If onto the Then branch. All right, we've got it. First, we check if there's a number, and if there is, we check which range it's in and respond to the user accordingly.
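The finished logic, checking for a number first and then routing by weight range, can be sketched outside of Cognigy like this. The regex here is a rough stand-in for Cognigy's built-in Number Slot matching, and the reply texts are shortened paraphrases of the Say Nodes above:

```python
import re

def first_number(text: str):
    """Rough stand-in for the First Number Slot: the first number in the text, if any."""
    match = re.search(r"\d+(?:\.\d+)?", text)
    return float(match.group()) if match else None

def checked_bag_reply(user_text: str) -> str:
    weight = first_number(user_text)
    if weight is None:
        # No number entered: fall back to a link, as in the Else branch above.
        return "Please see our baggage calculator for details."
    if weight > 70:
        return "We do not accept bags over 70 pounds / 32 kilograms."
    if weight >= 50:
        return "Bags between 50 and 70 pounds incur an extra fee."
    return "Bags under 50 pounds are included. See the baggage calculator for details."

print(checked_bag_reply("20 pounds"))
print(checked_bag_reply("it's pretty heavy"))  # no number, so we offer the link
```

The ordering matters: checking "over 70" before "50 or more" is what lets each branch assume the earlier ranges were already ruled out.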
The reason I wanted to show this example of the logical but unexpected handling of 20 pounds is to highlight that we're working with language and collaborating with our AI to understand what users mean. Sometimes there are surprises, and we need to adjust our Flows to account for real-world dialog and interpretation. Ok, I want to add two more refinements to this Flow. The first is to add a welcome message that greets users and explains the scope of the bot. We can do this easily with a Once Node. A Once Node executes a set of Nodes one time in a conversation, which is perfect for a greeting. After my Start Node, I'll add a Once Node, and it automatically creates an On First Time branch and an Afterwards branch. Under the On First Time branch, I'll add a Say Node that welcomes people to the Flow.
I'll create a Gallery and add a card. And I'll save. I'll test it out by just saying hi.
It gives me a nice greeting, but then it tells me it didn't understand what I said. The reason, if you look at the Chart, is that the LookUp Node is executing immediately after the Say Node. If we drag the LookUp Node under the Afterwards branch, we fix that: the Say Node executes and the conversation then goes all the way to the end. The second refinement I want is to handle the Default Intent a little differently. Right now, if I say something like "I need to check my flight status", I'll get the "sorry, I didn't get that" response. I actually want to guide users towards questions the bot can actually handle by adding some Quick Replies that automatically trigger an Intent. I'll show you. I'll open the Say Node and change the output to Text with Quick Replies. I'll add two Quick Replies. For the first one, I'll make the button title Carry-On Rules. For the Postback value, I'll put cIntent: Baggage Rules-Carry On. I'll show you what this does in a minute. For the other Quick Reply, I'll make the button title Baggage Fees. For the Postback value, I'll put cIntent: Baggage Rules-Checked Bags. This notation, cIntent followed by the name of the Intent, tells Cognigy to automatically trigger the Intent match. When we test it out, I'll say hi, then flight status. Then I'll click Baggage Fees, and it triggers the right Intent. You'll also notice the bubble that says "cIntent: Baggage Rules". Don't worry, this doesn't show up in the Webchat. Let's take a quick look. In the Webchat, Cognigy will output the button title and take the user to the right Intent. If I were to improve this Flow one step further, I would take all of the logic we created for the bag check fees, all of those nested If statements, and put it in its own Flow. In the Flow Developer course, we'll learn how to do this so we can keep our Flows simple, easy to manage, and straightforward to understand. There's so much more to learn about creating Flows in Cognigy.
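The quick-reply trick can be pictured as a message whose postback carries the cIntent marker. The structure below is a simplified sketch, not Cognigy's actual message schema; only the "cIntent:" postback convention comes from the video:

```python
# Simplified, hypothetical shape of a text-with-quick-replies message.
quick_reply_message = {
    "text": "Sorry, I didn't get that. You can ask me about carry-on bags or checked bags.",
    "quick_replies": [
        {"title": "Carry-On Rules", "payload": "cIntent: Baggage Rules-Carry On"},
        {"title": "Baggage Fees", "payload": "cIntent: Baggage Rules-Checked Bags"},
    ],
}

def resolve_postback(payload: str):
    """If a postback starts with 'cIntent:', treat the rest as the Intent to trigger."""
    prefix = "cIntent:"
    if payload.startswith(prefix):
        return payload[len(prefix):].strip()
    return None

print(resolve_postback(quick_reply_message["quick_replies"][1]["payload"]))
```

The user only ever sees the button titles; the cIntent payload travels behind the scenes and forces the Intent match.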
The objective in this video was to take some of the basic concepts (Intents, Lexicons, Flows, Say Nodes, Question Nodes, If Nodes, and LookUp Nodes) and create a Flow that provides a complete experience. I hope it was helpful to see how to start with use cases and make your way into Flow development.
Ready for some advanced bot building? Continue with our Cognigy.AI Flow Developer Training that is also available in the Help Center.