In this episode, we introduce and explore the analytics capabilities of Cognigy.AI, explain how to get the most out of Cognigy Insights, and show how to access your virtual agent data.
We explain the core features of the Cognigy Insights dashboards and show how to build your Virtual Agent for maximum analytical benefit. We introduce the Flow-building tools that assist with analytics customization and cover the best practices for gathering insightful, uncluttered data.
Furthermore, we introduce Cognigy's conceptual approach to analytics records and demonstrate how to build an OData query. We show how to use the OData query to import the analytics records from a Cognigy.AI virtual agent into a BI tool. We also explore multiple options for masking sensitive information collected by the virtual agent, across both the analytics records and the project logs.
Join the Cognigy.AI User Community
Do you have a question or want to share your experiences? Join the Cognigy.AI User Community to connect with your peers or get help from our Conversational AI specialists!
Welcome to Cognigy Sessions, our Techinar series from Conversational AI experts for experts. In every episode, we do a deep dive into a different Cognigy.AI topic. We go much deeper than any marketing collateral and help you get the most out of your favorite Conversational AI platform. This session is about Cognigy Insights, our fully integrated Conversational AI analytics suite for 360° insights. We will show you how to get easy and intuitive access to conversational KPIs, how to gain a deep understanding of customer journeys, preferences, and behavior, and how to optimize your bot and contact center operations based on reliable data. This recording is also available in our Help Center. Please follow the link below to access additional resources, visit our community for questions and discussions, or start a free trial if you're not already a Cognigy.AI user. And with that, let's start today's session.
Hi, my name is Matt from Cognigy, and in this video, we're going to take a look at Cognigy Insights and Analytics and what you need to do to get the most out of these features when building your Agents within Cognigy.AI. If you want to follow along, all you need is access to Cognigy.AI; you can sign up for a free trial at 'signup.cognigy.ai'. I also highly recommend the Cognigy.AI Foundation and Developer training courses available at 'academy.cognigy.com'. These are free courses that anyone can sign up for and join at any time. And if you want to join in with the tech deep dive towards the end of the video, where we take a look at how to build analytics in a BI tool, it helps to have access to Microsoft Excel or something like Power BI or Tableau so you can build your own dashboards.
So, a short overview of what we are going to cover in this video: we're going to take a look at the features we have for analytics within Cognigy. We're going to take a look at Insights itself to see what dashboards and other tooling are available out-of-the-box in Cognigy. We're also going to look at which features you need to take into account when building agents to best suit your analytics needs. And then finally, we're going to have a deep dive into OData: what it is, how to access it, and how to build a dashboard with it. So, what features do we have for analytics in Cognigy.AI? Firstly, as I'm sure you're aware by now, we have Cognigy Insights, our dedicated analytics package that comes built into the platform and is deeply integrated with your Cognigy Virtual Agents. We also have an OData Analytics Endpoint, which exposes the raw data that feeds into Cognigy Insights. This is also accessible outside of Insights, so you can pull it into another tool for your own analysis. In addition to these two features, we have native connections with Dashbot and Chatbase, so you can pull your Cognigy.AI Virtual Agent information into third-party analysis tools such as these two providers. I'd also highly recommend you check out our documentation for the latest terminology and URLs, as things may change as we evolve the feature in the coming weeks and months. And, of course, our Help Center has some great articles to help you get the most out of the features we have available, not just for analytics but across the entire platform. So, let's take a brief dive into Cognigy Insights. We also have a product overview video available on our YouTube channel, so feel free to check that out if you're looking for a high-level overview of the tool.
In this section, we're going to take a quick look at the features we can use, as a best practice, to integrate with our bots, to improve them, and to understand what they are doing. Among the key features of Insights, we have, of course, some key dashboards that give you an overview of how your bot is performing. We have tooling for conversation Flow monitoring to ensure that you have a good understanding of where conversations are going and what direction they're heading in within your deployment. We also have some great options for viewing transcripts and seeing how conversations develop in bulk: what messages are being sent, how your bot is responding, and even how your human Agents are communicating with your end-users. Across the tool, we have a great global filtering feature that allows you to really drill down on specific areas of your bot. And of course, we have features for collecting user feedback and identifying user achievements throughout the conversations with the Virtual Agent. Throughout the tool, we have options to export this data as CSV. So, let's jump in and take a quick look at Cognigy Insights and how it all works. This is the Cognigy.AI Virtual Agent dashboard, which you should be familiar with by now if you have been building agents within Cognigy, and I'm going to show you how to access Insights. Firstly, you can access it from the user menu in the top right-hand corner via the Insights option. You can also click on the Agent Analytics arrow on the home dashboard, and, most commonly, you can access it through the Insights option in the left menu bar. After clicking on Insights, you will be taken to the Insights home dashboard. On the left-hand side, we have a menu with three main dashboards: the Overview, the Engagement dashboard, and the NLU Performance dashboard. We also have three additional tools: the Step Explorer, Transcript Explorer, and Message Explorer.
So, let's take a quick look at each of these three dashboards and the tools that we have.
Our Overview dashboard gives us some key insights into the general performance of our bot, including things like the conversation count, the number of conversations taking place daily, the user channel, the top Intents and locales, goals, etc. Most of these visualizations have an option in the top right-hand corner to export the data you see as a CSV file. So, if I click on this, I am given a downloadable file that includes the data points shown on screen for that metric. You will also notice that we have our filters in the top right-hand corner. If you click on the filtering button, you'll be given options to drill down and narrow your data to specific time frames, certain Endpoints, or even a given Snapshot or Locale. So, let's say I wanted to view the data across the last 365 days; I can very quickly do that and see when I launched my bot and how it took off with my market. I'm going to navigate back and change this back to the last thirty days, just so that we have a nice series of data to view throughout this video. And of course, these are global filters, so as I navigate through Cognigy Insights, they are applied to all the data that I am viewing. Next, we have the Engagement dashboard, which is really about assessing how our users and our conversations are performing throughout the deployment. We track, for example, how many unique contacts we have, how many live Agent escalations we have, and the top goals. And of course, we have a feature for ratings.
Ratings is a newly introduced feature for collecting NPS-style feedback. The Webchat widget now has a built-in thumbs-up/thumbs-down feature that allows users to rate the conversation. If you're not using the Webchat widget, you also have the ability to pass in user ratings by other means, either by collecting the user's feedback conversationally or by using other features and tools within your conversations. And as you navigate through the dashboard, you have tooltips that let you extract the key metrics, or more detailed metrics, for any given visualization. Here, for example, we can see 544 contacts on Monday, the 16th of August, so you can really narrow things down and extract the detail that you need from these visualizations. On the NLU Performance tab, we have additional information focused on the NLU performance of your bot. Of course, we have options for understood and misunderstood messages. We have our top Intents and our top Slots. And we have some nice visualizations, like a chart here that shows you how your Intents are scoring and where your average and max scores are; you can download that as a CSV file as well. One thing you may have noticed throughout the dashboard is that a lot of these visualizations have options for navigating within the tool. These options take you directly to the part of the platform that will help you improve the metrics shown in the given graph. Here, for example, in my Intent score range, I'm given an option to navigate directly to the Intent Trainer, which takes us back into Cognigy.AI, where I can go through and start assessing my users' input sentences to ensure that I'm getting the best out of my NLU model.
So, by using those quick links to navigate from Insights directly into the platform, it becomes very easy to identify issues and make improvements to the performance of the bot.
Now we move on to the Step Explorer, a great feature that allows us to view the overall flow and direction of the conversations our users are having with our Virtual Agents. You can see here that I have a range of steps, and the steps are executed throughout the conversation depending on which questions are being asked and what the answers to those questions are. Later in this video, we're going to look at what it takes to enable these. Many of them are available by default as you're building your Agent: your Intents and your Question Nodes already have steps built in. But we can also add steps manually to create a custom Flow path, so that every turn and change of direction a conversation takes is captured in this data. This is a nice navigable interface: we can click through each step and expand it to see which steps come afterward. We also have some nice tooltips to give us a little more information about how each step is being reached in conversations. And a slightly hidden feature: if you right-click on one of these steps, you can navigate directly to the Flow where that step exists. Take this email question here: if I open that Flow, I'm taken directly to the conversation, and I can see the Node where that step exists in the Flow. So that, overall, is the Step Explorer, and as you've seen, our filters are also applied on a global level to this particular tool. We also have the ability to change where the overall diagram starts. So, I could view what happens from an order being placed onwards, although that is of course near the end of our conversation. But you can also reverse the direction of the conversation flow, so you can see what happens before an order is placed: we have the confirmed step, the confirmation question, the database search, and so on.
We can navigate all the way back to the start of the conversation. That is the Step Explorer.
We also have our Transcript Explorer, which allows us to view all of the transcripts available throughout our deployment. We can search through transcripts, and we can also filter by source: Virtual Agent messages, user messages, and human Agent messages. So, we have quite an in-depth search option, and when we open a conversation, the transcript pops up and we can scroll through it. We also have the profile details and conversation details associated with that particular conversation. That is the Transcript Explorer. Finally, we have the Message Explorer, which is all about message funnels: understanding which message came first and how that message was responded to, in bulk, by the most frequent occurrences. We have a welcome message here, and as we navigate through, we can see that the following message was always this 'How may I help you today?', along with all the different responses that were given and the number of times each particular response was provided to the bot. We also have an integrated search feature here, so you can search through all of your conversations to see where particular key phrases were mentioned. For example, when a user says 'I need to speak to an Agent', we can see which messages from our bot occurred before the user responded with that message. That allows us to identify where we can potentially improve our bot to ensure, particularly in this scenario, fewer human Agent handovers, so that our bot can handle more of the conversation directly. So that's a quick overview of what we have in Cognigy Insights.
Next, we're going to take a look at the analytics Flow features available to you within Cognigy.AI to get the most out of the analytics and KPI options in Cognigy Insights and in your OData. A quick overview of what we have available: firstly, the analytics steps, a key feature that lets you build steps into your Flows so that they can be visualized in the Step Explorer within Insights. We also have a Node for completing goals, so you can see how many times users reach a certain stage within a conversation. We have our Set Rating Node to handle user feedback. We also have an Overwrite Analytics Node, which can be used to customize what data is written to your analytics records. And finally, we have Blind Mode and other data masking features that prevent certain information from being exposed in our data, for example when collecting sensitive data. So, let's take a look within Cognigy at the features we have available. We'll start with the Step Explorer; this is one of the most important features to build our bots around, to ensure that this visualization is as useful as it possibly can be. The good news about the Step Explorer is that steps are added to our Flows almost by default. In fact, every time you create a Question Node or an Intent, it automatically has an analytics step associated with it. So, let's navigate through to the Flows. I'm going to right-click on one of my steps here, and this will take me straight to my Cognigy.AI Flow. You can see that we've got a range of questions in this particular Flow which collect user information: we have a date of birth, we have something about coverage, and we also have a question about emails. So, for this Node here, I can click on the Node to open the editing menu and navigate down to the settings.
In the settings, we have a label field, which is simply the visual label that you see in the Flow Editor. We have a comment, which appears as a comment icon attached to the front of the Node and can contain any information that you need to save. And we also have the Analytics Step field, which is the name associated with the step when it is passed through to the step visualization tool in Cognigy Insights. As I mentioned, with Question Nodes this field is automatically populated when you create the Node in the first instance. However, this field exists on all of the other Nodes that you can add in the Flow Editor as well. So, for example, let's take this If statement: if I open up the settings, I can see that the Analytics Step field is currently blank on this Node, because it doesn't get populated automatically. But I can add something into this field, and that will be assigned as a step in my Flow. Now, since this feature is enabled by default for Question Nodes and Intents, the best practice is the following: you don't want a step on every single Node in your Flow. You want to be efficient and minimize the number of steps you have, while still capturing as much information as possible about where the conversation can change direction. That means, as a base, you start with your Question Nodes and Intents; but if you have any diversions in the Flow created by logical conditions in your Virtual Agent, it also makes sense to place steps on each of the paths, and you can do that from the Child Nodes attached to the If statement, as you can see I've done here with the confirmed and declined paths.
I've manually added these two options as steps so that we can track which direction this conversation takes. In an overall sense, that is the best practice: minimize the number of steps you have, but ensure you are capturing as much information as possible. Now, to attach steps to Intents, we can do that through the NLU tab, and it works very similarly. In fact, this particular Flow doesn't have any Intents, so I'll just navigate back to our main Flow and across to the NLU tab. We have two Intents here: one for a Handover and one for getting a Quotation. If you navigate down to the advanced menu, at the very bottom we also have the analytics step, and this functions exactly the same as for the Question Node and all the other Nodes in the Flow. When this Intent is triggered, the step is written into the conversation record as a step that was executed. So, it's important to realize that not only do you have steps for Intents and Nodes, but you can also have multiple steps executed at once. This works because every time a Node with an associated step is executed in the Flow, that step is assigned to the data records. Let's just test this out in the Interaction Panel. If I start talking to my Flow, I can see that for this single execution, I have one, two, three, four, five, six Nodes associated with that run of the Flow. That means I could potentially have six steps executed and associated with that particular message input by the user. So, it makes sense to keep this nice and efficient; in this case, I might only have one step associated with my welcome message so that I can track those incoming messages. That is the best practice for using analytics steps within your Flows.
The next feature we're going to look at is the Complete Goal Node. As we've seen in Cognigy Insights (I'll just navigate back to the dashboard), we have in our Engagement tab a list of the top goals found within our Agent. These goals need to be added manually into your Flows using the Complete Goal Node so that we can track how conversations progress. Goals are typically used to monitor a certain KPI throughout the performance of the bot. In this case, we have a goal that's globally assigned to all of our Intents and is recorded every time an Intent is found. We also have a goal here for answering FAQs: we have a separate FAQ Flow, and this goal is marked as achieved on the conversation whenever an FAQ question is answered. That means we can very concisely track how much success we have inside our bot. Navigating back to the Flow itself, I can add a Goal Node, as you can see in my Flow; let's just clear the view so we can see it cleanly. We have two Complete Goal Nodes within our Flow here, and these are the Goal Nodes that create the Intent Match goal. You can see they both have the same name, so we don't necessarily have to have different goals throughout our Flow. Goals can be repeated if it makes sense for our use case and for the KPI we need to extract. Of course, it does make sense at some stage to have dedicated goals for certain places, and you can see here that at the end of the quotation Flow, we have another Complete Goal Node that indicates a quotation has been completed. We can find the Complete Goal Node simply by searching for 'complete goal' in the Node finding menu and then clicking on it to add a fresh Node to the Flow.
The next Nodes we're going to look at are our Rating Nodes; you can see in this Flow that I have a Set Positive Rating and also a Set Negative Rating. The Rating Nodes can also be found in the Node finding menu by searching for 'rating'. We actually have two Nodes available for the ratings feature. The Request Rating Node is only available for a Webchat Endpoint: when you're using the Cognigy Webchat widget, you can prompt the user at a certain point in the conversation to give the conversation a rating. That would typically take place after achieving one of your goals, ensuring that the user's feedback is captured. The Set Rating Node, however, is what actually writes that information to the conversation, and it can be used across any channel simply by passing in the rating score, either directly or, more commonly, from a variable within your Context. The rating score is usually a number between -1 and 1, and a comment can be passed in as well, which is recorded in the OData records. So, by collecting user feedback, you can start to populate your positive-ratings-over-time graph within the Cognigy Insights dashboard. To show you briefly how this looks in the Cognigy Webchat widget: this is the native, by-default feature available for giving user ratings. Users can simply click on the thumbs-up or thumbs-down button in the top right-hand corner, write their feedback in the text box, and send it back; their feedback is then recorded against the conversation within the Cognigy data. So that is the ratings feature.
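To make the rating scale concrete, here is a minimal Python sketch of how rating scores collected via the Set Rating Node might be aggregated downstream, for example to reproduce a positive-ratings share. This is an illustration only; the record shape is a simplified assumption, not the actual OData schema:

```python
def positive_rating_share(ratings):
    """Given rating scores between -1 and 1 (as described above),
    return the fraction that are positive (e.g. thumbs up)."""
    if not ratings:
        return 0.0
    positives = sum(1 for r in ratings if r > 0)
    return positives / len(ratings)

# Hypothetical scores from the Set Rating Node: 1 = thumbs up, -1 = thumbs down
scores = [1, 1, -1, 1]
print(positive_rating_share(scores))  # → 0.75
```

Grouping the same scores by day would give you the data behind a ratings-over-time chart.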
The next feature to look at is the Overwrite Analytics Node, a more custom feature that allows you to do just that: it intercepts the analytics records being written into your conversation data and either adjusts them or adds custom values that you want to retrieve in your OData records. At the top of the Overwrite Analytics Node, you can see some custom fields, which are commonly used for writing any custom information that you want to record and track in your data records. These fields are only available in the OData records, in the Inputs collection, and they can be used to pass in dynamic variables from your Context. They can also be used for data tagging, to associate certain records with certain parts of your conversation Flow. There are three fields here, but of course, every time this Node is executed, you can write different values to those three fields, so the amount of data you can write to your conversation OData records using these fields is really large. It's good to make use of the Overwrite Analytics Node throughout your Flows to get the most out of your OData records. You can see we also have some default fields available, so you can overwrite the Intent name, Intent score, input text, and other information associated with your analytics records from this one Node. And it's important to remember that these features are also available from a Code Node; you can find more information in our documentation about how to overwrite the analytics from within Cognigy Code Nodes.
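As a sketch of how these custom fields could be consumed downstream, the following Python snippet segments Inputs records by a custom tag set via the Overwrite Analytics Node. The property name `customField1` and the record shapes are illustrative assumptions for this example, not the exact OData property names; check the documentation for the real schema:

```python
from collections import Counter

def count_by_tag(input_records, field="customField1"):
    """Count Inputs records per custom tag value.
    The field name here is illustrative, not the exact OData property name."""
    return Counter(r.get(field) for r in input_records if r.get(field))

# Hypothetical Inputs records tagged from the Overwrite Analytics Node
records = [
    {"inputText": "I want a quote", "customField1": "quotation"},
    {"inputText": "Talk to a human", "customField1": "handover"},
    {"inputText": "Can I get a quotation?", "customField1": "quotation"},
]
print(count_by_tag(records))  # → Counter({'quotation': 2, 'handover': 1})
```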
The last feature I'd like to touch on is the set of options available for data masking within your Flows. It's very important for data security that, at certain points within a conversation, we mask what the user has said in response to a particularly sensitive question. Or it may be the case that our entire conversation records need to be masked from people within our organization who are viewing our conversation data or the analytics records associated with conversations. Within the Flow itself, you can find these Nodes under the profile option in the Node menu. We have options for our ratings, which we've already covered. We also have options to deactivate the profile of the user; based on the conversation that takes place, you can add a Deactivate Profile Node to ensure that the user's data is not collected in the future. But the feature we want to focus on is Blind Mode. By adding a Blind Mode Node into your Flow, you're presented with options for masking data within the logging, within the analytics, or for disabling conversation collection completely so that no information is written whatsoever. Commonly, this is used with just the logging and analytics options within the Flow. That means you can turn masking on at one point and then turn it off later in the Flow, so that the data is visible again from that point onwards. So, let's look at how this could be set up. We can place the Blind Mode Node before asking the user a question, in this case for their email address, and turn masking on; the best practice is to update the label to something like 'Blind Mode on'. Later in the Flow, once we're happy to collect information again, we simply turn these options off, and I'll rename that Node to 'Blind Mode off'.
That way, when we look at our logs and our analytics conversations, there will be no information available to view at this point in the conversation. All of our users' responses to the email question will be completely masked; they won't be visible to anyone looking at those conversation records. As for the other masking features: Blind Mode is only available in the Flow itself, but we can also disable conversation collection and mask analytics records at the Endpoint level. If we navigate through the Deploy menu into Endpoints and open one of our Endpoints, we have a menu here for data protection and analytics. This is also where our Chatbase and Dashbot connections are created for an Endpoint within Cognigy. You can see we have the ability to enable and disable Contact Profiles and Collect Analytics, to Mask Sensitive Analytics, which is very similar to Blind Mode but operates on the entire conversation, and to mask the sensitive logging associated with a conversation. If I enable these two masking features, open a Webchat, write a message to the bot, and then go into the Cognigy dashboards and into the logs to view what messages are coming into Cognigy, we will see that this information contains asterisks where the user inputs were. That ensures that the input text and the input data associated with the user's message are completely masked, both within the logs and within the analytics data.
Next, we're going to take a deep dive into Cognigy OData analytics and how we can use it to extract data from Cognigy Agents and build visualizations in our BI tools. You have access to all of the Virtual Agent data: underneath Cognigy Insights, it is the OData analytics feed that powers all of the data shown in the dashboards. So it makes sense to have it available to your BI tooling, so that you can customize your own visualizations. Within OData, we have five collections, and we're going to talk through what's available in these collections and which ones you need to access in order to build the visualizations that you would like to build. First of all, what is OData? The Open Data Protocol is an industry standard that defines best practices for building and consuming RESTful APIs, and it provides a uniform way to access datasets. You can find more information about the OData protocol at OData.org, where you'll find full documentation as well as a full list of the formatting and query parameters that you can use in your OData queries. So how does Cognigy OData work? On the left-hand side, you have your analytics software; this can be Microsoft Excel, Power BI, or any other BI tool that can access an OData Endpoint. In the middle, we have the Cognigy OData Endpoint, which is accessed via a URL. We can pass this URL into our software to give it access to the Cognigy database that contains all of our Virtual Agent data, and the Endpoint exposes this data in a readable format for our analytics software.
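To make the query side concrete, here is a minimal Python sketch of how an OData query URL could be assembled before handing it to a BI tool or an HTTP client. The base URL shown and the `apikey` query parameter are assumptions for illustration (check the Cognigy documentation for the exact URL and authentication that apply to your environment); the `$select` and `$top` options are standard OData system query options:

```python
from urllib.parse import urlencode

def build_odata_url(base_url, collection, api_key, **query_options):
    """Assemble an OData query URL for an analytics collection.

    query_options are standard OData system query options, passed
    without the leading '$' (e.g. select="inputText", top=100).
    """
    params = {f"${key}": value for key, value in query_options.items()}
    params["apikey"] = api_key  # assumption: key is passed as a query parameter
    return f"{base_url}/{collection}?{urlencode(params)}"

# Hypothetical endpoint URL; substitute the one from the Cognigy documentation.
url = build_odata_url(
    "https://odata-trial.cognigy.ai/v2.2",
    "Inputs",
    "YOUR_API_KEY",
    select="inputText,intent,intentScore",
    top=100,
)
print(url)
```

Note that `urlencode` percent-encodes the `$` in the option names (as `%24`), which OData services accept.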
So, what OData collections are available? We have five: Inputs, ChatHistory, Steps, ExecutedSteps, and Conversations. Let's talk about what's available in each of these collections. Firstly, ChatHistory: this collection contains a record for every input or output given by a bot, a user, or a human Agent involved in a conversation. Commonly, it's used to collect and build transcripts and export those from your data records, and it can also be used for testing. Next, we have the Inputs collection, the typical collection that contains all of the extended conversational data associated with a conversation that has taken place. For example, the Intent found for any user input is stored as a data field on each of the records within the Inputs collection. The Inputs collection does not contain output messages from the bot or our human Agents, but it does contain extensive data associated with every user input. The common use cases for the Inputs data are monitoring and improvement, and it will be the key collection you use to build many of your KPI monitoring dashboards within your BI tooling. This is also the collection that contains the custom analytics fields that you can set from your Overwrite Analytics Node within Cognigy.AI. Additionally, we have three collections that help us build a step monitoring visualization within the BI tool. Firstly, we have the Steps collection, which contains a record for every single step that exists within our Agent. Now, the catch with the Steps collection is that a step must have been executed in at least one conversation for it to appear in this collection. That means if you are adding steps to your Flows and you want to see them in your OData, you need to have at least one test conversation to make them appear here.
There will be only one record for each step, so you can be sure that each step has its own unique ID that can be referenced throughout the other records. In the middle, we have the ExecutedSteps collection, which contains a single record for every time a step is executed within a conversation. This means there will obviously be a lot of records contained within it, because many, many steps can be executed throughout a conversation, and for every conversation, this collection contains a record of all of those steps. Finally, we have our Conversations collection, which contains a record for every single conversation that has taken place. The key thing to extract from the Conversations Record is the path that has been taken throughout that conversation. This means that we can build a visualization that shows the order in which all of our conversation steps were executed, and this is the data used to build the Step Explorer within Cognigy Insights. So those are the five collections available to you as an OData user.
Now, how does Cognigy actually write data to these Endpoints? It is important to understand this process because the Inputs Collection behaves a little differently from the other four collections you see on the left-hand side. The ChatHistory, ExecutedSteps, Steps, and Conversations Records all have their data written to them as soon as it is available. That means it's always written as soon as the input arrives or as soon as the Flow Node is executed; it is directly written to that particular collection. The Inputs Record, however, is initiated when the first user input arrives. As the Flow is executed, it can have manual updates made to it, and this is what enables us to overwrite the analytics data from within the Flow, because changes within the Flow affect what is written to that particular collection. Then, finally, when the Flow execution has finished, the final updates are made to that particular record in the collection and a commit is made, so the record is saved. So that's an explanation of how the different collections behave slightly differently, depending on what data you're writing to your OData records.
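The two write patterns can be illustrated with a small sketch. This is not Cognigy's internal code, just a minimal model of the behavior described above: most collections append records immediately, while an Inputs record is opened on the first user input, mutated during Flow execution (which is what the Overwrite Analytics Node relies on), and committed once when the Flow finishes.

```python
# Illustrative sketch (not Cognigy internals) of the Inputs record lifecycle:
# initiated on the first user input, updated during Flow execution,
# committed once when the Flow execution finishes.
class InputsRecord:
    def __init__(self, input_text):
        # initiated when the first user input arrives
        self.data = {"inputText": input_text, "intent": None}
        self.committed = False

    def overwrite(self, field, value):
        # manual updates during Flow execution,
        # e.g. via the Overwrite Analytics Node
        self.data[field] = value

    def commit(self):
        # final updates are saved when the Flow execution finishes
        self.committed = True
        return self.data

record = InputsRecord("I want to cancel my order")
record.overwrite("intent", "CancelOrder")  # hypothetical intent overwrite
saved = record.commit()
print(saved)
```

The field names `inputText` and `intent` are assumptions for illustration; the point is simply that the record stays open and mutable until the final commit, unlike the other four collections.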
Now, what's really important in Cognigy is how we actually access this data. We need to build a query, which is simply a URL, similar to the one you use to access the Cognigy User Interface, that returns a big data payload containing all of our conversational analytics. What you need to build a query is, firstly, your domain URL, and please note this can change depending on the Cognigy environment you are using. In this example, we've shown the OData Endpoint for the free trial environment of Cognigy.AI. Typically, you have "odata", then a dash, and then the standard domain name of your environment. But of course, in an on-prem environment, this can also be a different URL. Please bear in mind, it is different from the normal URL that you use to access the Cognigy.AI User Interface. Secondly, we add a collection to that, and as we've just gone through, we've seen the collections that are available. Also note that there is an OData Endpoint version added before the collection name. At this point in time, we're up to version 2.0, but please have a look at our documentation for the current version. Thirdly, and finally, we have the API Key, which is extracted from within Cognigy.AI. This is required to give your analytics software access to the data that is available in the platform. Then, optionally, you have the ability to add filters to your queries. This means that you can restrict the amount of data that you're collecting, make it more efficient to retrieve, and reduce the load on the Endpoints. You can add filters for a time period or even extract data from a particular project within your Cognigy.AI infrastructure. Of course, if you exclude filters, the request will retrieve all of the data within Cognigy.AI associated with your profile.
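The four parts described above (domain, version, collection, API Key, plus optional filters) are just concatenated into one URL. Here's a minimal sketch of that assembly; the domain, key, and filter expression are placeholder assumptions, and you should check your own environment's OData URL and the docs for the current Endpoint version.

```python
# Minimal sketch of assembling a Cognigy OData query URL from its parts.
# Domain, version, key, and filter below are placeholders -- check your
# environment's actual OData Endpoint and the docs for the current version.
from urllib.parse import urlencode

def build_odata_query(domain, version, collection, api_key, filters=None):
    """Concatenate domain, version, collection, API Key, and optional filters."""
    params = {"apikey": api_key}
    if filters:
        params["$filter"] = filters  # standard OData filter syntax
    return f"https://{domain}/v{version}/{collection}?{urlencode(params)}"

url = build_odata_query(
    "odata-trial.cognigy.ai",  # hypothetical trial-environment domain
    "2.0",                     # Endpoint version at the time of this session
    "Inputs",                  # note the capital letter
    "YOUR-API-KEY",
    filters="year eq 2023",    # hypothetical time-period filter
)
print(url)
```

Excluding the `filters` argument simply omits the `$filter` parameter, which, as noted above, retrieves all data associated with your profile.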
So, let's take a look in Cognigy.AI at how we work with OData. To start with, I'm going to jump into a project; this is the project that we're going to be extracting our data from. I'm also going to pull up a query builder that shows the information, and we're going to build this query to extract the data. So here we have, firstly, our domain. We're using the trial environment, so that is the OData URL. We're going to need the version of the OData Endpoint in here; we're using version 2.0. We're going to select a collection to access, and in your BI tooling you'll most likely have a request going to each of the collections so that you have all of the data available. In this particular request, we're going to use the Inputs Collection. Make sure there's a capital letter at the front of that collection name. We're not going to use any filters in this example. We're also going to need to extract an API Key from the User Interface in Cognigy. So, let's go through how we can get an API Key within the interface. If we navigate to the user menu and into My Profile, we can scroll down and create a new API Key by clicking the plus next to the API Keys option. Let's just call this one OData, and it will be added to the top of the list. You can copy it by clicking on the API Key, and then, navigating back to my builder, I'm just going to paste it into the API Key field. So that's all the information we need. Of course, we need to concatenate this all together, and I'll add in my API Key there as well. And then we have our final query. Now, to test this query and make sure it does in fact work, the easiest way is to open a new browser tab and simply paste the query in. If all of the data is returned and shown, it means that your query is working correctly.
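When you paste the query into a browser tab, what comes back is a JSON payload; in standard OData responses, the records sit inside a `value` array. The sketch below shows how such a payload can be checked programmatically. The sample payload and its field names are made up for illustration, not the exact shape of the Cognigy response.

```python
# Sketch of inspecting an OData response payload, assuming standard OData
# JSON with the records inside a "value" array. The sample payload and
# field names here are illustrative, not the exact Cognigy schema.
import json

sample_response = json.dumps({
    "@odata.context": "https://odata-trial.cognigy.ai/v2.0/$metadata#Inputs",
    "value": [
        {"inputId": "abc-123", "inputText": "Hello bot", "understood": True},
        {"inputId": "def-456", "inputText": "qwerty zzz", "understood": False},
    ],
})

payload = json.loads(sample_response)
records = payload["value"]  # the actual data rows
print(f"Retrieved {len(records)} input records")
for r in records:
    print(r["inputText"], "->", "understood" if r["understood"] else "misunderstood")
```

If the `value` array is populated, the query works; an empty array usually just means no conversations match the query or its filters.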
One thing to consider with your Cognigy.AI OData is that the API Keys you use to access a particular instance of Cognigy are not only allocated to OData; they also unlock the ability to perform all of the actions within Cognigy, such as creating new Agents, deleting Agents, adding Flow Nodes, and updating your conversation data. The features in Cognigy are very extensive, and you need to ensure that your API Keys are nice and secure. So as a best practice, we recommend creating what we call an OData user. You can see here that I have my normal account associated with this Agent, but I also have an API user associated. This user has just basic-level access rights but also has the API Key access role. This means my admin credentials are nice and safe within my standard admin account, but I can expose and use an API Key that is potentially going to be shared with other users in my organization. That API Key can be associated with only a single project, or have restricted access so that it can only access data from certain projects within the Cognigy interface. Here's an example of the access roles required to build up an OData user. Firstly, the base role gives entry-level access to view a project. We also have the API Keys role, which allows that particular account to create an API Key. And finally, we need the OData access role, which allows that API Key to return records from the OData Endpoint. So, we highly recommend using this as a best practice within Cognigy.AI.
Let's now jump into Microsoft Power BI and see how we can access the Cognigy OData stream from within our dashboards. Within Microsoft Power BI, I'm going to select the Get Started option from the top menu and then navigate to OData Feed. A window will appear, and you can simply paste in the OData query that we built earlier in the session and select OK. That will bring up a sample of the data we've retrieved, and after selecting Load, it will retrieve all of the data from the Endpoint. Now, in this query I'm accessing the Inputs records, so we can take a quick look at what data we retrieve from this particular Endpoint. You can see now that the query has completed, and if I open it up in the table view, we can see all of the data associated with that particular collection. At the start of the collection, we have quite a large set of identification IDs. But most importantly, here we have the input text and the input data associated with each message that is sent to the bot. And if we scroll all the way to the right, we also have our custom fields, which we're familiar with from the Overwrite Analytics Node within the Cognigy Flows. So, all of the data in our Inputs Record is available here. If you need more information about what is included in this particular collection, you can head over to our documentation page, which is available at docs.cognigy.com, and search for OData Analytics Endpoint. It'll take you to a page where you'll find a detailed list of all of the collections and all of the values associated with every collection. So, navigating back to Power BI, we've now got a query here. I'm just going to rename it as a best practice, so I can remember that it is the Inputs Collection. And now I'm going to create a dashboard with this data, so I'll navigate back to the visuals tab.
I'm going to start by building a visualization of the number of messages that have come in over time to my bot, so I've added a line chart to my dashboard. I'm going to navigate to the Inputs Collection and bring in the ID as the values, and I'm also going to use the timestamp on the axis. I'm going to take out the year and the quarter, so I have a nice visualization of the months that the bot has been live. Next, I'm going to add a slicer, and I'm going to use the timestamp again in this field so that I have the ability to adjust the time frames that are available in my data. That's going to apply to all the visualizations I add to the dashboard. Next, I'm going to add what's called a donut chart in Power BI. For this chart, I'm going to add the understood value to the legend field and a count of ID to the values. This will give me a donut graph showing the messages that have been understood and the messages that have been misunderstood. Finally, for this dashboard, I'm going to add a table of the actual input messages that have been sent to the bot by our end-users. Here, I'm going to grab the input text and add it to the values field. So that's my basic dashboard. What I've got here is, first, our timestamp, so we can filter based on certain timeframes and narrow down on a particular period. We've also got our donut graph, which we can select to highlight, in our Input Text field, the misunderstood phrases and see how they have trended over time. And of course, I can navigate to the understood phrases as well. So that's a very basic dashboard, but you can see that you can quickly build up a large array of visualizations to suit your needs and make it a custom dashboard for you. Let me show a quick example of a more complete dashboard.
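The donut chart above is just a count of input records grouped by the understood flag. The same aggregation can be sketched in a few lines of Python; the record shape and field names are assumptions based on the Inputs Collection shown in the session.

```python
# Sketch of the donut-chart aggregation: count input records grouped by
# the "understood" flag. Record shape and field names are illustrative
# assumptions based on the Inputs Collection.
from collections import Counter

inputs = [
    {"id": 1, "inputText": "Hi", "understood": True},
    {"id": 2, "inputText": "asdfgh", "understood": False},
    {"id": 3, "inputText": "Cancel my order", "understood": True},
]

counts = Counter(
    "understood" if r["understood"] else "misunderstood" for r in inputs
)
print(counts)
```

This is exactly what Power BI does when you drop the understood value into the legend and a count of ID into the values: one bucket per legend value, one count per bucket.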
This one includes some additional features, like filters to switch between different projects and channels, and contains a summary for each month with the key KPIs. We've even got a world map indicating the high-volume regions. So, Power BI is quite flexible, and of course, this is how you can access all of your data and make it visible to your business users. Now, I've created another sample dashboard here that I'm going to use to build the next version of this dashboard, and in this next dashboard, I'm going to add our step monitoring. I'm going to start with a blank dashboard, but you'll notice that in this version I've included all five of the collections that we talked about earlier in the video: ChatHistory, Conversations, ExecutedSteps, and the Steps Record, in addition to the Inputs that we've just taken a look at. So, navigating across to the data tables: to start within this particular example, I'm going to use the Conversations Record. The Conversations Record, just to recap, contains a record of every single conversation that a user has had with the bot. Most importantly, in this particular collection we have the step path. This is a comma-separated list of the steps in the order they were executed within that conversation. This is where we're going to start to build up the data required to visualize the step Flows within our dashboards. Firstly, I'm going to edit the query that was created for this Conversations Record. What we need to do is split this step path field into multiple fields, one for each step. So, I'm going to select the Step Path column, select Split Column by Delimiter, then select comma, change the quote character to none, and select OK. What this does is create a column for every single step that was executed within that conversation.
Now, you will notice here that the step path fields contain reference IDs and not the names of the steps themselves. The reason for this, if we close and apply this change, is that at any time a user can update the name of a given step from the Cognigy User Interface, which means that after these records are created, the name could later be changed to something else. We need a more flexible way of resolving those names so that we can view them correctly at any point in the future. Therefore, what we do in the Steps Record is store the latest version of the label, which can be referenced by the Entity Reference ID available in the Conversations Record. So, to add this into our Conversations Record, we're going to add columns that reference these step reference IDs and pull in the most recent name of the step from our Flow Editor. To start, I'm going to right-click on the headings and select New Column, which allows me to add a new column to the data table. I've pre-prepared some example queries that I can create for these columns, and these are available on our documentation page, under our Help Center, which is linked in the article for this video, so you can copy them straight out of the article itself. What this basically does is reference our Steps Record, pull in the latest label, count the number of steps, and assign an END name to the step that is the last step in the Flow. So, let's save that now for our first step, and you'll see that all of the conversations that had no steps contained within them have END as the first step, and all of the conversations that had multiple steps executed will have the name of that step filled in. Let's add a few more steps to the data table; I've pre-prepared these in my notepad editor. You can simply copy these out of the Help Center article again and make them available to you very easily.
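Putting the pieces together, the transformation these columns perform can be sketched in Python: split the comma-separated step path from the Conversations Record, look up each reference ID's latest label in the Steps Record, and append END as the final step. The IDs, labels, and the four-column limit below are made up for illustration.

```python
# Sketch of the step-path transformation: split the comma-separated step
# path (Conversations Record), look up each reference ID's latest label
# (Steps Record), and mark the end of the conversation with END.
# IDs, labels, and the column count are illustrative assumptions.
def resolve_step_path(step_path, step_labels, max_steps=4):
    """Return up to max_steps columns of human-readable step names."""
    ids = [s for s in step_path.split(",") if s]  # split column by delimiter
    names = [step_labels.get(i, i) for i in ids]  # latest label from Steps
    names.append("END")                           # last step in the Flow
    names += [""] * (max_steps - len(names))      # pad the remaining columns
    return names[:max_steps]

# hypothetical Steps Record: reference ID -> most recent step name
steps = {"ref-1": "Greeting", "ref-2": "Order Status"}
print(resolve_step_path("ref-1,ref-2", steps))
print(resolve_step_path("", steps))  # a conversation with no steps
```

As described above, a conversation with no steps gets END as its first step, because storing labels only in the Steps Record means a rename in the Flow Editor is picked up the next time the lookup runs.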
You'll notice that every time I add a new column here, I'm just updating the steps count to one less than the step path number and also updating the value of the reference ID. The other thing I should point out is that if you are going to copy these values directly from the Help Center article, you should name the collections the same way that I have, which is the same name that you would use to access them through the query: ChatHistory, Conversations, and so on. So now we've added four steps, pulled in the most up-to-date step names from our Steps Record, and also added an END step for the end of the conversation. Now we're ready to build a visualization with this data, so I can navigate back to my dashboard. For this, I'm going to use the Decomposition Tree that's available by default in Power BI; let me just make that a little bit bigger. What I can do now is bring in the step columns that we've created within this particular data collection. I'm going to put those into the 'Explain by' field; I'm just going to add all of those fields. We have all four steps, and of course, you can add as many steps as you want to view to meet the requirements of your use case. I'm also going to bring the ID of the conversation into the 'Analyze' field. That will populate the graph, and now I just need to go to the plus button and expand the steps inside the visualization: step one, step two, step three, step four. And that completes this particular visualization. So now I have a nice navigable visualization, where I can move through the conversations and see the volumes of conversations that have reached certain points within the Flows. That's all we're going to cover in today's video. I hope you've enjoyed the content, and of course, all the information we've covered today is available in our documentation and also in our Help Center.
And other than that, I do wish you all the best with using Cognigy and of course, getting the most out of Cognigy with your analytics features that are now available to you.
Thank you very much for watching our session on Cognigy Insights. Please don't forget to visit our Cognigy.AI Help Center under support.cognigy.com for additional resources and information and a link to our community where you can ask questions or leave us feedback for this episode. Thank you very much for watching. See you next time!