Our first ChatBot – part II

In the previous article we showed how to create a simple chatbot that serves as a communication channel with the client. We implemented a conversation that let the user purchase products such as home, travel or car insurance in our simplified sales system.

The dialog was based on a button-driven flow, which means that the user could only choose between the options appearing on the screen.

He or she could not use natural language to express what he or she expected. You may say that our solution lacked "a pinch" of artificial intelligence capable of understanding the user.
However, that does not mean it is a bad solution. Discussions are ongoing as to which conversation style is better, and the only right answer is the famous "it depends".

The above-mentioned "pinch" is one of the challenges in the field called Natural Language Processing (NLP), which consists primarily of speech recognition, natural language generation and natural language understanding.

In this article we will show how, in a simple way, to use LUIS (a natural language understanding service built and developed by Microsoft as part of Cognitive Services) within our chatbot.
To make it easier to compare the versions with and without the integration, all changes are made on a separate branch, luis.

luis.ai

Creating intents and entities

The first step we have to take is creating an application in LUIS. It comes down to specifying the name of the application and the language it is supposed to understand.

Developing a new application on luis.ai

We will be communicating with our bot in English, the second most widely spoken language in the world.

Before we move on, it is necessary to understand two basic concepts that frequently appear in the NLU (Natural Language Understanding) area – intent and entity.

An intent is the information about what the user wants to do when writing a sentence (staying within the insurance domain, it may for example be buying a new insurance policy or registering a claim).

An entity works much like a variable in mathematics. Thanks to it we can pass an important piece of information (for example the type of insurance we would like to buy).

In our case we have defined two intents (BuyPolicy and RegisterClaim) and one entity (InsuranceType).

When defining an intent, you have to provide around five examples of how the user might say that he wants to do exactly this thing. What is more, if an entity we are interested in appears in the entered utterance, we should mark it.
You can see the details in the screenshot below.

Sample intent – BuyPolicy
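For illustration, the example utterances for the BuyPolicy intent could look something like this (these are hypothetical examples, not necessarily the exact ones used in our application):

- I want to buy home insurance
- I would like to purchase travel insurance
- can I get car insurance
- I need farm insurance
- I want to buy insurance

In the first four, the word naming the product (home, travel, car, farm) would be marked as the InsuranceType entity; the last one intentionally contains no entity at all.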

It is worth mentioning that LUIS already comes with a few predefined domains, such as intents and entities for calendar, home automation, music or notes management.

More details on how to define and then test our service can be found in the documentation.

You must not forget that how well the service understands human statements depends first of all on the definitions and the subsequent training. During development it is worth following the good practices described here.

Training and publication

Having defined the intents and entities, we must hand our definitions over to the artificial intelligence and let it learn a little. 🙂 This happens after clicking the Train button in the upper right corner.
Once everything has been configured properly, after a while we should receive a message that the training is complete.

At this point the service is ready to be published. While publishing we can choose an environment (Staging/Production). We are not afraid, so we go straight for production. 🙂


After a moment we should be redirected to a screen showing the URL of our service.


By clicking the indicated URL we get an endpoint ready to be tested:

https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/<your_app_id>?subscription-key=<your_subscription_key>&timezoneOffset=-360&q=

Now we can start testing. We would like to check whether LUIS correctly recognizes the sentence I want to buy home insurance. In the returned result, both topScoringIntent and the entity should be identified correctly.

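As a quick check, the endpoint can also be called from Node.js. The snippet below is only a sketch (any HTTP client will do), and the response shown in the comment is illustrative – the exact scores will differ:

const https = require('https');

// paste the endpoint URL from above and append the encoded query text
const endpoint = 'https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/<your_app_id>?subscription-key=<your_subscription_key>&q=';
const url = endpoint + encodeURIComponent('I want to buy home insurance');

https.get(url, (res) => {
    let body = '';
    res.on('data', (chunk) => body += chunk);
    res.on('end', () => {
        // Typical shape of the answer (values are illustrative):
        // {
        //   "query": "I want to buy home insurance",
        //   "topScoringIntent": { "intent": "BuyPolicy", "score": 0.97 },
        //   "entities": [ { "entity": "home", "type": "InsuranceType",
        //                   "startIndex": 14, "endIndex": 17, "score": 0.92 } ]
        // }
        console.log(JSON.parse(body));
    });
});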

Now we can move on to the second part – integrating the created service with our chatbot.

Integrating LUIS with Microsoft Bot Framework

Our goal is to change the first static step, in which the user can only choose between the predefined options presented as buttons.


Instead, we want to give the user the opportunity to express in natural language what he or she would like to do.

To be more specific, we would like the user to be able to write: Hello, I want to buy home insurance. The bot should understand what is going on and redirect the user to the purchase of the particular product (or ask the user to select a product if it was not indicated in the statement). In the example above it would be home insurance.

The first step we must take is to point the bot at the address under which our service has been published and to tell it to use LUIS.

const recognizer = new builder.LuisRecognizer(process.env.LUIS_MODEL_URL);
bot.recognizer(recognizer);

The link to the endpoint is stored in the .env file, just like all the other environment variables used by the application.
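A minimal sketch of what this might look like, assuming the dotenv package is used to load the file (the repository may load the variables differently). The .env entry:

LUIS_MODEL_URL=https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/<your_app_id>?subscription-key=<your_subscription_key>&q=

and, at the top of the application entry point:

require('dotenv').config(); // makes LUIS_MODEL_URL available as process.env.LUIS_MODEL_URL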

Next, we have to modify our bot's first steps. At the moment they are defined in the getStartingSteps() function and prompt the user to pick one of the predefined options:

function getStartingSteps() {
    return [
        function (session) {
            session.send('Hello in ASC LAB Insurance Agent Bot!');
            builder.Prompts.choice(
                session,
                'What do you want?',
                [IntentType.BUY_POLICY, IntentType.REGISTER_CLAIM],
                {
                    maxRetries: 3,
                    retryPrompt: 'Not a valid option'
                });
        },
        function (session, result) {
            [..]
        }
    ];
}

As we would like to replace the predefined options with the natural language recognition service, we simply get rid of them and ask the user what he would like to do:

function getWelcomeSteps() {
    return [
        function (session) {
            session.send('Hello in ASC LAB Insurance Agent Bot!');
            session.send('What do you want?');
        }
    ];
}
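For completeness, these welcome steps have to be attached to the bot as its default dialog. A minimal sketch, assuming the standard ChatConnector setup from the previous part (the environment variable names are an assumption and the actual wiring in the repository may differ):

const builder = require('botbuilder');

const connector = new builder.ChatConnector({
    appId: process.env.MICROSOFT_APP_ID,          // assumed variable names
    appPassword: process.env.MICROSOFT_APP_PASSWORD
});
// getWelcomeSteps() returns the waterfall defined above and becomes the root dialog
const bot = new builder.UniversalBot(connector, getWelcomeSteps());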

Now we should specify which path (dialog) the bot should trigger after detecting a particular intent in the statement. Two intents have been configured in our service – BuyPolicy and RegisterClaim.

bot.dialog('buy-insurance', getBuyInsuranceSteps()).triggerAction({matches: 'BuyPolicy'});
bot.dialog('register-claim', getRegisterClaimSteps()).triggerAction({matches: 'RegisterClaim'});

The lines above mean that if LUIS detects the BuyPolicy intent, the buy-insurance dialog will be triggered (and analogously for RegisterClaim).

In the first lines of the dialog below, there is an attempt to read the InsuranceType entity, which tells us what type of insurance the user is interested in.

If it is not included in the statement (because the user wrote only I want to buy insurance, without indicating the type), we must ask about it directly.

However, if it is specified (I want to buy travel insurance), the user is redirected to the specific sales path.

function getBuyInsuranceSteps() {
    return [
        function (session, result, next) {
            session.send('Great! You want to buy new insurance. We try recognize what insurance do you need...');
            console.log(result);
            let intent = result.intent;
            let insuranceType = builder.EntityRecognizer.findEntity(intent.entities, 'InsuranceType');
            if (!insuranceType) {
                session.send('Unfortunately, you have not written yet what insurance you need.');
                next();
            } else {
                switch (insuranceType.entity) {
                    case 'home':
                        return redirectToProperDialog(session, 'home');
                    case 'travel':
                        return redirectToProperDialog(session, 'travel');
                    case 'farm':
                        return redirectToProperDialog(session, 'farm');
                    case 'car':
                        return redirectToProperDialog(session, 'car');
                    default:
                        session.send('Unfortunately, you have not written yet what insurance you need.');
                        next();
                }
            }

        },
        function (session) {
            builder.Prompts.choice(
                session,
                'What kind of insurance do you need?',
                [InsuranceType.Driver.name, InsuranceType.Home.name, InsuranceType.Farm.name, InsuranceType.Travel.name],
                {
                    maxRetries: 3,
                    retryPrompt: 'Not a valid option'
                });
        }
        [...]
    ];
}
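The redirectToProperDialog helper used above is not shown in the listing; its only job is to start the sales dialog for the recognized product. A minimal sketch (the dialog naming convention is an assumption; the actual implementation on the luis branch may differ):

function redirectToProperDialog(session, insuranceType) {
    // e.g. 'home' -> 'buy-home-insurance'
    session.beginDialog('buy-' + insuranceType + '-insurance');
}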

After this change, typing a request for travel insurance redirects the user to the path in which travel insurance can be bought – in our test the product name was intentionally misspelled, to check how LUIS handles it.


For comparison, when the user does not write what kind of insurance he is interested in, the bot asks for the type directly.


You can see most of the changes described above in this commit. The whole code is available on the luis branch.

Finally, it is worth trying out a complete conversation with the chatbot, conducted from start to finish.


Summary

In a few steps we have managed to integrate our chatbot with LUIS – a service for understanding natural language.

We have reached a point that is satisfactory for us: the whole process of selling insurance offered by the LAB Insurance Sales Portal is available inside the chatbot, and the chatbot is intelligent at least at a minimal level – it can understand a human being's statements.

The use of bots in communication with clients is growing significantly, and the market demands more and more from the architects of such solutions.
Here at ASC we pay attention to such trends. We see the advantages of conversational interfaces and understand the business reasons why companies more and more often offer their products through virtual assistants, as the experiments and examples shared above show.

Robert Witkowski
Lead Software Engineer & Team Leader, ASC LAB