If you want to design an IVR or a chatbot that works on voice recognition, you can use the speech processing widget to achieve this. Before that, you need to perform a basic setup in Dialogflow according to your use case. The setup requires the following:

→ Let's say you need to achieve the following use case:

There is an agent bot ‘Eva’ which, when called, gives you 3 options and asks you to choose one:

Option 1 - To get Payment details

Option 2 - To get Account details

Option 3 - To get Loan details

If the user selects option 1, they will get the payment details; if they say option 2, they will get the account details; and if they select option 3, they will get the loan details. To do so, you can configure Dialogflow as follows:

Step 1: Open Dialogflow

Step 2: Go to the Entities page

Entity: Each intent parameter has a type, called the entity type, which dictates exactly how data from an end-user expression is extracted.

Dialogflow provides predefined system entities that can match many common types of data. There are system entities for matching dates, times, colors, email addresses, and so on. You can also create your own custom entities for matching custom data. For example, you could define a vegetable entity that can match the types of vegetables available for purchase with a grocery store agent. 

 

Step 3: Create a new entity

In order to create a new entity, you need to be aware of the following terms:

Entity type: Defines the type of information you want to extract from user input. For example, OPTIONS could be the name of an entity type in our case as we want the user to select an ‘option’. Click on ‘Create Entity’ to create an entity type. 

Entity entry: For each entity type, there are many entity entries. Each entity entry provides a set of words or phrases that are considered equivalent.

If ‘options’ is an entity type, you could define entity entries such as: Option one, Option two, Option three.

Entity reference value and synonyms: Some entity entries have multiple words or phrases that are considered equivalent, like ‘option one’ and ‘first option’. For these entity entries, you provide one reference value and one or more synonyms.

Example : 

Reference value - Option 1

Synonyms - Option one, Option 1, First option, 1 option, etc.

 

 

Step 4: After adding the details, SAVE the entity
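If you prefer to script this setup instead of clicking through the console, the same entity type can also be created through the official google-cloud-dialogflow Python client. The sketch below is only illustrative and is not part of the widget configuration: the project ID is a placeholder, credentials are assumed to be set via GOOGLE_APPLICATION_CREDENTIALS, and the OPTIONS entries simply mirror the example above.

```python
# Minimal sketch: create the OPTIONS entity type with reference values
# and synonyms, using the google-cloud-dialogflow client
# (pip install google-cloud-dialogflow).
# "your-project-id" is a placeholder for your Dialogflow project ID.
from google.cloud import dialogflow


def create_options_entity_type(project_id: str):
    client = dialogflow.EntityTypesClient()
    parent = dialogflow.AgentsClient.agent_path(project_id)

    entity_type = dialogflow.EntityType(
        display_name="OPTIONS",
        kind=dialogflow.EntityType.Kind.KIND_MAP,  # reference value + synonyms
        entities=[
            dialogflow.EntityType.Entity(
                value="Option 1",
                synonyms=["Option one", "Option 1", "First option", "1 option"],
            ),
            dialogflow.EntityType.Entity(
                value="Option 2",
                synonyms=["Option two", "Option 2", "Second option"],
            ),
            dialogflow.EntityType.Entity(
                value="Option 3",
                synonyms=["Option three", "Option 3", "Third option"],
            ),
        ],
    )

    response = client.create_entity_type(
        request={"parent": parent, "entity_type": entity_type}
    )
    print("Entity type created:", response.name)


create_options_entity_type("your-project-id")
```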

If you want to use these entities in your flow, you can directly fetch the entity of a particular intent (in the speech processing widget) and store it in a variable for further use in the flow, as shown below.

 

Step 5: Now go to the Intents page

An intent categorizes an end-user's intention for one conversation turn. For each agent, you define many intents, where your combined intents can handle a complete conversation. When an end-user writes or says something, referred to as an end-user expression, Dialogflow matches the end-user expression to the best intent in your agent. Matching an intent is also known as intent classification.


Step 6: Create a new intent

Step 7: Add training phrases to it

Training phrases: These are example phrases for what end-users might say. When an end-user expression resembles one of these phrases, Dialogflow matches the intent. You don't have to define every possible example, because Dialogflow's built-in machine learning expands on your list with other, similar phrases.

For example, if we want the user to select ‘option1’, we can create an intent named ‘option1’ and add some training phrases that the end user might say, like: I choose option one, I want first option, I prefer option 1, etc.

 

Step 8: Now add the responses for the intent


Responses: Responses can be text or speech returned to the end-user. These may provide the end-user with answers, ask the end-user for more information, or terminate the conversation.

For example, suppose the end user selects option 1 and we need to provide the payment status; then we add a response as shown below.

So whenever the user gives input matching the training phrases of this intent, one of the responses will be returned to the user.


Step 9: Now save the intent
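As with entities, the intent can also be created programmatically instead of through the console. The sketch below uses the google-cloud-dialogflow Python client and is only an illustration: the project ID is a placeholder, and the response text ("Here are your payment details.") stands in for whatever payment response you actually configure in your agent.

```python
# Minimal sketch: create the 'option1' intent with training phrases and
# a text response, using the google-cloud-dialogflow client.
# The project ID, phrases and response text are placeholders.
from google.cloud import dialogflow


def create_option1_intent(project_id: str):
    intents_client = dialogflow.IntentsClient()
    parent = dialogflow.AgentsClient.agent_path(project_id)

    # Training phrases the end user might say for this intent.
    phrases = ["I choose option one", "I want first option", "I prefer option 1"]
    training_phrases = [
        dialogflow.Intent.TrainingPhrase(
            parts=[dialogflow.Intent.TrainingPhrase.Part(text=p)]
        )
        for p in phrases
    ]

    # Text response returned to the user when this intent is matched.
    text = dialogflow.Intent.Message.Text(text=["Here are your payment details."])
    message = dialogflow.Intent.Message(text=text)

    intent = dialogflow.Intent(
        display_name="option1",
        training_phrases=training_phrases,
        messages=[message],
    )

    response = intents_client.create_intent(
        request={"parent": parent, "intent": intent}
    )
    print("Intent created:", response.name)


create_option1_intent("your-project-id")
```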

Step 10: To use these responses in your flow, you can configure the flow as follows:

1. First, configure the NLP Engine.

2. Create any inbound or outbound flow as needed. For the above example, the flow can look like this:

3. Enable ‘Fetch intent’ and then ‘Fetch response’ in the speech processing widget (the sketch after this list shows the equivalent direct Dialogflow API call).

4. Use this response variable further in any widget, such as the play widget, to give the user a voice response.
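The ‘Fetch intent’ and ‘Fetch response’ options make the speech processing widget call Dialogflow for you, so no code is needed inside the flow. If you want to test the agent outside the flow, the equivalent direct call in the Dialogflow API is detect_intent. The sketch below is a minimal example with a placeholder project ID and a random session ID; it shows where the matched intent, the extracted entity value, and the response text appear.

```python
# Minimal sketch: send a user utterance to the 'Eva' agent and read back
# the matched intent, the extracted entity parameters and the response text.
# "your-project-id" is a placeholder; a random session ID is generated per call.
import uuid

from google.cloud import dialogflow


def detect_intent_text(project_id: str, text: str, language_code: str = "en"):
    session_client = dialogflow.SessionsClient()
    session = session_client.session_path(project_id, uuid.uuid4().hex)

    text_input = dialogflow.TextInput(text=text, language_code=language_code)
    query_input = dialogflow.QueryInput(text=text_input)

    response = session_client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    result = response.query_result

    print("Matched intent:", result.intent.display_name)  # e.g. option1
    print("Entity parameters:", result.parameters)        # e.g. the OPTIONS value
    print("Response text:", result.fulfillment_text)      # what 'Fetch response' stores
    return result.fulfillment_text


detect_intent_text("your-project-id", "I want first option")
```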

 
