...

Info

The information provided here should be sufficient for you to complete the project. For those of you who are interested and want to learn more about conversational patterns and the related coding scheme that we use here (e.g. C4, etc.), see: Moore, R. J., Arar, R. (2019), Conversational UX Design: A Practitioner's Guide to the Natural Conversation Framework. ACM.

Team Contribution

During this week, as a team you will select three recipes from cookingrecipes.pl for the conversational agent, to work out further in pairs for recipe selection, visual support, and recipe instruction. In addition, you will set up the initial version of the cooking assistant by creating a Dialogflow agent (see here).

You should organize your team and decide who will work on the separate subtasks of Recipe Selection, Visual Support, and Recipe Instruction. This week, the Recipe Selection team will focus on the start of the conversation, enabling the assistant to greet the user and to have the user select one of the three recipes, while the Recipe Instruction team will focus on the subsequent instruction of the three recipes. The Visual Support team will incorporate visuals to enrich the conversation. More details about the separate tasks of the three pairs are given below.

Recipe Selection

The tasks for the Recipe Selection team in week 1 concern the steps to allow the user to select a recipe from the knowledge base provided as cookingrecipes.pl. The recipes used for this course are one-pot recipes that can be made in a linear, step-by-step fashion and that you can find on Tasty. Although not yet the focus for this week, you can get some ideas on how to implement the recipe selection task in the Cooking Assistant space.

Opening the conversation by greeting

  • The agent should open the conversation with a greeting and by sharing its name (self-identify). To achieve this, add the corresponding C1 opening patterns in the patterns.pl file and choose a name for your chatbot (insert an agentName/1 fact into the agent's belief base in the init module).

  • Check out the patterns.pl file, as it contains additional instruction details. An opening pattern should be selected when the session history is still empty. Make sure that the agent self-identifies via the corresponding pattern only when an agentName/1 behavioral parameter has been instantiated.

  • You should also introduce a greeting intent in your Dialogflow agent to allow the user to respond to the agent's greeting. Click here for instructions on how to create an intent in your Dialogflow Agent.
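As a rough sketch, the opening setup on the GOAL side could look like the following. Note that the exact pattern encoding, the pattern name c10, and the intent labels here are assumptions for illustration; follow the conventions documented in the comments of patterns.pl.

```prolog
% In the init module: the chatbot's name (agentName/1 as given by the template).
agentName("Chef Bot").

% Hypothetical C1 opening pattern: the agent greets and self-identifies,
% then the user greets back. The list-of-[Actor, Intent] encoding is an
% assumption; adapt it to the real pattern format in patterns.pl.
pattern([c10, [agent, greeting], [agent, selfIdentification], [user, greeting]]).
```

The greeting intent you create in Dialogflow then lets the user's reply be matched to the [user, greeting] step of this pattern.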

...

  • The initial version of the GOAL agent will immediately start instructing a pasta recipe, while it should only start instructing a recipe after a user request. In the coming weeks you will enable the user to choose a recipe by specifying particular recipe features. For now, we will assume that the user knows about the different recipes to pick from and directly mentions the name of the recipe of their choice. You may also, for now, have the agent suggest particular recipes to the user, upon request or as part of the greeting sequence.

  • The first thing to do is to make sure the agent is able to recognize recipe names. To recognize which recipe the user is talking about, create an entity recipe in your Dialogflow agent and add the (shorthand) names of the recipes from cookingrecipes.pl (possibly with different variations of the recipe names). Also, add an intent recipeRequest to your Dialogflow agent and mark these recipe names as entities in its training phrases.

  • Now add a pattern a50recipeName to the patterns.pl file of your GOAL agent. Again, additional instructions are given in the comments of the file. As a final step, add the pattern to the agent's agenda. Make sure the agent's memory is initially empty, and check if the pattern is performed by testing your agent.
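For orientation, the GOAL side of recipe selection might be sketched as below. The pattern encoding, the intent label ackRecipe, and the agenda representation are assumptions; the comments in patterns.pl describe the real conventions to follow.

```prolog
% Hypothetical a50recipeName pattern: the user names a recipe
% (recipeRequest intent from Dialogflow) and the agent acknowledges
% it before instruction starts.
pattern([a50recipeName, [user, recipeRequest], [agent, ackRecipe]]).

% Hypothetical agenda: perform recipe selection before recipe instruction.
agenda([a50recipeName, a30recipeStep]).
```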

...

Enable Instructions of Recipes

The initial version of the cooking assistant spits out all the recipe steps at once. This should be changed so that the user can indicate that they have completed a step, which at a later point should also allow the user to ask questions about a step. To make this happen, we will make a small change to the given a30recipeStep pattern by adding a conversational step (user intent) to it. For this:

...

Add an intent called recipeContinuer to your Dialogflow agent that is able to recognize phrases such as "move on to the next step", "continue", simply "next", or other similar phrases that the user might use to indicate a step has been completed.

...

As can be seen in the a30recipeStep pattern in patterns.pl, the agent is triggered by the user intent recipeContinuer. You need to implement such an intent in your Dialogflow agent to make this work. See this page for more information.

Enable End of Recipe

The initial version of the cooking assistant does not indicate that the final step of a recipe has been reached. It does, however, already have a finalStep/3 predicate that is used as a marker for the final step. Modify the text generated for the finalStep intent in the text.pl file, using the built-in string_concat/3 predicate, to indicate to the user that this is the last step of the recipe.
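string_concat/3 is a standard SWI-Prolog builtin that binds its third argument to the concatenation of the first two. A minimal sketch of how the final-step text could be extended is shown below; the text/2 rule shape and the currentStepText/1 helper are assumptions for illustration, so adapt them to the actual rules in text.pl.

```prolog
% Hypothetical rule: prefix the step's text with a last-step announcement.
% string_concat(+A, +B, -C) binds C to the concatenation of A and B.
text(finalStep, Text) :-
    currentStepText(StepText),   % assumed helper providing the step's text
    string_concat("This is the last step of the recipe: ", StepText, Text).
```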

...

The agent should close the conversation with a farewell or a last topic check. To this end, add the corresponding C4 patterns in the patterns.pl file. A closing pattern should be selected when all patterns in the agenda have been performed. Make sure that the farewell pattern is only performed when the lastTopicCheck/0 behavioral parameter is not set.
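A sketch of the two closing variants follows. The pattern names c40 and c43, the guard style, and the intent labels are assumptions; patterns.pl documents the real conventions.

```prolog
% Hypothetical C4 closing patterns.
% Farewell only: selected when lastTopicCheck/0 is not set.
pattern([c40, [agent, farewell], [user, farewell]]) :-
    not(lastTopicCheck).

% Closing with a last topic check first: selected when lastTopicCheck/0 is set.
pattern([c43, [agent, lastTopicCheck], [user, confirmation],
              [agent, farewell], [user, farewell]]) :-
    lastTopicCheck.
```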

Visual Support

The assignment related to visual support is about how visuals can provide support for choosing a recipe or improve the understanding of a recipe instruction.

Visuals to display chosen recipe

You should design the visuals with respect to the recipe name and the instructions. This should enrich the information communicated to the user through text and images displayed on the screen.

...

A first implementation is to display the name of, and an image related to, each of the three chosen recipes, using HTML.

...

Visuals to display buttons during recipe instruction

Example Conversation

The first version of your chatbot should be able to conduct a conversation with the user similar to the example shown in the figure below.

...

Testing Phase 1

Your Cooking Assistant should be able to perform the following:

...

Greeting the user

...

Terminating the conversation

...

In the initial GOAL code, the pasta recipe is displayed as an image in the initialization of the agent (see dialog_init.mod2g), whenever the agent says something (see dialog_generation.mod2g), and whenever the agent is waiting for the user to say something (see dialog_update.mod2g). This is obviously not a very useful way to do this. Make the PastaAglioPage rule in html.pl more generic by displaying the recipe of choice:

  • Find images for the recipes that you have chosen to work out further.

  • Define img cards as was done for PastaAglio.

  • Define a rule by which the page is rendered based on the currently chosen recipe.

  • For recipes for which you have not added an image, you may display the name of the recipe.

  • Make sure that the statements in dialog_generation.mod2g, dialog_init.mod2g and dialog_update.mod2g are updated as well, so that they will not always render the PastaAglio image.
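The steps above could be sketched as follows. The predicate names img/2, selectedRecipe/1, and recipePage/1, and the image paths, are all assumptions for illustration; mirror how the PastaAglioPage rule and its img card are actually defined in html.pl.

```prolog
% Hypothetical image cards, analogous to the given PastaAglio card.
img(pastaAglio,  "<img src=\"images/pasta_aglio.jpg\"/>").
img(tomatoSoup,  "<img src=\"images/tomato_soup.jpg\"/>").

% Hypothetical generic rule: render the card of the currently chosen
% recipe; fall back to showing just the recipe's name when no image
% card has been defined for it.
recipePage(Html) :-
    selectedRecipe(Recipe),
    ( img(Recipe, Html) -> true ; atom_string(Recipe, Html) ).
```

With a rule like this, the three .mod2g files can all call the same generic rendering rule instead of hard-coding the PastaAglio image.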

Week 1 Deliverable

For your Week-1 submission, you need to submit a weekly progress report in .doc format (See Guidelines).

...