How to cook an assistant
Our cooking assistant should be able to support a user both with selecting a recipe and with executing it. For recipe selection, the assistant should be able to elicit the user's preferences and recommend recipes that meet them. For recipe execution, it should be able to guide the user through the steps of a recipe. Our focus is on an assistant that can interact with a user about the recipe to some extent. We do not aim here for an assistant that keeps track of devices and closely monitors the timing of the cooking process, which would require detailed process information (see Neumann and Wachsmuth, 2021). We want the assistant to interact with the user through speech. Moreover, it is important to also provide visual support, as we cannot expect a user to remember everything the assistant says.
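To make the two tasks concrete, here is a minimal Python sketch of recipe selection and recipe execution over an in-memory recipe list. The recipe fields used here ("tags", "ingredients", "steps") are our own assumptions, not a fixed schema:

```python
def recommend(recipes, wanted_tags, disliked_ingredients):
    """Recipe selection: keep recipes that match the user's preferences."""
    return [
        r for r in recipes
        if wanted_tags <= set(r["tags"])                      # all wanted tags present
        and not disliked_ingredients & set(r["ingredients"])  # no disliked ingredients
    ]

def guide(recipe):
    """Recipe execution: walk the user through the recipe one step at a time."""
    steps = recipe["steps"]
    for i, step in enumerate(steps, start=1):
        input(f"Step {i}/{len(steps)}: {step}  (press Enter for the next step)")
```

In the actual assistant, the preferences would come out of the NLU component rather than being passed in as arguments, and the step-by-step loop would be driven by spoken turns instead of `input()`.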
To get started, we need a few essential datasets to feed the cooking assistant. These are the bare minimum ingredients for creating one (a sketch of how they could be structured follows the list):
- A recipe database.
- A list of ingredients.
- A list of kitchen equipment and utensils.
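As a minimal sketch, the three datasets could be structured along these lines; all field names and example values are illustrative assumptions, not a fixed schema. Note that each record can carry an image path, which ties in with the point about pictures below:

```python
# Recipes reference ingredients and equipment by name.
recipes = [{
    "name": "Tomato soup",
    "ingredients": ["tomato", "onion", "vegetable stock"],
    "equipment": ["pan", "blender"],
    "steps": ["Chop the onion.", "Simmer for 20 minutes.", "Blend until smooth."],
    "image": "images/tomato_soup.jpg",
}]

# Ingredients and equipment carry synonyms, which is useful for the NLU component.
ingredients = [
    {"name": "tomato", "synonyms": ["tomatoes"], "image": "images/tomato.jpg"},
]

equipment = [
    {"name": "blender", "synonyms": ["mixer"], "image": "images/blender.jpg"},
]
```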
We need these data to train the NLU component (we'll use Dialogflow) and to search our database for recipes.
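For example, the ingredient list can be fed into Dialogflow as a custom entity type, so that the NLU component recognizes ingredient mentions in user utterances. A minimal sketch, assuming the Dialogflow ES v2 Python client (`google-cloud-dialogflow`); the project ID and the ingredient values are placeholders:

```python
from google.cloud import dialogflow

def upload_ingredient_entity(project_id, ingredients):
    """Create an 'ingredient' entity type in a Dialogflow agent.

    `ingredients` maps a canonical name to its synonyms,
    e.g. {"tomato": ["tomatoes", "cherry tomato"]}.
    """
    client = dialogflow.EntityTypesClient()
    entity_type = dialogflow.EntityType(
        display_name="ingredient",
        kind=dialogflow.EntityType.Kind.KIND_MAP,
        entities=[
            dialogflow.EntityType.Entity(value=name, synonyms=[name] + synonyms)
            for name, synonyms in ingredients.items()
        ],
    )
    return client.create_entity_type(
        request={"parent": f"projects/{project_id}/agent", "entity_type": entity_type}
    )
```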
Ideally, each of these datasets also includes pictures to illustrate the recipes, ingredients, and equipment.
Recipe database
We can use an existing recipe database that is available for download; see here for a list of datasets of varying sizes. Alternatively, we can scrape a recipe website to create a dataset ourselves, as sketched below. Finally, we can use available (online) APIs from third parties to retrieve information about recipes. See here for some pointers to APIs and scraping software.
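If we scrape, it helps that many recipe websites embed their recipes as schema.org `Recipe` objects in JSON-LD, which saves us from parsing site-specific HTML. A minimal sketch using `requests` and BeautifulSoup; the exact nesting of the JSON-LD varies per site, so a real scraper needs more robustness than this:

```python
import json
import requests
from bs4 import BeautifulSoup

def scrape_recipe(url):
    """Return the first schema.org Recipe object found on the page, or None."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for script in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(script.string or "")
        except json.JSONDecodeError:
            continue
        # A JSON-LD block may hold a single object or a list of objects.
        for obj in data if isinstance(data, list) else [data]:
            type_ = obj.get("@type") if isinstance(obj, dict) else None
            if type_ == "Recipe" or (isinstance(type_, list) and "Recipe" in type_):
                return {
                    "name": obj.get("name"),
                    "ingredients": obj.get("recipeIngredient", []),
                    "instructions": obj.get("recipeInstructions", []),
                }
    return None
```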
Ingredient lists
Various sources for lists of ingredients are available online; see here for some pointers. Again, we can also use the available (online) APIs mentioned in the link above.
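Once we have an ingredient list, one immediate use is mapping a user's (possibly misspelled) ingredient mention onto a known entry. A minimal sketch using only Python's standard library; the ingredient list here is a placeholder:

```python
import difflib

INGREDIENTS = ["tomato", "onion", "garlic", "vegetable stock"]  # placeholder list

def match_ingredient(mention, ingredients=INGREDIENTS, cutoff=0.6):
    """Return the closest known ingredient for a user mention, or None."""
    hits = difflib.get_close_matches(mention.lower(), ingredients, n=1, cutoff=cutoff)
    return hits[0] if hits else None

print(match_ingredient("tomatos"))  # -> "tomato"
```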