This brief guide introduces the main components and concepts of this API and provides some code samples that can be used as a starting point. The Python-SIC connector requires Python 3 (with tkinter enabled). Make sure Python can compile native extensions (e.g. for Windows see https://www.scivision.dev/python-windows-visual-c-14-required). You can use a Python editor of your choice (Pycharm for example). The Python skeleton project itself can be found in the ‘python’ folder at https://bitbucket.org/socialroboticshub/connectors. It is recommended to simply download this repository or even fork it directly (which you can do using the + button at the top-left of the page). Once you have your download or cloned fork, make sure the required dependencies are installed, e.g. by executing pip install -r requirements.txt in the folder.

Abstract SIC Connector

The first main component is the AbstractSICConnector class. It requires the IP address of the SIC server, with which it sets up the connection to the SIC, and it enables you to send action commands to the robot and to receive data generated by either the robot or the SIC itself. The data consists of events (e.g. ‘LeftBumperPressed’ when a button is pressed, or ‘WakeUpDone’ when an action has finished) or the results of certain actions (e.g. a recognized intent after a speech recognition attempt).

The AbstractSICConnector class is abstract, meaning that it does not itself do anything with the incoming data. To process the incoming data you can implement your own concrete SIC connector class by inheriting from AbstractSICConnector and overriding the empty event handlers. In the MyConnector example below, you see that it uses the AbstractSICConnector class as a parent, inheriting all its methods. Two things have been added. First, the on_robot_event method is overridden to print all events generated by the robot. Second, a run method is added that sends actions to the SIC.

from social_interaction_cloud.abstract_connector import AbstractSICConnector
from time import sleep


class MyConnector(AbstractSICConnector):
    def __init__(self, server_ip):
        super(MyConnector, self).__init__(server_ip)

    def run(self):
        self.start()
        self.set_language('en-US')
        sleep(1)  # wait for the language to change
        self.say('Hello, world!')
        sleep(3)  # wait for the robot to be done speaking (to see the relevant prints)
        self.stop()

    def on_robot_event(self, event):
        print(event)


# Run the application
my_connector = MyConnector(server_ip='127.0.0.1')
my_connector.run()

self.start() activates the connection. Under the hood a thread is started that allows the connector to receive actions and events. self.stop() gracefully closes the connection. self.set_language('en-US') and self.say('Hello, world!') are the two actions sent to the robot to make it say ‘Hello, world!’ in English. These methods, like all actions, are asynchronous: they do not wait for a result before continuing. This also allows, if supported by the connected device(s), executing actions in parallel (e.g. speaking and gesturing simultaneously).

The on_robot_event method will print all incoming events, which are: LanguageChanged, TextStarted, and TextDone. If you, for example, touch the robot's head sensors (while the program is running), the events FrontTactilTouched, MiddleTactilTouched, and/or RearTactilTouched will also be printed.

The sleep statements prevent the program from stopping before all the events have been generated. See what happens when you remove them. Most of the time you do not know how long you have to wait for an action to finish, so sleep statements are not the way to go. Ideally, you want the program to wait until it has received the necessary data and to select its next action based on that data.

The AbstractSICConnector class contains extensive documentation for each available action and on_* event handler.

Basic SIC Connector

The Python API also provides its own concrete implementation of the AbstractSICConnector class, called the BasicSICConnector. It allows you to register callback functions for each action you send. Whenever the action is finished or a result becomes available that callback function is called. For device actions, e.g. wake_up, say or set_eye_color, a callback function is only called once. You can also register callback functions that listen to touch events (e.g. MiddleTactilTouched) or the result of vision operations (e.g. on_face_recognized(identifier)). These callback functions are called each time that event or result becomes available.

import threading
from social_interaction_cloud.basic_connector import BasicSICConnector
from time import sleep


class Example:

    def __init__(self, server_ip):
        self.sic = BasicSICConnector(server_ip)

        self.awake_lock = threading.Event()

    def run(self):
        # active Social Interaction Cloud connection
        self.sic.start()

        # set language to English
        self.sic.set_language('en-US')

        # stand up and wait until this action is done (whenever the callback function self.awake is called)
        self.sic.wake_up(self.awake)
        self.awake_lock.wait()  # see https://docs.python.org/3/library/threading.html#event-objects

        self.sic.say_animated('You can tickle me by touching my head.')
        # Execute that_tickles call each time the middle tactile is touched
        self.sic.subscribe_touch_listener('MiddleTactilTouched', self.that_tickles)

        # You have 10 seconds to tickle the robot
        sleep(10)
        
        # Unsubscribe the listener if you don't need it anymore.
        self.sic.unsubscribe_touch_listener('MiddleTactilTouched')

        # Go to rest mode
        self.sic.rest()

        # close the Social Interaction Cloud connection
        self.sic.stop()

    def awake(self):
        """Callback function for wake_up action. Called only once.
        It lifts the lock, making the program continue from self.awake_lock.wait()"""

        self.awake_lock.set()

    def that_tickles(self):
        """Callback function for touch listener. Everytime the MiddleTactilTouched event is generated, this
         callback function is called, making the robot say 'That tickles!'"""
        
        self.sic.say_animated('That tickles!')


example = Example('127.0.0.1')
example.run()

In the example above, a connected Nao robot will stand up, say “You can tickle me by touching my head”, and for 10 seconds will say “That tickles!” every time you touch the sensor on the middle of its head. After 10 seconds, the Nao will sit down again.

To wait until the Nao has finished standing up, the program is blocked by the self.awake_lock.wait() statement. awake_lock is a threading.Event object that blocks the main thread until it is set by calling self.awake_lock.set(). This is done in the awake callback function, which is passed to the wake_up action. Once the robot has finished standing up, awake is called and the “lock is lifted”, allowing the program to continue.
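The same waiting pattern can be shown in isolation using only the standard library. The sketch below simulates the robot with a worker thread; simulated_wake_up and the other names are illustrative stand-ins, not part of the SIC API:

```python
import threading

awake_lock = threading.Event()


def awake():
    # Callback: called once the (simulated) action has finished; lifts the lock.
    awake_lock.set()


def simulated_wake_up(callback):
    # Stand-in for an asynchronous robot action that reports back via a callback.
    threading.Thread(target=callback).start()


simulated_wake_up(awake)
awake_lock.wait()  # blocks the main thread until awake() has run
print('robot is awake')
```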

A different callback function is that_tickles. It is subscribed to the MiddleTactilTouched event. Whenever the program is running, that_tickles is called each time the middle head sensor is touched.

Action, ActionFactory, and ActionRunner

To help define behaviors, the Python API offers some additional facilities related to actions. An Action allows you to prepare an action and (re)use it when necessary. It requires a reference to a method of BasicSICConnector and the input arguments for that method. Optionally you can give it a callback function and a threading.Event() object as lock. Note that you have to explicitly state callback=… and lock=… to do so.

sic = BasicSICConnector(server_ip)
hello_action_lock = threading.Event()

def hello_action_callback():
    print('Hello Action Done')
    hello_action_lock.set()
    hello_action_lock.clear()  # necessary for reuse

hello_action = Action(sic.say, 'Hello, Action!', callback=hello_action_callback,
                      lock=hello_action_lock)
hello_action.perform().wait()  # perform() returns the lock, so you can immediately call wait() on it.
hello_action.perform().wait()  # you can reuse an action.
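To make the idea of a prepared, reusable action concrete, here is a minimal stand-in sketch of what such a class might look like internally. SimpleAction is purely illustrative and not the actual library code; the real Action dispatches asynchronously via the SIC:

```python
import threading


class SimpleAction:
    """Illustrative stand-in for Action: stores a method and its arguments,
    plus an optional callback and a lock to wait on."""

    def __init__(self, method, *args, callback=None, lock=None):
        self.method = method
        self.args = args
        self.callback = callback
        self.lock = lock if lock is not None else threading.Event()

    def perform(self):
        self.method(*self.args)  # the real API would dispatch this asynchronously
        if self.callback:
            self.callback()
        return self.lock  # returned so you can chain .wait() directly


def say(text):
    print(text)


lock = threading.Event()
action = SimpleAction(say, 'Hello, Action!', callback=lock.set, lock=lock)
action.perform().wait()
action.perform().wait()  # the prepared action can be reused
```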

The ActionFactory helps build actions and can especially save you the trouble of managing all the different locks you might need. The build_action method returns a regular action, but instead of providing a reference to the BasicSICConnector method, you can use its name.

sic = BasicSICConnector(server_ip)
action_factory = ActionFactory(sic)
hello_action_factory = action_factory.build_action('say', 'Hello, Action Factory!')
hello_action_factory.perform()

The build_waiting_action method returns a waiting action. The ActionFactory creates a lock and a callback function that releases the lock for you. Optionally you can also add your own callback function, which will be embedded in the callback function created by the ActionFactory.

wake_up_action = action_factory.build_waiting_action('wake_up', additional_callback=awake)
wake_up_action.perform().wait()

def awake():
  print('I am fully awake now')

The build_touch_listener method lets you build a touch listener that is registered with the BasicSICConnector. You can subscribe to any of the robot’s sensor events (e.g. MiddleTactilTouched), and use the listener to wait until a button is pressed or to do something whenever it is pressed. The listener can be notified only once, or every time the button is pressed (until you unregister the listener).
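Conceptually, such a listener registry can be pictured as a mapping from event names to callbacks. The sketch below is an illustration of the pattern only; in the real connector the notify step is driven by the robot's sensor events:

```python
class ListenerRegistry:
    """Illustrative subscribe/notify registry for sensor events."""

    def __init__(self):
        self.listeners = {}  # event name -> list of callback functions

    def subscribe(self, event, callback):
        self.listeners.setdefault(event, []).append(callback)

    def unsubscribe(self, event):
        self.listeners.pop(event, None)

    def notify(self, event):
        # Called whenever an event is generated; every subscribed callback
        # fires each time, until the event is unsubscribed.
        for callback in self.listeners.get(event, []):
            callback()


registry = ListenerRegistry()
touches = []
registry.subscribe('MiddleTactilTouched', lambda: touches.append('tickle'))
registry.notify('MiddleTactilTouched')
registry.notify('MiddleTactilTouched')
registry.unsubscribe('MiddleTactilTouched')
registry.notify('MiddleTactilTouched')  # ignored: listener was unsubscribed
print(touches)
```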

The build_vision_listener method works similarly to the build_touch_listener method. It lets you build a listener for four specific vision events: on_person_detected, on_face_recognized, on_emotion_detected, and on_corona_check_passed. For these events to be generated you need to turn on the People Detection, Face Recognition, Emotion Detection, and Corona Checker services, respectively. To build a vision listener for one of these events, you use the label ‘people’, ‘face’, ‘emotion’, or ‘corona’, respectively, for the vision_type parameter.

Finally, the last component of the action package is the ActionRunner. It allows you to run regular and waiting actions right away with run_action and run_waiting_action, respectively. It takes the same input as the ActionFactory methods, because those are called under the hood first. The ActionRunner also allows you to preload a set of actions, run them in parallel, and wait for all of the waiting actions (not the regular actions) to finish before continuing. Use load_action and load_waiting_action to preload actions and run_loaded_actions to run them.

sic = BasicSICConnector(server_ip)
action_runner = ActionRunner(sic)
action_runner.run_waiting_action('say', 'Hello, Action Runner!',
                                 additional_callback=hello_action_runner_callback)
# run_waiting_action('say', ...) waits to finish talking before continuing

def hello_action_runner_callback():
  print('Hello Action Runner Done')

# The following actions will run in parallel.
action_runner.load_waiting_action('wake_up')
action_runner.load_waiting_action('set_eye_color', 'green')
action_runner.load_waiting_action('say', 'I am standing up now')
action_runner.run_loaded_actions()  # If you want to keep an action sequence for reuse, add clear=False as input.

action_runner.run_action('say', 'I will start sitting down as I say this. Because I am not a waiting action')
action_runner.run_action('set_eye_color', 'white')
action_runner.run_waiting_action('rest')
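The parallel behaviour of run_loaded_actions can be sketched with plain threads: each loaded action runs on its own thread, and the runner joins them all before continuing. This is a simplified stand-in under stated assumptions, not the library implementation:

```python
import threading


def run_parallel(actions):
    # Start every preloaded action on its own thread, then wait for all of
    # them to finish before returning (the "waiting" part of the runner).
    threads = [threading.Thread(target=action) for action in actions]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()


results = []
run_parallel([
    lambda: results.append('wake_up done'),
    lambda: results.append('eyes green'),
    lambda: results.append('said sentence'),
])
print(sorted(results))  # all three actions completed before this line runs
```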

State Machines with pytransitions

Implementing a social interaction flow goes more efficiently if your code has a similar structure. Most interactions can be structured as a graph: each step in the interaction moves from one state to another, based on the input from an end-user and the goals of the robot. To structure your code around states and state transitions you can use the state machine design pattern. The most important components of a state machine are the state transitions. You define what triggers a transition, for example a button press, and you can define prerequisites for a transition. For example, to get from the sleep state to the awake state, a robot first needs to stand up. For more information about the state machine pattern you can read this Medium article or this paper.

This is where pytransitions comes in. It is “a lightweight, object-oriented finite state machine implementation in Python with many extensions”. Read their guide to learn more about pytransitions. Let’s look at an example of how to use it together with the SIC Python API. It starts with creating a model class for a robot that has states, and linking it to a state machine:

from transitions import Machine

class ExampleRobot(object):
   
   states = ['asleep', 'awake', 'introduced', 'got_acquainted', 'goodbye']

   def __init__(self):
      self.machine = Machine(model=self, states=ExampleRobot.states, initial='asleep')

The second step is to create transitions between the states:

self.machine.add_transition(trigger='start', source='asleep', dest='awake')

If we have an instantiation of the ExampleRobot class, we can now call the start method (trigger) to cause a transition from the initial asleep state (source) to the awake state (destination):

robot = ExampleRobot()
robot.start() # causes state transition from asleep to awake

Before the robot becomes awake it needs to wake up (the SIC method that lets the robot stand up). We can add before='some_method' to our add_transition statement to trigger some_method before the state transition happens:

self.machine.add_transition(trigger='start', source='asleep', dest='awake', before='wake_up')

def wake_up(self):
    self.action_runner.load_waiting_action('set_language', 'en-US')
    self.action_runner.load_waiting_action('wake_up')
    self.action_runner.run_loaded_actions()

Often there are no external triggers to trigger a state transition in the human-robot interaction flow. For example, when the robot is awake and ready it should automatically move to the next state. This can be achieved by adding an after='some_method' statement to the transition. In this case some_method is not a dedicated method, but the trigger of the next transition. Using this approach you can create a whole chain of states, neatly separating each interaction step into different states and methods. It does not have to be a linear sequence: you can create branches and cycles, depending on the intended interaction flow.

self.machine.add_transition(trigger='start', source='asleep', dest='awake',
                            before='wake_up', after='introduce')
self.machine.add_transition(trigger='introduce', source='awake', dest='introduced',
                            before='introduction', after='get_acquainted')
self.machine.add_transition(trigger='get_acquainted', source='introduced', dest='got_acquainted',
                            before='getting_acquainted', after='say_goodbye')
self.machine.add_transition(trigger='say_goodbye', source='got_acquainted', dest='goodbye',
                            before='saying_goodbye', after='rest')
self.machine.add_transition(trigger='rest', source='*', dest='asleep',
                            before='stop')
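The chaining behaviour of the after triggers can be illustrated with a stripped-down machine of our own. TinyMachine is a sketch of the pattern only, not pytransitions itself:

```python
class TinyMachine:
    """Minimal state machine: a transition may name a follow-up trigger,
    so one external trigger can drive a whole chain of states."""

    def __init__(self, initial):
        self.state = initial
        self.transitions = {}  # (trigger, source) -> (dest, follow-up trigger or None)

    def add_transition(self, trigger, source, dest, after=None):
        self.transitions[(trigger, source)] = (dest, after)

    def fire(self, trigger):
        dest, after = self.transitions[(trigger, self.state)]
        self.state = dest
        if after:
            self.fire(after)  # automatically fire the next transition in the chain


machine = TinyMachine('asleep')
machine.add_transition('start', 'asleep', 'awake', after='introduce')
machine.add_transition('introduce', 'awake', 'introduced')
machine.fire('start')  # one call walks the chain: asleep -> awake -> introduced
print(machine.state)
```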

For the complete state machine example see state_machine_example.py included in the API.
