...

Requirements

  1. The Python-SIC connector requires Python 3 (with tkinter enabled).

    1. Make sure Python can compile native extensions

...

    1. (e.g. for Windows see https://www.scivision.dev/python-windows-visual-c-14-required).

    2. You can use a Python editor of your choice (PyCharm, for example).

...

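A quick way to verify that your Python installation includes tkinter is to import it (a minimal check; if the import fails, install a Python build with tkinter support):

Code Block
languagepy
# Minimal check: this import fails if your Python build does not include tkinter.
import tkinter
print(tkinter.TkVersion)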

...


Abstract SIC Connector

The first main component is the AbstractSICConnector class. It requires the IP address of the SIC and the type of robot you have connected (‘nao’ or ‘pepper’). It sets up the connection to the SIC and enables you to send action commands to the robot and receive data generated by either the robot or the SIC itself. The data consists of events (e.g. ‘LeftBumperPressed’ when a button is pressed, or ‘WakeUpDone’ when an action is finished) and the results of certain actions (e.g. a recognized intent after a speech recognition attempt).

...

Installation

Code Block
languagenone
pip install social_interaction_cloud

Code Block
languagepy
from social_interaction_cloud.abstract_connector import AbstractSICConnector
from time import sleep


class MyConnector(AbstractSICConnector):
    def __init__(self, server_ip, robot):
        super(MyConnector, self).__init__(server_ip, robot)

    def run(self):
        self.start()
        self.set_language('en-US')
        sleep(1)  # wait for the language to change
        self.say('Hello, world!')
        sleep(3)  # wait for the robot to be done speaking (to see the relevant prints)
        self.stop()

    def on_robot_event(self, event):
        print(event)


# Run the application
my_connector = MyConnector(server_ip='127.0.0.1', robot='nao')  # or robot='pepper'
my_connector.run()

self.start() activates the connection. Under the hood a thread is started that allows the connector to receive actions and events. self.stop() gracefully closes the connection. self.set_language('en-US') and self.say('Hello, world!') are the two actions sent to the robot to make it say ‘Hello, world!’ in English. These methods, like all actions, are asynchronous: they do not wait for a result before continuing. This also makes it possible, if supported by the robot, to execute actions in parallel (e.g. simultaneously speaking and gesturing).

The on_robot_event() method will print all incoming events, which are: LanguageChanged, TextStarted, and TextDone. If you, for example, touch the robot's head sensors (while the program is running), the events FrontTactilTouched, MiddleTactilTouched, and/or RearTactilTouched will also be printed.

The sleep statements prevent the program from stopping before all the events are generated. See what happens when you remove them. Most of the time you do not know how long you have to wait for an action to finish, so sleep statements are not the way to go. Ideally, you want the program to wait until it has received the necessary data and select the robot's next action based on the available data.
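
The BasicSICConnector described in the next section handles this waiting for you. Purely as an illustration of the idea, a minimal sketch using only the abstract connector's methods and events shown above might look like this (the WaitingConnector class name is made up for this example):

Code Block
languagepy
import threading
from social_interaction_cloud.abstract_connector import AbstractSICConnector


class WaitingConnector(AbstractSICConnector):
    """Sketch: wait for events instead of sleeping for a fixed time."""

    def __init__(self, server_ip, robot):
        super(WaitingConnector, self).__init__(server_ip, robot)
        self.language_changed = threading.Event()
        self.text_done = threading.Event()

    def run(self):
        self.start()
        self.set_language('en-US')
        self.language_changed.wait()  # replaces sleep(1)
        self.say('Hello, world!')
        self.text_done.wait()  # replaces sleep(3)
        self.stop()

    def on_robot_event(self, event):
        print(event)
        if event == 'LanguageChanged':
            self.language_changed.set()
        elif event == 'TextDone':
            self.text_done.set()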

Basic SIC Connector

The Python API also provides its own concrete implementation of the AbstractSICConnector class, called the BasicSICConnector. It allows you to register callback functions for each action you send. Whenever the action is finished or a result becomes available, that callback function is called. For robot actions, e.g. wake_up(), say() or set_eye_color(), a callback function is called only once. You can also register callback functions that listen to touch events (e.g. MiddleTactilTouched) or the results of vision operations (e.g. on_face_recognized(identifier)). These callback functions are called each time that event or result becomes available.

Code Block
languagepy
import threading
from social_interaction_cloud.basic_connector import BasicSICConnector
from time import sleep


class Example:

    def __init__(self, server_ip, robot):
        self.sic = BasicSICConnector(server_ip, robot)

        self.awake_lock = threading.Event()

    def run(self):
        # active Social Interaction Cloud connection
        self.sic.start()

        # set language to English
        self.sic.set_language('en-US')

        # stand up and wait until this action is done (whenever the callback function self.awake is called)
        self.sic.wake_up(self.awake)
        self.awake_lock.wait()  # see https://docs.python.org/3/library/threading.html#event-objects

        self.sic.say_animated('You can tickle me by touching my head.')
        # Execute that_tickles call each time the middle tactile is touched
        self.sic.subscribe_touch_listener('MiddleTactilTouched', self.that_tickles)

        # You have 10 seconds to tickle the robot
        sleep(10)
        
        # Unsubscribe the listener if you don't need it anymore.
        self.sic.unsubscribe_touch_listener('MiddleTactilTouched')

        # Go to rest mode
        self.sic.rest()

        # close the Social Interaction Cloud connection
        self.sic.stop()

    def awake(self):
        """Callback function for wake_up action. Called only once.
        It lifts the lock, making the program continue from self.awake_lock.wait()"""

        self.awake_lock.set()

    def that_tickles(self):
        """Callback function for touch listener. Everytime the MiddleTactilTouched event is generated, this
         callback function is called, making the robot say 'That tickles!'"""
        
        self.sic.say_animated('That tickles!')


example = Example('127.0.0.1', 'nao')  # or 'pepper'
example.run()

In the example above the Nao robot will stand up, say “You can tickle me by touching my head”, and for 10 seconds will say “That tickles!” every time you touch the sensor on the middle of its head. After 10 seconds, the robot will sit down again.

To wait until the robot is finished standing up, the program is blocked by the self.awake_lock.wait() statement. awake_lock is a threading.Event object that blocks the main thread until it is set by calling self.awake_lock.set(). This is done in the awake() callback function, which is passed to the wake_up() action. Once the robot is finished standing up, awake() is called and the “lock is lifted”, allowing the program to continue.
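
If the threading.Event mechanics are new to you, here is a standalone sketch (plain Python, no SIC involved) showing the same wait/set pattern in isolation:

Code Block
languagepy
import threading
import time

done = threading.Event()

def background_task():
    time.sleep(1)  # simulate an action that takes a while to finish
    done.set()     # "lift the lock", just like awake() does above

threading.Thread(target=background_task).start()
done.wait()  # blocks here until background_task calls done.set()
print('continuing after the event was set')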

A different callback function is that_tickles(). It is subscribed to the MiddleTactilTouched event. While the program is running, that_tickles() is called each time the middle head sensor is touched.

Action, ActionFactory, and ActionRunner

To help define robot behaviors, the Python API offers the action package. An Action allows you to prepare an action and (re)use it when necessary. It requires a reference to a method of BasicSICConnector and the input arguments for that method. Optionally you can give it a callback function and a threading.Event object as lock. Note that you have to explicitly state callback=… and lock=… to do so.

Code Block
languagepy
sic = BasicSICConnector(server_ip)
hello_action_lock = threading.Event()

def hello_action_callback():
    print('Hello Action Done')
    hello_action_lock.set()
    hello_action_lock.clear()  # necessary for reuse

hello_action = Action(sic.say, 'Hello, Action!', callback=hello_action_callback,
                      lock=hello_action_lock)
hello_action.perform().wait()  # perform() returns the lock, so you can immediately call wait() on it
hello_action.perform().wait()  # you can reuse an action

The ActionFactory helps build actions and can especially save you the trouble of managing all the different locks you might need. The build_action() method returns a regular action, but instead of providing a reference to the BasicSICConnector method, you can use its name.

Code Block
languagepy
sic = BasicSICConnector(server_ip)
action_factory = ActionFactory(sic)
hello_action_factory = action_factory.build_action('say', 'Hello, Action Factory!')
hello_action_factory.perform()

The build_waiting_action() method returns a waiting action. The ActionFactory creates a lock with a callback function that releases the lock for you. Optionally you can also add your own callback function, which will be embedded in the callback function created by the ActionFactory.

Code Block
languagepy
def awake():
    print('I am fully awake now')

wake_up_action = action_factory.build_waiting_action('wake_up', additional_callback=awake)
wake_up_action.perform().wait()

The build_touch_listener() method lets you build a touch listener that is registered with the BasicSICConnector. You can register for all of the robot's sensor events (e.g. MiddleTactilTouched). You can use it to wait until a button is pressed or to do something when a button is pressed. The listener can be notified only once or every time the button is pressed (until you unregister the listener).

The build_vision_listener() method works similarly to the build_touch_listener() method. It lets you build a listener for three specific vision events: on_person_detected, on_face_recognized, and on_emotion_detected. For these events to be generated you need to turn on the People Detection, Face Recognition, and Emotion Detection services (via the SIC web portal) respectively. To build a vision listener for these events, use the label ‘people’, ‘face’, or ‘emotion’ respectively for the vision_type parameter.
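
As a combined illustration of both listener builders, a sketch could look like the code below. Note that the exact argument names, argument order, and callback signatures of build_touch_listener() and build_vision_listener() are assumptions here; check the Python Connector API documentation listed under Resources for the actual signatures.

Code Block
languagepy
# Sketch only: the argument order and callback signatures below are assumptions.
def that_tickles():
    print('Middle head sensor touched')

def on_face(identifier):
    print('Recognized face:', identifier)

# Wait until the middle head sensor is pressed (touch listener built via the factory).
touch_listener = action_factory.build_touch_listener('MiddleTactilTouched', that_tickles)
touch_listener.perform().wait()

# Requires the Face Recognition service to be running (via the SIC web portal).
face_listener = action_factory.build_vision_listener('face', on_face)
face_listener.perform().wait()  # wait until a face is recognized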

Finally, the last component of the action package is the ActionRunner. It allows you to run regular and waiting actions right away with run_action() and run_waiting_action() respectively. It takes the same input as the ActionFactory methods, because those are called under the hood first. ActionRunner also allows you to preload a set of actions, run them in parallel, and wait for all of the waiting actions (not regular actions) to finish before continuing. Use load_waiting_action() to preload actions and run_loaded_actions() to run them.

Code Block
languagepy
sic = BasicSICConnector(server_ip)
action_runner = ActionRunner(sic)

def hello_action_runner_callback():
    print('Hello Action Runner Done')

# run_waiting_action('say', ...) waits for the robot to finish talking before continuing
action_runner.run_waiting_action('say', 'Hello, Action Runner!',
                                 additional_callback=hello_action_runner_callback)

# The following actions will run in parallel.
action_runner.load_waiting_action('wake_up')
action_runner.load_waiting_action('set_eye_color', 'green')
action_runner.load_waiting_action('say', 'I am standing up now')
action_runner.run_loaded_actions()  # If you want to keep an action sequence for reuse, add clear=False as input.

action_runner.run_action('say', 'I will start sitting down as I say this. Because I am not a waiting action')
action_runner.run_action('set_eye_color', 'white')
action_runner.run_waiting_action('rest')

State Machines with pytransitions

Implementing a human-robot interaction flow goes more smoothly if your code has a similar structure. Most interactions can be structured as a graph. Each step in the interaction goes from one state to another, based on the input from the user and the goals of the robot. To structure your code using states and state transitions you can use the state machine design pattern. The most important components of a state machine are the state transitions. You define what triggers a transition, for example a button press, and you can define prerequisites for a state transition. For example, to get from the asleep to the awake state, the robot first needs to stand up. For more information about the state machine pattern you can read this Medium article or this paper.

This is where pytransitions comes in. It is “a lightweight, object-oriented finite state machine implementation in Python with many extensions”. Read their guide to learn more about pytransitions. Let’s look at an example of how to use it together with the SIC Python API. It starts with creating a model class for the robot that has states and linking it to a state machine.

Code Block
languagepy
from transitions import Machine

class ExampleRobot(object):
   
   states = ['asleep', 'awake', 'introduced', 'got_acquainted', 'goodbye']

   def __init__(self):
      self.machine = Machine(model=self, states=ExampleRobot.states, initial='asleep')

The second step is to create transitions between the states.

Code Block
languagepy
self.machine.add_transition(trigger='start', source='asleep', dest='awake')

If we have an instantiation of the ExampleRobot class, we can now call the start() method (trigger) to cause a transition from the initial asleep state (source) to the awake state (destination).

Code Block
languagepy
robot = ExampleRobot()
robot.start()  # causes state transition from asleep to awake

Before the robot becomes awake it needs to wake up (the SIC method that makes the robot stand up). We can add before='some_method' to our add_transition() statement to trigger some_method() before the state transition happens.

Code Block
languagepy
self.machine.add_transition(trigger='start', source='asleep', dest='awake', before='wake_up')

def wake_up(self):
    self.action_runner.load_waiting_action('set_language', 'en-US')
    self.action_runner.load_waiting_action('wake_up')
    self.action_runner.run_loaded_actions()

Often there is no external trigger for a state transition in the human-robot interaction flow. For example, when the robot is awake and ready, it should automatically move to the next state. This can be achieved by adding an after='some_method' argument to the transition. In this case some_method() is not a dedicated method, but the trigger of the next transition. Using this approach you can create a whole chain of states, neatly separating each interaction step into different states and methods. It does not have to be a linear sequence: you can create branches and cycles, depending on the intended interaction flow.

Code Block
languagepy
self.machine.add_transition(trigger='start', source='asleep', dest='awake',
                            before='wake_up', after='introduce')
self.machine.add_transition(trigger='introduce', source='awake', dest='introduced',
                            before='introduction', after='get_acquainted')
self.machine.add_transition(trigger='get_acquainted', source='introduced', dest='got_acquainted',
                            before='getting_acquainted', after='say_goodbye')
self.machine.add_transition(trigger='say_goodbye', source='got_acquainted', dest='goodbye',
                            before='saying_goodbye', after='rest')
self.machine.add_transition(trigger='rest', source='*', dest='asleep',
                            before='stop')

...

Resources

API documentation: Python Connector API

Documented examples: https://bitbucket.org/socialroboticshub/examples/src/main/

Background information: Python Connector (Restructured)