...

Percept

Description

on_audio_language(LanguageKey)

A new audio language has been requested, possibly by something external like the browser (see setLanguage and renderPage).

Example: on_audio_language('en-US').

on_audio_recording(Filename)

A new recorded audio file is available; the filename is always hh-mm-ss.wav and stored in the same folder as the currently running MAS2G (see enableRecording and startListening). Can be fed into one of the PlayAudio actions.

Example: on_audio_recording('12-00-00.wav').
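A finished recording can be replayed directly. The rule below is a minimal sketch of a GOAL event-module rule doing so; it assumes an environment action named playAudio (this page only mentions "one of the PlayAudio actions"), so verify the exact action name against the action documentation.

```
% Sketch only: replay a freshly recorded audio file.
% Assumes a playAudio/1 environment action (illustrative name).
if percept(on_audio_recording(File)) then playAudio(File).
```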

on_battery_charge_change(Charge)

The current percentage of battery charge left in the robot.

Example: on_battery_charge_change(100).
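Since this percept arrives only when the charge changes, agents typically cache the latest value as a belief. A sketch, where batteryLevel/1 is an illustrative agent-defined predicate, not part of the API:

```
% Sketch: keep the most recent charge percentage as a belief.
% batteryLevel/1 is an illustrative predicate, not part of the API.
if percept(on_battery_charge_change(New)), bel(batteryLevel(Old))
    then delete(batteryLevel(Old)) + insert(batteryLevel(New)).
```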

on_corona_check_passed

Sent by the corona checker service once a valid Dutch CoronaCheck QR code has been recognised in the video stream.

on_emotion_detected(Emotion)

An emotion was detected in the given image by the emotion detection service (see startWatching).

Example: on_emotion_detected('happy').

on_event(Event)

Either:

  • an event related to one of the actions above, i.e. one of [BreathingDisabled, BreathingEnabled, ClearLoadedAudioDone, EarColourDone, EarColourStarted, EyeColourDone, EyeColourStarted, GestureDone, GestureStarted, HeadColourDone, HeadColourStarted, LanguageChanged, ListeningDone, ListeningStarted, MemoryEntryStored, PlayAudioDone, PlayAudioStarted, PlayMotionDone, PlayMotionStarted, RecordMotionStarted, SetIdle, SetNonIdle, SetSpeechParamDone, TextDone, TextStarted, TurnDone, TurnStarted, UserDataSet, WatchingDone, WatchingStarted];
  • an event related to one of the robot's sensors, i.e. one of [BackBumperPressed, FrontTactilTouched, HandLeftBackTouched, HandLeftLeftTouched, HandLeftRightTouched, HandRightBackTouched, HandRightLeftTouched, HandRightRightTouched, LeftBumperPressed, MiddleTactilTouched, RearTactilTouched, RightBumperPressed] (see http://doc.aldebaran.com/2-5/family/robots/contact-sensors_robot.html);
  • an event originating from one of the services, i.e. one of [EmotionDetectionDone, EmotionDetectionStarted, FaceRecognitionDone, FaceRecognitionStarted, IntentDetectionDone, IntentDetectionStarted, MemoryEntryStored, PeopleDetectionDone, PeopleDetectionStarted, UserDataSet].

Example: on_event('TextDone').
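Because percepts are transient, a common pattern is to turn an event into a belief so that later rules can react to it. A minimal sketch of such a GOAL event-module rule; the event/1 belief predicate is agent-defined and illustrative:

```
% Sketch: remember that the robot finished speaking.
% event/1 here is an agent-defined belief, not the percept itself.
if percept(on_event('TextDone')) then insert(event(textDone)).
```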

on_face_recognized(Identifier)

A face was recognised by the face recognition service. The identifier is a unique number for the given face, starting from 0. The percept will be sent continuously as long as the camera is open and the face is recognised.

Example: on_face_recognized(12).

on_hot_device_detected(DeviceList)

One or more devices of the robot are (too) hot.

Example: on_hot_device_detected(['LLeg','RLeg']).

on_audio_intent(Intent,Params,Confidence,Text,Source)

An intent with the given name was detected by the Dialogflow service, possibly under the current context (see startListening). Note that the intent can also be an empty string, meaning no intent was matched (but some speech was still processed into text). The Params are an optional key-value list of all recognised entities in the intent. The Confidence is a value between 0 (no intent detected) and 100 (completely sure about the match). The Text is the raw text that Dialogflow generated from the speech input. Finally, the Source is one of ['audio', 'chat', 'webhook'].

Note that when an IntentDetectionDone event is sent, but no intent percept has arrived at that time, no (recognisable) speech was found by Dialogflow.

Example: on_audio_intent('answer_yesno', [], 100, "Yes", 'audio').
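Low-confidence matches are often ignored. The rule below sketches a confidence threshold in a GOAL event-module rule; the threshold value (60) and the detectedIntent/1 predicate are illustrative choices, not part of the API.

```
% Sketch: only accept a non-empty intent when Dialogflow is
% at least 60% confident (the threshold is an arbitrary choice).
% detectedIntent/1 is an illustrative agent-defined belief.
if percept(on_audio_intent(Intent, _Params, Conf, _Text, _Src)),
   bel(Conf >= 60, Intent \= '')
   then insert(detectedIntent(Intent)).
```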

is_awake(Awake)

The rest mode of the robot has changed: it is either resting (Awake = false) or it woke up (Awake = true). See the wakeUp and rest actions.

Example: is_awake(true).

is_charging(Charging)

The robot is plugged in (charging=true) or not (charging=false).

Example: is_charging(true).

on_robot_motion_recording(Recording)

The result of a startMotionRecording action, which can be fed into playMotion.

Example: on_robot_motion_recording([…]).

on_person_detected(X,Y)

Sent when the people detection service detects someone; the X and Y coordinates represent the (estimated) center of the person’s face in the image. The percepts will be sent continuously as long as the camera is open and someone is detected.
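The coordinates can drive simple gaze or turning behaviour. A sketch, assuming a 640-pixel-wide camera image (the actual resolution depends on the camera configuration) and an illustrative personSide/1 predicate:

```
% Sketch: classify which half of the (assumed 640px wide) image
% the detected face is in; personSide/1 is illustrative.
if percept(on_person_detected(X, _Y)), bel(X < 320)
   then insert(personSide(left)).
if percept(on_person_detected(X, _Y)), bel(X >= 320)
   then insert(personSide(right)).
```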

on_picture(Filename)

A new picture file is available; the filename is always hh-mm-ss.jpg and stored in the same folder as the currently running MAS2G (see startWatching and takePicture). Can be fed into the renderPage action (e.g. converted to base64).

Example: on_picture('12-00-00.jpg').

on_posture_changed(Posture)

The robot has taken the posture; see the goToPosture action.

Example: on_posture_changed('Stand').

on_text_sentiment(Sentiment)

Sent by the sentiment analysis service for each transcript; the sentiment is either 'positive' or 'negative'.

Example: on_text_sentiment('positive').

on_text_transcript(Text)

A quick, direct indication of the text spoken by an end user while intent detection is running. Note that the final text given in the intent percept might differ from the (possibly multiple) transcripts received first!

Example: on_text_transcript("Hey").

...