
Percept

Description

answer(Answer)

The text of a button that was pressed in the browser (see renderPage).

audioLanguage(LanguageKey)

A new audio language has been requested, possibly by something external like the browser (see setLanguage and renderPage).

Example: audioLanguage('en-US').

audioRecording(Filename)

A new recorded audio file is available; the filename is always hh-mm-ss.wav and stored in the same folder as the currently running MAS2G (see enableRecording and startListening). Can be fed into one of the PlayAudio actions.

Example: audioRecording('12-00-00.wav').

batteryCharge(Charge)

The current percentage of battery charge left in the robot.

Example: batteryCharge(100).

device(Type,Identifier)

Provides information about the selected devices when the agent starts. Type can be one of ['cam', 'mic', 'robot', 'speaker', 'browser', 'ga']. Identifier is the unique identifier for the device (as also visible when selecting the devices in the initial dialog).

Example: device('cam', 'default-12345').

emotionDetected(Emotion)

An emotion was detected in the current camera image by the emotion detection service (see startWatching).

Example: emotionDetected('happy').

event(Event)

One of three kinds of events:

  • an event related to some action above, i.e. one of [BreathingDisabled, BreathingEnabled, ClearLoadedAudioDone, EarColourDone, EarColourStarted, EyeColourDone, EyeColourStarted, GestureDone, GestureStarted, HeadColourDone, HeadColourStarted, LanguageChanged, ListeningDone, ListeningStarted, MemoryEntryStored, PlayAudioDone, PlayAudioStarted, PlayMotionDone, PlayMotionStarted, RecordMotionStarted, SetIdle, SetNonIdle, SetSpeechParamDone, TextDone, TextStarted, TurnDone, TurnStarted, UserDataSet, WatchingDone, WatchingStarted];
  • an event related to one of the robot's sensors, i.e. one of [BackBumperPressed, FrontTactilTouched, HandLeftBackTouched, HandLeftLeftTouched, HandLeftRightTouched, HandRightBackTouched, HandRightLeftTouched, HandRightRightTouched, LeftBumperPressed, MiddleTactilTouched, RearTactilTouched, RightBumperPressed] (see http://doc.aldebaran.com/2-5/family/robots/contact-sensors_robot.html);
  • an event originating from one of the services, i.e. one of [EmotionDetectionDone, EmotionDetectionStarted, FaceRecognitionDone, FaceRecognitionStarted, IntentDetectionDone, IntentDetectionStarted, MemoryEntryStored, PeopleDetectionDone, PeopleDetectionStarted, UserDataSet].

Example: event('TextDone').
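As an illustration only, such event percepts could be routed to handlers in agent-support code. A minimal Python sketch; the dispatch table and handler functions are hypothetical, and only the event names come from the lists above:

```python
# Minimal sketch: route incoming event(...) percept names to handlers.
# The handlers and dispatch mechanism are hypothetical; only the event
# names ('TextDone', 'ListeningStarted', ...) come from the lists above.

def on_text_done():
    return "text finished"

def on_listening_started():
    return "now listening"

HANDLERS = {
    "TextDone": on_text_done,
    "ListeningStarted": on_listening_started,
    # register further events from the lists above as needed
}

def dispatch(event_name):
    """Call the handler for event_name, or report it as unhandled."""
    handler = HANDLERS.get(event_name)
    if handler is None:
        return f"unhandled event: {event_name}"
    return handler()
```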

faceRecognized(Identifier)

A face was recognised by the face recognition service. The identifier is a unique number for the given face, starting from 0. The percept will be sent continuously as long as the camera is open and the face is recognised.

Example: faceRecognized(12).
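Because this percept repeats for as long as the camera is open, agent-side code typically needs to de-duplicate it. A minimal Python sketch; the greeting behaviour is a hypothetical example, not part of the framework:

```python
# Sketch: react to each recognised face only once, even though the
# faceRecognized percept is sent continuously while the camera is open.
# Identifiers start from 0, per the description above.

seen = set()

def on_face_recognized(identifier):
    """Return a greeting the first time a face id is seen, else None."""
    if identifier in seen:
        return None
    seen.add(identifier)
    return f"Hello, face {identifier}!"
```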

hotDevices(DeviceList)

One or more devices of the robot are (too) hot.

Example: hotDevices(['LLeg','RLeg']).

intent(Intent,Params,Confidence,Text,Source)

An intent with the given name was detected by the Dialogflow service, possibly under the current context (see startListening). Note that the intent can also be an empty string, meaning no intent was matched (but some speech was still processed into text). Params is an optional key-value list of all recognised entities in the intent. Confidence ranges from 0 (no intent detected) to 100 (completely sure about the match). Text is the raw text that Dialogflow generated from the speech input. Finally, Source is one of ['audio', 'chat', 'webhook'].

Note that when an IntentDetectionDone event is sent, but no intent percept has arrived at that time, no (recognisable) speech was found by Dialogflow.

Example: intent('answer_yesno', [], 100, 'Yes', 'audio').
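A sketch, in Python, of how these five fields might be interpreted on the agent side; the confidence threshold and the returned labels are assumptions, not part of the framework:

```python
# Interpret the fields of an intent percept. The 50-point confidence
# threshold and the returned labels are hypothetical; the meaning of an
# empty intent string and the 0-100 confidence range are from the docs.

def interpret_intent(intent, params, confidence, text, source):
    if intent == "":
        # No intent matched, but speech was still processed into text.
        return ("no_match", text)
    if confidence < 50:  # hypothetical threshold on the 0-100 scale
        return ("low_confidence", intent)
    return ("matched", intent)
```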

isAwake(Awake)

The rest mode of the robot has changed: it is either resting (Awake=false) or it woke up (Awake=true). See the wakeUp and rest actions.

Example: isAwake(true).

isCharging(Charging)

The robot is either plugged in (Charging=true) or not (Charging=false).

Example: isCharging(true).

loadedAudioID(Identifier)

The audio given in the latest loadAudio action has been preloaded and assigned the given identifier.

Example: loadedAudioID(0).

memoryData(Key,Value)

A response to a getUserData action.

Example: memoryData('something', 'blabla').

motionRecording(Recording)

The result of a startMotionRecording action, which can be fed into playMotion.

Example: motionRecording([…]).

personDetected

Sent when the people detection service detected someone. The percept will be sent continuously as long as the camera is open and a person is detected.

picture(Filename)

A new picture file is available; the filename is always hh-mm-ss.jpg and stored in the same folder as the currently running MAS2G (see startWatching and takePicture). Can be fed into the renderPage action (e.g. converted to base64).

Example: picture('12-00-00.jpg').
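Since the description mentions converting the picture to base64 for renderPage, a minimal Python sketch of that conversion; the data-URI form is an assumption about how the image would be embedded in the rendered HTML:

```python
# Read a picture file produced by takePicture/startWatching and encode
# it as a base64 data URI, e.g. for an <img> tag in HTML passed to
# renderPage. The data-URI embedding is an assumption.
import base64

def picture_to_data_uri(filename):
    with open(filename, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("ascii")
    return f"data:image/jpeg;base64,{encoded}"
```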

posture(Posture)

The robot has taken the posture; see the goToPosture action.

Example: posture('Stand').

stiffness(Stiffness)

A number indicating the current average stiffness in the robot’s body. 0: less than 0.05 average, 1: between 0.05 and 0.95 average, 2: above 0.95 average. See the setStiffness action.

Example: stiffness(1).
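The three buckets can be sketched as a small Python function; the handling of the exact boundary values (0.05 and 0.95) is an assumption, since the description leaves them open:

```python
# Map an average joint stiffness (0.0-1.0) to the bucket reported by the
# stiffness percept, per the thresholds in the description above.
# Boundary values (exactly 0.05 or 0.95) are assumed to fall in bucket 1.

def stiffness_bucket(average):
    if average < 0.05:
        return 0
    if average <= 0.95:
        return 1
    return 2
```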

transcript(Text)

A quick, direct indication of the text spoken by a user while intent detection is running; note that the final text given in the intent percept may differ from the (possibly multiple) transcripts received before it.

Example: transcript('Hey').