Group 1 Logbook

Week 1

Tuesday 11/01/22 1:30PM-4:30PM

  • Connected the Pepper robot to our computers and brainstormed ideas for personalized greetings

  • Worked on gesture examples

Thursday 13/01/22 Evening

  • Made schedules for the following week's labs

  • Brainstormed ideas for personalized greetings taught by human teachers

Friday 14/01/22 1:00-4:00PM

  • Tried different modules from the MediaPipe library, including hand gestures, object detection, and gesture recognition

Tuesday 11/01/22 1:30PM-4:30PM

We went to the lab and worked on connecting the Pepper robot to Yi's computer, and the connection succeeded. We ran some basic gesture examples on the Pepper robot, for example the handshake gesture and the basic_connector_example, my_connector_example, and guesture_example Python files. We also tried to connect Hong's computer to the Pepper robot, but her C: drive is nearly full, so she is going to move some files off it so that Docker can run properly and her computer can connect to the Pepper robot.

We brainstormed ideas for learning personalized social greetings from human teachers. A few good ideas came up: for example, a Japanese bow to elders, and different types of greetings from different cultures.

Thursday 13/01/22 Evening

Maike, Hong, and Yi formed a team and had a meeting on Thursday evening on WhatsApp. We brainstormed ideas for teaching Pepper personalized social greetings. During the meeting we agreed to start with two social greeting gestures, a high five for children and a wave for older people, and we may add further personalized greetings later.

Friday 14/01/22 1:00-4:00PM Lab session:

In today's lab we connected Hong's computer to the Pepper robot and ran a few of Pepper's gesture examples. Maike, Hong, and Yi discussed which MediaPipe modules to use to take input from different users when teaching the Pepper robot. We tried the Hands module from MediaPipe, planning to use different hand gestures as input so that Pepper can distinguish users and perform different actions, but it did not work yet. We also tried MediaPipe's object detection module to detect different objects with the camera.
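
For reference, here is a minimal sketch of how the MediaPipe Hands module can read hand landmarks from a webcam; the classify_gesture helper is a hypothetical placeholder for mapping landmarks to a taught gesture, not part of our working code.

```python
# Minimal sketch: read hand landmarks with MediaPipe Hands from a webcam.
# classify_gesture() is a hypothetical placeholder, not our actual code.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def classify_gesture(hand_landmarks):
    # A real classifier would compare landmark positions
    # (e.g. fingertip vs. knuckle heights) to decide "high five", "fist", etc.
    return "unknown"

cap = cv2.VideoCapture(0)  # laptop webcam
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB images; OpenCV delivers BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand_landmarks in results.multi_hand_landmarks:
                print(classify_gesture(hand_landmarks))
        cv2.imshow("hands", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```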

Challenges and obstacles: We are having difficulty getting the different MediaPipe modules, as well as other libraries, to work for reinforcement learning and machine learning on the Pepper robot, and we are trying to figure that out. We also tried to use Choregraphe to perform the wave and high-five gestures on Pepper, but after installing Choregraphe an error appears when we open the software, so we are trying to install the binary version of Choregraphe instead.

Week 2

Monday 17/01/22 1:30PM-5:00PM

  • Decided to focus only on high fives, with a possible extension to fist bumps

  • Connected the Pepper robot to Maike's computer

  • Ran and experimented with the socket-connector files

  • Experimented with Pepper's visuals and coordinates

Tuesday 18/01/2022 1:30PM-5PM

  • Installed socket_connection successfully

  • Tried the hand_pose_test and body_pose_test modules with the socket connection

Wednesday 19/01/2022 1:30PM-5PM

  • Worked on the pepper_kinematics module

  • Worked on the record_motion module

Saturday 22/01/2022 1PM-9PM

  • Worked on the age and gender detection modules in the OpenCV library

  • Considered implementing the age_detection module on the Pepper robot

Monday 17/01/22 1:30PM-5:00PM

Feedback from the presentation:

  • Too many ideas: focus on one.

  • Maybe do only high fives on the y-axis; easy to expand into fist bumps.

  • Feedback: higher or lower (nonverbal).

  • This week: explore the visual space where the robot's hand meets the user's hand.

  • Height might not be the best indicator.

  • By the end of the week: decide who initiates, the robot or the user.

  • What will we teach? Hand matching.

  • Our focus is going to be coordination.

We experimented with Pepper's visuals and coordinates. The x-axis goes from 0 to 1.

Tuesday 18/01/2022 1:30PM-5PM

In Tuesday's lab we installed socket_connection on each of our computers successfully and tried hand_pose_test and body_pose_test with the socket connection. We tried to have Pepper initiate the greeting first and wrote code for it, but it did not work.

Obstacles and challenges: It seems more realistic to have the human user initiate the greeting first, and to use Pepper's real-time camera to identify the user's age and differentiate user groups.

Wednesday 19/01/2022 1:30PM-5PM

On Wednesday afternoon we had a Zoom meeting with Professor Kim, who suggested using the open-source library Python_pepper_kinematics by Ysuga. The code was written six years ago; we tried to install and run it, but there were many errors. We discussed this with the TAs and found out that the code may not work properly. We are now considering using the record_motion function from the SIC framework and trying record_and_play_motion_example.py.

Obstacles and challenges: How to record a motion and then play it back using the record_and_play_motion_example.py file.

Saturday 22/01/2022 1PM-9PM:

On Saturday I worked on learning about computer vision libraries such as OpenCV and its age_and_gender_detection functions. I ran real-time age and gender detection with my laptop camera to see how well it works and how accurate the predictions are. It basically works well as long as I stay close to the camera, but sometimes when I turn my head it does not recognize my gender correctly; once I face the camera again, it predicts correctly. I was thinking about how to run age and gender detection on the Pepper robot's camera.
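
As a reference for what I was experimenting with, here is a minimal sketch of real-time age and gender prediction with OpenCV's DNN module, assuming the widely used pre-trained Caffe models; the model file names (deploy_age.prototxt, age_net.caffemodel, and so on) are assumptions based on the standard tutorial setup, not files from our project.

```python
# Sketch of real-time age/gender prediction with OpenCV's DNN module.
# The Caffe model file names below are assumptions (standard tutorial models),
# not files from our project repository.
import cv2

AGE_BUCKETS = ["(0-2)", "(4-6)", "(8-12)", "(15-20)", "(25-32)", "(38-43)", "(48-53)", "(60-100)"]
GENDERS = ["Male", "Female"]

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
age_net = cv2.dnn.readNetFromCaffe("deploy_age.prototxt", "age_net.caffemodel")
gender_net = cv2.dnn.readNetFromCaffe("deploy_gender.prototxt", "gender_net.caffemodel")

cap = cv2.VideoCapture(0)  # laptop webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face = frame[y:y + h, x:x + w]
        blob = cv2.dnn.blobFromImage(face, 1.0, (227, 227),
                                     (78.43, 87.77, 114.90), swapRB=False)
        gender_net.setInput(blob)
        gender = GENDERS[gender_net.forward()[0].argmax()]
        age_net.setInput(blob)
        age = AGE_BUCKETS[age_net.forward()[0].argmax()]
        # Draw the prediction near the detected face.
        cv2.putText(frame, f"{gender}, {age}", (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("age/gender", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```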

Obstacles and challenges: Getting Pepper's camera to run age and gender detection; solving the startup error in Choregraphe; and loading a spoken greeting in Korean after detecting the age and gender of different groups of human users.

Week 3

Monday 24/01/2022 1:30PM-3PM

  • Presentation

  • Discussed and learned what to improve

Tuesday 25/01/2022 1:30PM-5PM

  • Worked on implementing age detection with Pepper's camera

  • Worked on Choregraphe

Wednesday, Thursday 26/01/2022-27/01/2022

  • Resolved errors in the Choregraphe software at startup

Friday 28/01/2022 1:30PM-5PM

  • Worked in Choregraphe on the age_detection and gender_detection modules, the motion_record function, and various other functions

Monday 24/01/2022 1:30PM-3PM:

I gave a presentation in Monday's plenary session, presented the current progress, obstacles, and challenges, and discussed in detail what to improve and what to add to my assignment.

Obstacles and challenges: Choregraphe gives errors when starting up on Windows, and I need to find a way to solve them. I am also trying to figure out how to run age and gender detection on Pepper's camera, how to use the age and gender detection results as input and produce greetings as output for the different user types among Pepper's human teachers, and how to add scikit-learn and a decision tree to the greeting models.

Tuesday 25/01/2022 1:30PM-5PM Lab:

I worked on combining Age_Gender_detection and socket_connection so that Pepper's camera can perform age and gender detection, but it did not work out.

Obstacles and challenges: How to use Pepper's camera for age and gender detection so that data from different types of human teachers can drive different greetings.

Wednesday and Thursday 26/01/2022-27/01/2022, at home:

I spent a lot of time resolving the Choregraphe errors and finally fixed them by deleting some entries in the Registry Editor on Windows. Choregraphe now works properly.

Friday 28/01/2022 1:30PM-5PM Lab session:

I worked in Choregraphe on age_detection, gender_detection, move_along, record_motion, and various other functions, and tested them on the Pepper robot, but age detection and gender detection do not work properly yet.

Obstacles and challenges: I will spend time on YouTube tutorials about age and gender detection and about the motion_record function. I will also try to work on Python code combining age_detection and socket_connection to perform age detection on the Pepper robot.

Week 4

Tuesday 01/02/2022 Lab 12PM-5PM:

  • Implemented 3 scenarios of Pepper’s greetings

Wednesday 02/02/2022 Lab 11:45AM-5pm:

  • Added human teaching elements to the Pepper robot's greetings

  • Troubleshooting

Thursday 03/02/2022 1:30PM-10PM:

  • Made slides and prepared for the final presentation

Friday 04/02/2022 1PM-3PM:

  • Final Presentation and Demo

Tuesday 01/02/2022 Lab 12PM-5PM:

In Tuesday's lab I implemented three scenarios of Pepper's greetings using Choregraphe. The first uses the get_age function to have Pepper guess the human user's age and perform a different greeting for different groups of users: for example, Pepper gives a high five to a user younger than 30 years old and waves to a user older than 30. I changed the age threshold from 18 to 30 because we cannot find human teachers who are kids; realistically it is much easier to find human teachers who are over 20 years old, so I redesigned the age threshold as 30 so that Pepper can differentiate human users and greet them differently.
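
As a plain Python illustration of this first scenario's decision logic (the names below are illustrative, not the actual Choregraphe boxes):

```python
# Illustrative decision logic for the first scenario:
# pick a greeting based on the estimated age, with a threshold of 30.
HIGH_FIVE = "high_five"
WAVE = "wave"

def choose_greeting(estimated_age, threshold=30):
    """Return the greeting Pepper should perform for the estimated age."""
    return HIGH_FIVE if estimated_age < threshold else WAVE

# Example: a 24-year-old gets a high five, a 45-year-old gets a wave.
assert choose_greeting(24) == HIGH_FIVE
assert choose_greeting(45) == WAVE
```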

I also designed the movements for three greeting gestures using Pepper's timeline function: the first is a high five, the second a wave, and the third a fist bump. I also used the basic_awareness function so that a human acts as a stimulus for the Pepper robot.

Obstacles and challenges: I want to add some personalization and a learning process so that the Pepper robot can learn from human teachers during the greeting process. I have consulted the TAs about this.

Wednesday 02/02/2022 Lab 11:45AM-5pm:

During Wednesday's lab I added teaching/learning elements so that the Pepper robot can learn from human teachers, because in quite a few cases Pepper guesses the user's age wrongly. In this scenario I introduced a personalization/teaching/learning process in which Pepper learns from human users and corrects its greeting: when Pepper guesses the age wrongly, the human teacher can tell Pepper that they are younger or older than Pepper's guess, and Pepper then corrects its greeting accordingly. I also did troubleshooting on Pepper's greeting process to fix some errors in the gestures, the greeting flow, and the speech sentences.
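
A small sketch of this correction idea, assuming the teacher's feedback is simply "younger", "older", or "correct" relative to Pepper's guess (the names are illustrative, not the Choregraphe implementation):

```python
# Illustrative correction step: the human teacher says whether they are
# younger or older than Pepper's guess, and Pepper adjusts the greeting
# it chose with the age threshold of 30 from the first scenario.
def corrected_greeting(guessed_age, feedback, threshold=30):
    """feedback is 'younger', 'older', or 'correct' relative to Pepper's guess."""
    if feedback == "younger":
        return "high_five"   # treat the user as below the threshold
    if feedback == "older":
        return "wave"        # treat the user as above the threshold
    return "high_five" if guessed_age < threshold else "wave"

# Example: Pepper guessed 35 (wave), the teacher says they are younger,
# so Pepper corrects to a high five.
assert corrected_greeting(35, "younger") == "high_five"
```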

Obstacles and challenges: I have asked the professor and TAs what I need to improve further in my project's Pepper greetings, and I am waiting for a reply so I can keep improving it during the labs on Thursday and Friday.

Thursday 03/02/2022 1:30PM-10PM:

During Thursday afternoon and evening I spent time making slides for the final presentation and preparing for it. I asked the TAs some questions about the simulation and about what to add to my presentation. I also uploaded the code to GitHub and spent time writing the logbook.

Obstacles and challenges: I asked the TAs what I should improve in my presentation slides, what structure works well, and what other content I should add, and I got very constructive suggestions from them.

Friday 04/02/2022 1PM-3PM:

I am going to give the final presentation and a demo with the physical Pepper robot during class.

Reflection and future work: I think it would be great to add face recognition so that Pepper can remember people and perform the same greeting for the same person, and to improve the teaching and learning process so Pepper guesses ages more accurately. It would also help to manage time more efficiently, so the project stays on schedule and the planned ideas are implemented better.