Human Robot Interaction System

The Robot-Receptionist project was initiated at Information Technology University (ITU) as an effort to create a user-friendly, intelligent robot that assists people. The robot uses computer vision and artificial intelligence to detect hand gestures, recognize faces and facial expressions, and generate its own daily stories and events for people of different mindsets. Visitors can use the robot to get information about ITU employees, or play interactive games if they are feeling bored. The main objective of this project is to build a robot that not only assists people as a receptionist but can also listen to them, understand them and respond to them accordingly. This distinct feature is intended to put users at ease and reduce the everyday stress of the general public.

Social Robotics and Human Robot Interaction

Human behaviour has been an interesting area of research for centuries; however, the trends and techniques for understanding human responses have changed with time. With the emergence of robotics, scientists all over the world have been keen to study how humans behave around robots. The robot receptionist is deployed at the main gate of Arfa Technology Park, and hundreds of visitors interact with the robot for various reasons. The interaction records are used to analyse and study human-robot interaction, and different experiments are performed to draw conclusions.

Robot Receptionist Features

The receptionist robot uses mid-air input to interact with users. The robot gives instructions and relevant information to users and also interacts with them according to the situation. The prototype talks to people in Urdu and attracts a large audience. The mid-air input functionalities include:

Cursor Control: The cursor is controlled remotely using hand detection and tracking. A grip gesture is used to click at the current position.

Highlight gripping: The currently selected button is highlighted based on the coordinates of the detected hand and the user's easy-to-reach region. A grip gesture is used to click the selected button.

Swipe: A swipe gesture is used to navigate through the options and buttons.
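The mid-air input logic above can be sketched in a few lines. This is an illustrative sketch only, not the robot's actual code: it assumes some hand tracker (e.g. a depth camera SDK) supplies normalized hand coordinates in the range 0..1, and the threshold and screen-size values are invented for the example.

```python
# Illustrative mid-air input helpers. Assumes an external hand tracker
# provides normalized (0..1) hand coordinates each frame.

SCREEN_W, SCREEN_H = 1920, 1080   # assumed display resolution
SWIPE_THRESHOLD = 0.25            # assumed: fraction of frame width

def hand_to_cursor(hx, hy):
    """Map normalized hand coordinates to pixel cursor coordinates."""
    return int(hx * SCREEN_W), int(hy * SCREEN_H)

def detect_swipe(x_history):
    """Classify a left/right swipe from a short history of hand x positions."""
    if len(x_history) < 2:
        return None
    dx = x_history[-1] - x_history[0]
    if dx > SWIPE_THRESHOLD:
        return "right"
    if dx < -SWIPE_THRESHOLD:
        return "left"
    return None
```

A grip gesture would then act as the "click" event at whatever cursor position `hand_to_cursor` currently reports.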

The robot has the capability to lip-sync all Urdu characters while interacting with users. Work on mood detection is currently in progress. The robot also offers games played with hand gestures if the user is bored. The nature of the project revolves around computer vision, human-robot interaction, social robotics and affective computing.

Face Detection:

Faces are detected using the Viola-Jones algorithm, a classic and widely used object-detection method. The algorithm is trained on many sample pictures of the target object under different conditions and learns to recognize it, much like a child learns to see and name things.
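A key reason Viola-Jones is fast is the integral image (summed-area table), which lets any rectangular Haar-like feature be evaluated in constant time. A minimal sketch of that core idea:

```python
# Core of Viola-Jones speed: the integral image (summed-area table).
# ii[y][x] holds the sum of all pixels at or above-left of (x, y),
# so any rectangle sum costs at most four lookups.

def integral_image(img):
    """Build the summed-area table for a 2D list of pixel intensities."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, x0, y0, x1, y1):
    """Sum of pixels in the inclusive rectangle (x0,y0)-(x1,y1) in O(1)."""
    total = ii[y1][x1]
    if x0 > 0:
        total -= ii[y1][x0 - 1]
    if y0 > 0:
        total -= ii[y0 - 1][x1]
    if x0 > 0 and y0 > 0:
        total += ii[y0 - 1][x0 - 1]
    return total
```

In practice a ready-made implementation such as OpenCV's `cv2.CascadeClassifier` with a pretrained frontal-face Haar cascade is the usual way to run Viola-Jones detection.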


Facial Expression Recognition:

The detected faces are processed using the Chehra 3D Head Pose Estimator for Matlab [1] and the Chehra Matlab Fitting Model [1] to estimate head pose and extract facial landmarks. These landmarks are tracked and fed, along with the pitch, yaw and roll values, into a learning model that predicts the user's basic expressions.
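The feature-building step described above, combining landmark geometry with head pose, can be sketched as follows. The specific feature choice (distances from a reference landmark) is an assumption for illustration; the actual model's features are not specified in the source.

```python
import math

def landmark_features(landmarks, pitch, yaw, roll):
    """Build a feature vector from facial landmarks plus head pose.

    landmarks: list of (x, y) points from a fitter such as Chehra.
    The distance-from-first-landmark encoding is an illustrative choice,
    not necessarily the one used by the robot's expression model.
    """
    x0, y0 = landmarks[0]
    dists = [math.hypot(x - x0, y - y0) for x, y in landmarks[1:]]
    return dists + [pitch, yaw, roll]
```

A classifier (e.g. an SVM or small neural network) trained on such vectors would then output one of the basic expression labels per frame.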



Lip Sync:

A customized algorithm takes an input string and generates an ordered array of viseme images together with a parallel array of durations. Lip-sync is created by playing these two arrays alongside the robotic voice from the text-to-speech module.
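The preprocessing step above can be sketched as a simple table lookup. The character-to-viseme table and durations here are invented for illustration; the real system maps Urdu characters to their viseme images.

```python
# Sketch of lip-sync preprocessing: turn an input string into parallel
# arrays of viseme images and display durations (milliseconds).
# The table below is hypothetical; the robot's actual table covers Urdu.

VISEME_TABLE = {
    "a": ("open.png", 120),
    "b": ("closed.png", 80),
    "m": ("closed.png", 100),
}
DEFAULT_VISEME = ("rest.png", 60)

def text_to_visemes(text):
    """Return (images, durations) arrays for the given input string."""
    images, durations = [], []
    for ch in text.lower():
        img, ms = VISEME_TABLE.get(ch, DEFAULT_VISEME)
        images.append(img)
        durations.append(ms)
    return images, durations
```

At playback time each image is shown for its duration while the text-to-speech audio for the same string plays.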


Hand Gesture game:

A puzzle game was implemented in which the user arranges the puzzle pieces using swipe gestures.
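One plausible way to drive such a puzzle with swipes is a sliding-puzzle move: a swipe shifts a piece into the empty slot. This board representation and move rule are assumptions for illustration, since the source does not detail the game's mechanics.

```python
# Hypothetical sliding-puzzle move driven by swipe direction.
# Board: 3x3 grid stored as a flat list, with 0 marking the empty slot.

COLS = 3
MOVES = {"left": -1, "right": 1, "up": -COLS, "down": COLS}

def apply_swipe(board, direction):
    """Swap the empty slot with its neighbour in the swiped direction."""
    empty = board.index(0)
    target = empty + MOVES[direction]
    if not 0 <= target < len(board):
        return board  # move would leave the board: ignore
    if direction in ("left", "right") and target // COLS != empty // COLS:
        return board  # horizontal move would wrap to another row: ignore
    board = board[:]
    board[empty], board[target] = board[target], board[empty]
    return board
```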


Interactive Pen Game:

The same game can also be controlled with a pen using color detection. The selectable puzzle piece is highlighted by estimating the region of a specific color. Pressing a button on the pen illuminates it in a different color, which is used to select the highlighted piece and move it in the required direction.
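The color-region estimation can be sketched as thresholding pixels near a target color and taking the centroid of the matching region. A real implementation would more likely use OpenCV (`cv2.inRange` plus `cv2.moments`) on HSV frames; this pure-Python version, with an invented tolerance, just shows the idea on a tiny RGB image.

```python
# Sketch of pen-tip color detection: find all pixels within a tolerance of
# the target RGB color and return the centroid of that region.

def color_centroid(img, target, tol=30):
    """img: 2D list of (r, g, b) tuples. Returns (x, y) centroid or None."""
    tr, tg, tb = target
    xs, ys = [], []
    for y, row in enumerate(img):
        for x, (r, g, b) in enumerate(row):
            if abs(r - tr) <= tol and abs(g - tg) <= tol and abs(b - tb) <= tol:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)
```

Tracking this centroid over frames gives the pen position; a change to the button-illuminated color would trigger the "select" action.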


Research Publication

Topic: Designing Robot Receptionist for Overcoming Poor Infrastructure, Low Literacy and Low Rate of Female Interaction
Authors: Talha Rehmani, Sabur Butt, Inam-ur-Rehman Baig, Mohammad Zubair Malik, Mohsen Ali
Conference: HRI Companion 2018, Chicago