Human & Robot Interaction Lab. (TaarLab)


Advancements in robotics have enabled robotic systems to go beyond their classical restrictions to the point where they are not only a viable replacement for a human workforce, but can also act alongside or in direct interaction with human beings. One of the resulting focus areas of this development is Human-Robot Interaction (HRI), e.g., collaborative robots that work with human workers on the floor of large-scale factories, elderly-care robots, service robots, personal robots, and even robotic pets. As a result, the HRI discipline will generate new standards to ensure safe interaction between robotic systems and human beings on a variety of levels, from physical to emotional.

TaarLab possesses a number of robotic platforms, and its researchers have already gained significant experience with Parallel Mechanisms (PMs). In the near future we plan to pursue new approaches in the HRI field, covering a wide range of topics while placing particular emphasis on physical HRI (pHRI). Some of the topics of interest to us are listed below:

 

HRI with the Tripteron Robot

People Involved
   Mohammad Sharifzadeh, Azadeh Droudchi

For the Tripteron, a decoupled parallel robot performing three translational motions, the main goal in the HRI field is the construction of an HRI interface mounted on the end-effector, in which the magnitude and direction of the force applied by the user's hand is sensed, processed, and sent to a computer. This information is the main input of the dynamic control loops, and the servo motors are driven in such a way that the end-effector moves in the desired direction, with a speed directly related to the magnitude of the applied force. The main sensor in this interface is a load cell, whose strain gauges change resistance under the applied stress. The construction of the HRI interface, as a mechatronics problem, involves mechanical design and construction, design and construction of an amplifier and filter circuit, use of microcontrollers as pre-processors, data transmission through a Bluetooth module, and the main data processing in a Qt application.
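
As a rough illustration of this force-to-motion mapping, the sketch below converts a filtered three-axis load-cell reading into a Cartesian velocity command for the end-effector. The gain, dead-band, and speed limit are illustrative assumptions, not the actual TaarLab tuning.

```cpp
#include <algorithm>
#include <array>
#include <cmath>

// Hypothetical sketch: map the filtered load-cell force vector (N) measured at
// the Tripteron end-effector to a Cartesian velocity command (m/s).
std::array<double, 3> forceToVelocity(const std::array<double, 3>& force)
{
    constexpr double kGain     = 0.05;  // m/s per Newton (assumed)
    constexpr double kDeadband = 1.0;   // ignore forces below 1 N (sensor noise)
    constexpr double kMaxSpeed = 0.25;  // saturate the command at 0.25 m/s

    std::array<double, 3> velocity{0.0, 0.0, 0.0};

    const double magnitude = std::sqrt(force[0] * force[0] +
                                       force[1] * force[1] +
                                       force[2] * force[2]);
    if (magnitude < kDeadband)
        return velocity;                // hand is not pushing: stay still

    const double speed = std::min(kGain * (magnitude - kDeadband), kMaxSpeed);
    for (int i = 0; i < 3; ++i)
        velocity[i] = speed * force[i] / magnitude;  // move along the push direction

    return velocity;
}
```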


An Experimental Study on Blinking Detection via EEG Signals for Human-Robot Interaction Purposes Based on a Spherical 2-DOF Parallel Robot

People Involved
   Alaleh Arian

Blinking and eye movement are among the most important abilities that most people retain, even people with spinal cord injuries. By using this ability, these people can handle some of their activities, such as moving their wheelchair, without the help of others. One of the most important fields in Human-Robot Interaction is the development of artificial limbs driven by brain signals. The purpose of this project is to detect blinking and left and right eye movements via electroencephalogram (EEG) signals, which is also useful for detecting sleepiness. Moving-window analysis, the Fast Fourier Transform, and the Wavelet Transform are the methods used in this project. The obtained results reveal that the most reliable method for recognizing blinking and eye movements is the Fast Fourier Transform, since it is almost insensitive and invariant to the defined variables. After recognition, blinks and eye movements are transferred as commands to a spherical 2-DOF parallel robot, first built at Laval University.
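
For illustration, the sketch below shows one way such a frequency-domain blink detector could look: the power of a moving window of EEG samples in a low-frequency band is computed and thresholded. A naive DFT stands in for a real FFT so the example has no external dependencies; the window length, band limits, and threshold are assumptions, not the values used in this study.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical frequency-domain blink detector: returns true when the
// low-frequency power of the current EEG window exceeds a threshold.
bool detectBlink(const std::vector<double>& window, double sampleRateHz)
{
    constexpr double kPi = 3.14159265358979323846;
    const std::size_t n  = window.size();        // e.g. a 1-second moving window
    const double binHz   = sampleRateHz / n;

    double bandPower = 0.0;
    // Blink artefacts appear as large low-frequency components (roughly 0.5-5 Hz).
    for (std::size_t k = 1; k < n / 2; ++k) {
        const double freq = k * binHz;
        if (freq < 0.5 || freq > 5.0)
            continue;
        double re = 0.0, im = 0.0;
        for (std::size_t t = 0; t < n; ++t) {
            const double angle = 2.0 * kPi * k * t / n;
            re += window[t] * std::cos(angle);
            im -= window[t] * std::sin(angle);
        }
        bandPower += (re * re + im * im) / (double(n) * double(n));
    }

    constexpr double kThreshold = 50.0;          // assumed, depends on electrode and gain
    return bandPower > kThreshold;
}
```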


Real-time Tracking of an Object by a 2-DOF Spherical Parallel Robot with a GUI in Linux

People Involved
   Milad Eyvazi Hesar

In this research, using the model identified with the common online recursive least squares (RLS) method, two types of controllers, a PID and a sliding mode controller, were applied to the 2-DOF Agile Eye to track a ball. Common image-processing algorithms (for color filtering, noise reduction, and so on) were also employed to make ball detection robust in a shared workspace containing noise and colors relatively close to that of the ball, as well as the constraints imposed by the camera and the lighting in the environment. Furthermore, an autoregressive (AR) model was used to predict the ball's movement a few steps ahead, reducing the latency of the response and approaching a relatively real-time system. Moreover, careful selection of existing C++ scripts and OpenCV libraries saved time and avoided reinventing the wheel. Designing a GUI for this work was another challenge, which ended up as a simple interface that displays all the events together. The results of this study were compared and published at ICROM 2014.
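
The snippet below is a minimal sketch of the color-filtering stage of ball detection with OpenCV, assuming HSV thresholding followed by morphological noise reduction and a centroid computed from image moments; the camera index, color bounds, and pixel-count threshold are placeholders rather than the parameters used in the study.

```cpp
#include <opencv2/opencv.hpp>

int main()
{
    cv::VideoCapture cap(0);                        // camera index is an assumption
    if (!cap.isOpened()) return 1;

    cv::Mat frame, hsv, mask;
    while (cap.read(frame)) {
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        cv::inRange(hsv, cv::Scalar(5, 120, 120),    // lower HSV bound (assumed, orange ball)
                         cv::Scalar(20, 255, 255),   // upper HSV bound (assumed)
                         mask);
        cv::erode(mask, mask, cv::Mat());            // suppress speckle noise
        cv::dilate(mask, mask, cv::Mat());

        const cv::Moments m = cv::moments(mask, true);
        if (m.m00 > 100.0) {                         // enough pixels: ball found
            const cv::Point centre(int(m.m10 / m.m00), int(m.m01 / m.m00));
            cv::circle(frame, centre, 5, cv::Scalar(0, 255, 0), -1);
        }
        cv::imshow("ball tracking", frame);
        if (cv::waitKey(1) == 27) break;             // Esc to quit
    }
    return 0;
}
```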


Development of a Glove for Hand-Gesture Interaction with Mechatronic Devices Based on Sensor Fusion and Fuzzy Logic

People Involved
   Arya Saboury

To control robotic and mechatronic systems, a special glove has been designed that can control robots by moving only two fingers and the wrist. Cheap, open-source technology has been used in the design of the TL-HCG (TaarLab Hand-Controlled Glove).

Easy and innovative control of robots and other actuators, without the need for auxiliary input devices such as joysticks, is one of the advantages of this glove.

This control approach, with its ability to eliminate several hardware interfaces and its real-time performance, enables new ideas and products across various industries for human-robot interaction. The performance of the glove has been tested on the TL-OSR (TaarLab Open-Source Robots), where the robot is controlled by hand gestures using sensor data fusion and fuzzy logic. These techniques can be used to help disabled people, in treatment for children with autism, or to build new control systems for mechatronic devices.
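
As a minimal sketch of the fuzzy-logic idea (not the TL-HCG implementation), the code below fuzzifies a normalized finger-bend reading with three triangular membership functions and defuzzifies the rule outputs into a speed command; the breakpoints and output speeds are assumed values.

```cpp
#include <algorithm>
#include <array>

// Triangular membership function with breakpoints a < b < c.
double triangular(double x, double a, double b, double c)
{
    if (x <= a || x >= c) return 0.0;
    return (x < b) ? (x - a) / (b - a) : (c - x) / (c - b);
}

// Map a normalised finger bend (0 = straight, 1 = fully bent) to a speed command.
double fingerBendToSpeed(double bend)
{
    bend = std::clamp(bend, 0.0, 1.0);

    // Fuzzify: how strongly the bend is "slight", "moderate" or "strong".
    const double slight   = triangular(bend, -0.5, 0.0, 0.5);
    const double moderate = triangular(bend,  0.0, 0.5, 1.0);
    const double strong   = triangular(bend,  0.5, 1.0, 1.5);

    // Rule outputs (assumed): slight -> stop, moderate -> slow, strong -> fast.
    constexpr std::array<double, 3> speeds{0.0, 0.1, 0.3};   // m/s

    // Defuzzify with a weighted average of the rule outputs.
    const double weightSum = slight + moderate + strong;
    if (weightSum < 1e-9) return 0.0;
    return (slight * speeds[0] + moderate * speeds[1] + strong * speeds[2]) / weightSum;
}
```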


Design and Implementation of a 3-DOF (Rotational) Platform for Robot Testing

People Involved
   Alireza Safaryazdi

Any robot, once built and equipped with a control system, must be tested in a standard environment.

This platform allows users to test their robots and improve their capabilities in the best way. It permits researchers to test their robots in three rotational degrees of freedom, known as roll, pitch, and yaw.

Low-cost production is one of this mechanism's features. Researchers can use any of the rotational degrees of freedom individually or combine them with each other, for example roll, pitch, and yaw independently, or combinations such as roll-pitch and yaw-roll.

A 6-DOF sensor is installed on the upper plate (on which the robot stands) to measure the platform's rotation, so it can be compared with the robot's own data or used for other research purposes.
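
A possible way to perform that comparison offline is sketched below: logged orientations from the platform's 6-DOF sensor are compared against the robot's own reported angles with a per-axis RMS error. The data structures and units (degrees) are assumptions for illustration.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Roll-pitch-yaw sample in degrees (assumed representation of the logs).
struct Rpy { double roll, pitch, yaw; };

// Per-axis RMS error between the platform sensor log and the robot's own log.
Rpy rmsError(const std::vector<Rpy>& measured, const std::vector<Rpy>& reported)
{
    Rpy sum{0.0, 0.0, 0.0};
    const std::size_t n = std::min(measured.size(), reported.size());
    for (std::size_t i = 0; i < n; ++i) {
        sum.roll  += std::pow(measured[i].roll  - reported[i].roll,  2);
        sum.pitch += std::pow(measured[i].pitch - reported[i].pitch, 2);
        sum.yaw   += std::pow(measured[i].yaw   - reported[i].yaw,   2);
    }
    if (n == 0) return sum;
    return {std::sqrt(sum.roll / n), std::sqrt(sum.pitch / n), std::sqrt(sum.yaw / n)};
}
```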

So far, this platform has been used with the 2-DOF spherical robot (Agile Eye) and with NAO.


Design and Implementation of a Camera Stabilizer Control System with 6-DOF Sensors Based on the 2-DOF Agile Eye Platform

People Involved
   Alireza Safaryazdi, Omid Abolghasemi

Using camera stabilizer systems on aerial vehicles is a very important subject today, because the main missions of MAVs and UAVs include video capture, mapping, and so on.

Camera stabilizers are suitable for any kind of video recording and picture taking.

With this in mind, we decided to design a camera stabilizer system on the 2-DOF spherical robot (Agile Eye) platform.

To achieve this, we designed a system that includes a control program on a PC (written in Qt), a GUI, a transmitter board, and a receiver board.

The transmitter board includes a microcontroller, a 6-DOF sensor, and a wireless transceiver. The microcontroller reads data from the 6-DOF sensor using the I2C protocol and broadcasts it with a 2.4 GHz wireless transceiver using the SPI protocol.
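
A hypothetical Arduino-style sketch of this transmitter board is given below, assuming an MPU-6050 as the 6-DOF sensor and an nRF24L01 as the 2.4 GHz transceiver (the project description does not name the actual parts); the pins and pipe address are likewise assumed.

```cpp
#include <SPI.h>
#include <Wire.h>
#include <RF24.h>

const uint8_t  MPU_ADDR = 0x68;               // MPU-6050 I2C address (assumed sensor)
RF24           radio(9, 10);                  // nRF24L01 CE, CSN pins (assumed wiring)
const uint64_t PIPE = 0xF0F0F0F0E1LL;         // arbitrary pipe address

struct ImuPacket { int16_t ax, ay, az, gx, gy, gz; };

// Read one big-endian 16-bit word from the I2C buffer.
int16_t readWord() {
  int16_t high = Wire.read();
  int16_t low  = Wire.read();
  return (high << 8) | low;
}

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);                           // PWR_MGMT_1: wake the sensor up
  Wire.write(0);
  Wire.endTransmission();

  radio.begin();
  radio.openWritingPipe(PIPE);
  radio.stopListening();                      // transmit-only
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);                           // ACCEL_XOUT_H: start of the data block
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, (uint8_t)14);    // 6 accel + 2 temp + 6 gyro bytes

  ImuPacket p;
  p.ax = readWord();
  p.ay = readWord();
  p.az = readWord();
  readWord();                                 // skip temperature
  p.gx = readWord();
  p.gy = readWord();
  p.gz = readWord();

  radio.write(&p, sizeof(p));                 // broadcast one sample
  delay(10);                                  // ~100 Hz update rate (assumed)
}
```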

On the other side, we designed a board to receive the wireless data and send it to the controller PC. This board reads the RF signals with a wireless transceiver module and forwards the data to the controller PC over the RS-232 protocol.

On the PC, a PID controller program with a GUI was designed using multithreading to eliminate delays in the control loop, provide real-time monitoring of the end-effector's rotational state on the GUI plotter, and allow control variables to be set.
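
The structure of such a multithreaded control program might look like the sketch below, where the PID loop runs in its own thread at a fixed rate so that GUI work never delays it; the gains, loop period, and shared variables are illustrative and not taken from the actual Qt application.

```cpp
#include <atomic>
#include <chrono>
#include <thread>

std::atomic<double> setpointDeg{0.0};    // written by the GUI thread
std::atomic<double> measuredDeg{0.0};    // written by the sensor reader
std::atomic<double> commandOut{0.0};     // read by the motor driver / GUI plotter
std::atomic<bool>   running{true};

void pidLoop()
{
    constexpr double kp = 2.0, ki = 0.5, kd = 0.05;   // assumed gains
    constexpr double dt = 0.01;                        // 100 Hz loop (assumed)
    double integral = 0.0, previousError = 0.0;

    while (running.load()) {
        const double error = setpointDeg.load() - measuredDeg.load();
        integral += error * dt;
        const double derivative = (error - previousError) / dt;
        previousError = error;
        commandOut.store(kp * error + ki * integral + kd * derivative);
        std::this_thread::sleep_for(std::chrono::milliseconds(10));
    }
}

int main()
{
    std::thread control(pidLoop);            // control loop in its own thread
    // ... the GUI event loop would run here, updating setpointDeg and
    //     plotting commandOut / measuredDeg without blocking the loop ...
    std::this_thread::sleep_for(std::chrono::seconds(1));
    running = false;
    control.join();
    return 0;
}
```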


