International Journal of Research and Scientific Innovation (IJRSI) | Volume V, Issue III, March 2018 | ISSN 2321–2705

Human Arm Simulation Using Kinect

Nikunj Agarwal1, Priya Bajaj2, Jayesh Pal3, Piyush Kushwaha4

  1, 2, 3, 4 Student, Computer Science & Engineering Department, IMS Engineering College, Ghaziabad, Uttar Pradesh, India

Abstract— In this work, a mechanical arm system is implemented that uses Kinect gesture recognition as the interface between a human and a robotic arm. The movement of the human arm in 3D space is captured with the Kinect, processed in MATLAB, and reproduced on the mechanical arm by servo motors driven by a microcontroller.
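
The following MATLAB fragment is a minimal sketch of the capture-process-actuate pipeline described above, not the authors' implementation. It assumes the Image Acquisition Toolbox Kinect adaptor for skeleton capture; the Kinect v1 joint indices and the one-byte serial protocol to the microcontroller are illustrative assumptions.

% Sketch: read one skeleton frame from the Kinect depth stream, compute the
% right-elbow angle, and send it as a servo command over a serial link.
depthVid = videoinput('kinect', 2, 'Depth_640x480');    % device 2 = depth sensor
src = getselectedsource(depthVid);
src.TrackingMode = 'Skeleton';                           % enable skeleton tracking
depthVid.FramesPerTrigger = 1;
start(depthVid);
[~, ~, metaData] = getdata(depthVid);                    % metadata carries joint data

if any(metaData.IsSkeletonTracked)
    skel = find(metaData.IsSkeletonTracked, 1);
    joints = metaData.JointWorldCoordinates(:, :, skel); % 20x3 matrix, metres

    % Assumed Kinect v1 joint indices: 9 = right shoulder, 10 = right elbow,
    % 11 = right wrist.
    v1 = joints(9,:)  - joints(10,:);                    % elbow -> shoulder
    v2 = joints(11,:) - joints(10,:);                    % elbow -> wrist
    elbowAngle = acosd(dot(v1, v2) / (norm(v1) * norm(v2)));

    % Send the 0-180 degree angle as a single byte to the microcontroller
    % (hypothetical protocol, port name is an assumption).
    s = serialport('COM3', 9600);
    write(s, uint8(round(elbowAngle)), 'uint8');
    clear s
end
stop(depthVid);

On the microcontroller side, the received byte would be mapped to the corresponding servo pulse width; the same pattern extends to the shoulder and wrist joints.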

Keywords— Gesture Recognition, Simulink, Kinect Sensor SDK, Image Processing

I. INTRODUCTION

Computers play a crucial role in our lives, and today we have become dependent on technology. The idea for this work comes from applications in areas such as medicine, where human presence can contaminate a production process and a device must therefore be controlled remotely. It is also inspired by real-time controlled systems such as robotic arms, medical and adaptive robots, home automation systems, and marker-based gesture recognition systems. The main motive is to explore the capabilities of the Kinect and to test the accuracy of its human skeleton recognition system.

The long-term goal of this project is to develop a remotely operated robotic arm, controlled through the Kinect, with the two remote systems connected over a network (the Internet). Such a system requires a human-friendly interface that transfers guidance information and other commands. In the field of gesture recognition and robotics, many studies have examined gestures as an ideal communication interface in the Human-Robot Interaction (HRI) context.