A Novel Approach of Prosthetic Arm Control using Computer Vision, Biosignals, and Motion Capture
Harold Martin1, Jaime Donaw1, Robert Kelly2, YoungJin Jung3, *Jong-Hoon Kim1
1School of Computing and Information Sciences, Florida International University, Miami, FL, USA
2School of Electrical Engineering and Computer Science, Louisiana State University, Baton Rouge, LA, USA
3Center of Advanced Rehabilitation/Research and Education, Nicole Wertheim College of Nursing & Health Sciences, Florida International University, Miami, FL, USA
Abstract— Modern-day prosthetics are traditionally controlled using electromyography (EMG) readings, which allow the user to control only a limited number of degrees of freedom at one time. This creates a serious disadvantage compared to a biological arm because it constrains the fluid motion and dynamic functionality of the device. We present a novel architecture for controlling a transhumeral prosthetic device through the combination of several techniques, namely computer vision algorithms operating on eye-gaze data, traditional prosthetic control methods, and the operator’s motion capture data. This sensor fusion allows the prosthetic device to locate both itself and objects of interest in a 3D environment. Moreover, this architecture enables more seamless motion and more intuitive control of the prosthetic device. In this paper, we demonstrate the feasibility of this architecture and its implementation with a prototype.