
Friday, October 12, 2018

Download EEG Based Multi-finger Prosthesis PDF Free

EEG Based Multi-finger Prosthesis PDF
By: Harutyun Sarkisyan, California State University, Sacramento
Published in 2013


Electroencephalography (EEG) signals recorded over the motor cortex and electromyography (EMG) signals recorded from the facial muscles are used to control a prosthetic hand. The preprocessing and pattern-recognition phases of EEG signal analysis are performed by the Emotiv Software Suite. The proposed brain-computer interface system uses seven inputs to control a motorized prosthesis, ranging from EEG recorded during mental tasks to EEG and EMG recorded during actual limb movements. Imagined limb movements are detected from the brain-wave rhythm at specific locations on the scalp, and each type of detected EEG pattern is used to control a specific finger on the prosthetic hand. EMG signals are recorded from eye movements and eyebrow movements. Giving the user the ability to manipulate each finger individually allows tasks that are not possible with a body-powered prosthetic hand or a conventional EMG-based prosthetic hand.

The accuracy of individual finger manipulation was further increased by combining two different EEG/EMG inputs to control one desired output. Based on the user's EEG and EMG signals, the following were most accurately detected by the Emotiv suite: left/right eye wink, imagined left-hand movement, teeth clenching, smirking, and raising both eyebrows. While testing for EEG/EMG compatibility, the following combinations could not be produced at the same time: clenching + raising eyebrows, clenching + imagined hand movement, and winking with both eyes. The following combinations of EEG/EMG signals showed good results: left eye wink + imagined left-hand movement, right eye wink + imagined left-hand movement, clenching + left eye wink, clenching + right eye wink, clenching + look right, raising eyebrows + look right, and left smirk + right eye wink.

A hand-grabber toy from Toysmith was redesigned to fit the needs of this project, and an Arduino Uno R3 board was used to control five 180-degree servo motors.
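The pairing logic described above can be sketched as a lookup over sets of simultaneously detected patterns. This is an illustrative sketch, not code from the thesis: the event names, finger assignments, and the specific combo-to-finger mapping below are hypothetical labels chosen for the example; only the compatible/incompatible pairs themselves come from the abstract.

```python
# Hypothetical mapping from pairs of detected EEG/EMG events to a finger
# command. Finger assignments here are illustrative, not from the thesis.
COMBO_TO_FINGER = {
    frozenset({"left_wink", "imagined_left_hand"}): "index",
    frozenset({"right_wink", "imagined_left_hand"}): "middle",
    frozenset({"clench", "left_wink"}): "ring",
    frozenset({"clench", "right_wink"}): "pinky",
}

# Pairs the study reports could not be detected at the same time.
INCOMPATIBLE = {
    frozenset({"clench", "raise_eyebrows"}),
    frozenset({"clench", "imagined_left_hand"}),
    frozenset({"left_wink", "right_wink"}),
}

def resolve(active_events):
    """Return the finger to actuate for a set of active detections, or None."""
    pair = frozenset(active_events)
    if pair in INCOMPATIBLE:
        return None  # this combination cannot be produced simultaneously
    return COMBO_TO_FINGER.get(pair)

print(resolve({"clench", "left_wink"}))      # a usable pair -> one finger
print(resolve({"left_wink", "right_wink"}))  # an incompatible pair -> None
```

Requiring two simultaneous detections before actuating a finger is what raises accuracy here: a single spurious detection is not enough to trigger movement.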
The EmoKey software was used to send the detected EEG patterns from the Emotiv software to the Arduino serial monitor window. One possible future study would be to replicate this project with a 32- or 64-channel EEG headset and compare the accuracy of controlling each individual finger. A 32/64-electrode EEG headset would help detect multiple imagined tasks more accurately, thus providing additional input commands for controlling the prosthesis.
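EmoKey works by emitting keystrokes when a detection fires, so the host-to-Arduino link described above amounts to translating one character per pattern into a servo command. The following sketch simulates that translation on the host side; the command characters, servo indices, and toggle behavior are assumptions for illustration, not the thesis's actual protocol.

```python
# Hypothetical EmoKey-character -> servo-command translation. One character
# per detected pattern; each of the five fingers toggles open/closed.
FINGER_SERVOS = {"t": 0, "i": 1, "m": 2, "r": 3, "p": 4}  # thumb..pinky
OPEN_ANGLE, CLOSED_ANGLE = 0, 180  # full travel of a 180-degree servo

def command_for(char, closed_state):
    """Translate one EmoKey character into a (servo, target_angle) pair,
    toggling that finger between open and closed. Returns None for
    characters that are not mapped to a finger."""
    servo = FINGER_SERVOS.get(char)
    if servo is None:
        return None
    angle = OPEN_ANGLE if closed_state.get(servo) else CLOSED_ANGLE
    closed_state[servo] = not closed_state.get(servo, False)
    return servo, angle

state = {}
print(command_for("i", state))  # first detection closes the index finger
print(command_for("i", state))  # the next one opens it again
```

On the Arduino side, the equivalent step would read the character from the serial port and call the Servo library's `write()` with the chosen angle; the toggle state would live in the sketch rather than on the host.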

This book was ranked 35th by Google Books for the keyword "Prosthesis."

Google Books ID: dwZ0ngEACAAJ (ETag "Q9ogmH3IOyY")
Author: Harutyun Sarkisyan, California State University, Sacramento
Published: 2013 (ISBN-13 and ISBN-10 not listed)
Length: 50 pages, printed, category BOOK
Language: English (en)
Reading mode: text false, image false
eBook availability: PDF false, ePub false

