Gesture recognition with application to human-robot interaction
| dc.contributor.advisor | Nicolls, Fred | en_ZA |
| dc.contributor.advisor | Senekal, F | en_ZA |
| dc.contributor.author | Mangera, Ra'eesah | en_ZA |
| dc.date.accessioned | 2015-08-14T14:27:15Z | |
| dc.date.available | 2015-08-14T14:27:15Z | |
| dc.date.issued | 2015 | en_ZA |
| dc.description.abstract | Gestures are a natural form of communication, often transcending language barriers. Recently, much research has focused on achieving natural human-machine interaction using gestures. This dissertation presents the design of a gestural interface that can be used to control a robot. The system consists of two modes: far-mode and near-mode. In far-mode interaction, upper-body gestures are used to control the motion of a robot. Near-mode interaction uses static hand poses to control a graphical user interface. For upper-body gesture recognition, features are extracted from skeletal data. The extracted features consist of joint angles and relative joint positions, extracted for each frame of the gesture sequence. A novel key-frame selection algorithm is used to align the gesture sequences temporally. A neural network and a hidden Markov model are then used to classify the gestures. The framework was tested on three different datasets: the CMU Military dataset (3 users, 15 gestures, 10 repetitions per gesture), the VisApp2013 dataset (28 users, 8 gestures, 1 repetition per gesture) and a recorded dataset (15 users, 10 gestures, 3 repetitions per gesture). The system is shown to achieve a recognition rate of 100% across the three datasets, using key-frame selection and a neural network for gesture identification. Static hand-gesture recognition is achieved by first retrieving the 24-DOF hand model. The hand is segmented from the image using both depth and colour information. A novel calibration method is then used to automatically obtain the anthropometric measurements of the user’s hand. The k-curvature algorithm, together with depth-based and parallel border-based methods, is used to detect fingertips in the image. An average detection accuracy of 88% is achieved. A neural network and a k-means classifier are then used to classify the static hand gestures. The framework was tested on a dataset of 15 users, 12 gestures and 3 repetitions per gesture. A correct classification rate of 75% is achieved using the neural network. It is shown that the proposed system is robust to changes in skin colour and user hand size. | en_ZA |
| dc.identifier.apacitation | Mangera, R. (2015). <i>Gesture recognition with application to human-robot interaction</i>. (Thesis). University of Cape Town, Faculty of Engineering & the Built Environment, Department of Electrical Engineering. Retrieved from http://hdl.handle.net/11427/13732 | en_ZA |
| dc.identifier.chicagocitation | Mangera, Ra'eesah. <i>"Gesture recognition with application to human-robot interaction."</i> Thesis., University of Cape Town, Faculty of Engineering & the Built Environment, Department of Electrical Engineering, 2015. http://hdl.handle.net/11427/13732 | en_ZA |
| dc.identifier.citation | Mangera, R. 2015. Gesture recognition with application to human-robot interaction. University of Cape Town. | en_ZA |
| dc.identifier.ris | TY - Thesis / Dissertation AU - Mangera, Ra'eesah AB - Gestures are a natural form of communication, often transcending language barriers. Recently, much research has focused on achieving natural human-machine interaction using gestures. This dissertation presents the design of a gestural interface that can be used to control a robot. The system consists of two modes: far-mode and near-mode. In far-mode interaction, upper-body gestures are used to control the motion of a robot. Near-mode interaction uses static hand poses to control a graphical user interface. For upper-body gesture recognition, features are extracted from skeletal data. The extracted features consist of joint angles and relative joint positions, extracted for each frame of the gesture sequence. A novel key-frame selection algorithm is used to align the gesture sequences temporally. A neural network and a hidden Markov model are then used to classify the gestures. The framework was tested on three different datasets: the CMU Military dataset (3 users, 15 gestures, 10 repetitions per gesture), the VisApp2013 dataset (28 users, 8 gestures, 1 repetition per gesture) and a recorded dataset (15 users, 10 gestures, 3 repetitions per gesture). The system is shown to achieve a recognition rate of 100% across the three datasets, using key-frame selection and a neural network for gesture identification. Static hand-gesture recognition is achieved by first retrieving the 24-DOF hand model. The hand is segmented from the image using both depth and colour information. A novel calibration method is then used to automatically obtain the anthropometric measurements of the user’s hand. The k-curvature algorithm, together with depth-based and parallel border-based methods, is used to detect fingertips in the image. An average detection accuracy of 88% is achieved. A neural network and a k-means classifier are then used to classify the static hand gestures. The framework was tested on a dataset of 15 users, 12 gestures and 3 repetitions per gesture. A correct classification rate of 75% is achieved using the neural network. It is shown that the proposed system is robust to changes in skin colour and user hand size. DA - 2015 DB - OpenUCT DP - University of Cape Town LK - https://open.uct.ac.za PB - University of Cape Town PY - 2015 T1 - Gesture recognition with application to human-robot interaction TI - Gesture recognition with application to human-robot interaction UR - http://hdl.handle.net/11427/13732 ER - | en_ZA |
| dc.identifier.uri | http://hdl.handle.net/11427/13732 | |
| dc.identifier.vancouvercitation | Mangera R. Gesture recognition with application to human-robot interaction. [Thesis]. University of Cape Town, Faculty of Engineering & the Built Environment, Department of Electrical Engineering, 2015 [cited yyyy month dd]. Available from: http://hdl.handle.net/11427/13732 | en_ZA |
| dc.language.iso | eng | en_ZA |
| dc.publisher.department | Department of Electrical Engineering | en_ZA |
| dc.publisher.faculty | Faculty of Engineering and the Built Environment | |
| dc.publisher.institution | University of Cape Town | |
| dc.subject.other | Electrical Engineering | en_ZA |
| dc.title | Gesture recognition with application to human-robot interaction | en_ZA |
| dc.type | Master Thesis | |
| dc.type.qualificationlevel | Masters | |
| dc.type.qualificationname | MSc (Eng) | en_ZA |
| uct.type.filetype | Text | |
| uct.type.filetype | Image | |
| uct.type.publication | Research | en_ZA |
| uct.type.resource | Thesis | en_ZA |
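The abstract names the k-curvature algorithm as one of the fingertip-detection methods. As an illustrative sketch only (not the author's code; the function name, parameters, and threshold are assumptions), the core idea — flagging contour points where the angle between the vectors to the points k steps back and k steps forward along the boundary is sharp — can be written as:

```python
import numpy as np

def k_curvature_fingertips(contour, k=5, angle_thresh_deg=60.0):
    """Flag contour points whose k-curvature angle is sharper than the
    threshold -- candidate fingertips on a hand silhouette.

    contour: (N, 2) array of ordered points on a closed boundary.
    k: offset along the contour used to form the two vectors.
    angle_thresh_deg: keep points whose included angle is below this.
    Note: a practical detector would also filter out concave points
    (the valleys between fingers), which this sketch omits.
    """
    contour = np.asarray(contour, dtype=float)
    n = len(contour)
    tips = []
    for i in range(n):
        p = contour[i]
        v1 = contour[(i - k) % n] - p   # vector back along the contour
        v2 = contour[(i + k) % n] - p   # vector forward along the contour
        denom = np.linalg.norm(v1) * np.linalg.norm(v2)
        if denom == 0:
            continue
        cos_a = np.clip(np.dot(v1, v2) / denom, -1.0, 1.0)
        if np.degrees(np.arccos(cos_a)) < angle_thresh_deg:
            tips.append(i)
    return tips

# Smooth boundary points yield wide angles and are ignored; only a
# sharp protrusion (like a fingertip) falls under the threshold.
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.stack([50 * np.cos(theta), 50 * np.sin(theta)], axis=1)
circle[0] *= 3.0  # push one point out to radius 150: a sharp spike
print(k_curvature_fingertips(circle))  # only the spike is detected
```

In practice the contour would come from the segmented hand mask (e.g. via a border-following routine), and k and the angle threshold would be tuned to the sampling density of the boundary.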
Files
Original bundle
1 - 1 of 1
- Name: thesis_ebe_2015_mangera_r.pdf
- Size: 2.61 MB
- Format: Adobe Portable Document Format