Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning
| dc.contributor.advisor | Abdul Gaffar, Mohammed Yunus | |
| dc.contributor.advisor | Son, Jarryd | |
| dc.contributor.author | Lin, Chiao-Shing | |
| dc.date.accessioned | 2022-03-04T07:52:48Z | |
| dc.date.available | 2022-03-04T07:52:48Z | |
| dc.date.issued | 2021 | |
| dc.date.updated | 2022-03-03T12:38:39Z | |
| dc.description.abstract | The space of hand gesture recognition using radar and sonar is dominated mostly by radar applications. In addition, the machine learning algorithms used by these systems are typically based on convolutional neural networks, with some applications exploring the use of long short-term memory networks. The goal of this study was to design and build a sonar system that can classify hand gestures using a machine learning approach. Secondly, the study aims to compare convolutional neural networks to long short-term memory networks as a means of classifying hand gestures using sonar. A Doppler sonar system was designed and built to sense hand gestures. The sonar system is a multi-static system containing one transmitter and three receivers, and it measures the Doppler frequency shifts caused by dynamic hand gestures. Since the system uses three receivers, three different Doppler frequency channels are measured. Three additional differential frequency channels are formed by computing the pairwise differences between the frequencies measured at the receivers. These six channels are used as inputs to the deep learning models. Two different deep learning algorithms were used to classify the hand gestures: a Doppler biLSTM network [1] and a CNN [2]. Six basic hand gestures, two along each of the x-, y-, and z-axes, and two rotational hand gestures were recorded using both the left and right hands at different distances. Ten-fold cross-validation was used to evaluate the networks' performance and classification accuracy. The LSTM was able to classify the six basic gestures with an accuracy of at least 96%, but with the addition of the two rotational gestures, the accuracy drops to 47%. This result is acceptable since the basic gestures are more commonly used than rotational gestures. The CNN was able to classify all the gestures with an accuracy of at least 98%. Additionally, the LSTM network is able to classify separate left- and right-hand gestures with an accuracy of 80%, and the CNN with an accuracy of 83%. The study shows that the CNN is the more suitable algorithm for hand gesture recognition, as it can consistently classify gestures with varying degrees of complexity. The study also shows that the LSTM network can classify hand gestures with a high degree of accuracy; more experimentation, however, is needed to increase the complexity of recognisable gestures. | |
| dc.identifier.apacitation | Lin, C. (2021). <i>Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning</i>. Faculty of Engineering and the Built Environment, Department of Electrical Engineering. Retrieved from http://hdl.handle.net/11427/35900 | en_ZA |
| dc.identifier.chicagocitation | Lin, Chiao-Shing. <i>"Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning."</i> Faculty of Engineering and the Built Environment, Department of Electrical Engineering, 2021. http://hdl.handle.net/11427/35900 | en_ZA |
| dc.identifier.citation | Lin, C. 2021. Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning. Faculty of Engineering and the Built Environment, Department of Electrical Engineering. http://hdl.handle.net/11427/35900 | en_ZA |
| dc.identifier.ris | TY - Master Thesis AU - Lin, Chiao-Shing AB - The space of hand gesture recognition using radar and sonar is dominated mostly by radar applications. In addition, the machine learning algorithms used by these systems are typically based on convolutional neural networks, with some applications exploring the use of long short-term memory networks. The goal of this study was to design and build a sonar system that can classify hand gestures using a machine learning approach. Secondly, the study aims to compare convolutional neural networks to long short-term memory networks as a means of classifying hand gestures using sonar. A Doppler sonar system was designed and built to sense hand gestures. The sonar system is a multi-static system containing one transmitter and three receivers, and it measures the Doppler frequency shifts caused by dynamic hand gestures. Since the system uses three receivers, three different Doppler frequency channels are measured. Three additional differential frequency channels are formed by computing the pairwise differences between the frequencies measured at the receivers. These six channels are used as inputs to the deep learning models. Two different deep learning algorithms were used to classify the hand gestures: a Doppler biLSTM network [1] and a CNN [2]. Six basic hand gestures, two along each of the x-, y-, and z-axes, and two rotational hand gestures were recorded using both the left and right hands at different distances. Ten-fold cross-validation was used to evaluate the networks' performance and classification accuracy. The LSTM was able to classify the six basic gestures with an accuracy of at least 96%, but with the addition of the two rotational gestures, the accuracy drops to 47%. This result is acceptable since the basic gestures are more commonly used than rotational gestures. The CNN was able to classify all the gestures with an accuracy of at least 98%. Additionally, the LSTM network is able to classify separate left- and right-hand gestures with an accuracy of 80%, and the CNN with an accuracy of 83%. The study shows that the CNN is the more suitable algorithm for hand gesture recognition, as it can consistently classify gestures with varying degrees of complexity. The study also shows that the LSTM network can classify hand gestures with a high degree of accuracy; more experimentation, however, is needed to increase the complexity of recognisable gestures. DA - 2021 DB - OpenUCT DP - University of Cape Town KW - Sonar KW - hand gesture recognition KW - LSTM KW - CNN KW - human-computer interaction LK - https://open.uct.ac.za PY - 2021 T1 - Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning TI - Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning UR - http://hdl.handle.net/11427/35900 ER - | en_ZA |
| dc.identifier.uri | http://hdl.handle.net/11427/35900 | |
| dc.identifier.vancouvercitation | Lin C. Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning. Faculty of Engineering and the Built Environment, Department of Electrical Engineering; 2021 [cited yyyy month dd]. Available from: http://hdl.handle.net/11427/35900 | en_ZA |
| dc.language.rfc3066 | eng | |
| dc.publisher.department | Department of Electrical Engineering | |
| dc.publisher.faculty | Faculty of Engineering and the Built Environment | |
| dc.subject | Sonar | |
| dc.subject | hand gesture recognition | |
| dc.subject | LSTM | |
| dc.subject | CNN | |
| dc.subject | human-computer interaction | |
| dc.title | Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning | |
| dc.type | Master Thesis | |
| dc.type.qualificationlevel | Masters | |
| dc.type.qualificationname | MSc | |