Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning

dc.contributor.advisor: Abdul Gaffar, Mohammed Yunus
dc.contributor.advisor: Son, Jarryd
dc.contributor.author: Lin, Chiao-Shing
dc.date.accessioned: 2022-03-04T07:52:48Z
dc.date.available: 2022-03-04T07:52:48Z
dc.date.issued: 2021
dc.date.updated: 2022-03-03T12:38:39Z
dc.description.abstract: The field of hand gesture recognition using radar and sonar is dominated mostly by radar applications. In addition, the machine learning algorithms used by these systems are typically based on convolutional neural networks (CNNs), with some applications exploring the use of long short-term memory (LSTM) networks. The first goal of this study was to design and build a sonar system that can classify hand gestures using a machine learning approach. Secondly, the study aimed to compare convolutional neural networks with long short-term memory networks as a means of classifying hand gestures using sonar. A Doppler sonar system was designed and built to sense hand gestures. The sonar system is a multi-static system containing one transmitter and three receivers, and it measures the Doppler frequency shifts caused by dynamic hand gestures. Since the system uses three receivers, three different Doppler frequency channels are measured. Three additional differential frequency channels are formed by computing the differences between the frequencies measured by each pair of receivers. These six channels are used as inputs to the deep learning models. Two different deep learning algorithms were used to classify the hand gestures: a Doppler biLSTM network [1] and a CNN [2]. Six basic hand gestures, two along each of the x-, y- and z-axes, and two rotational hand gestures were recorded with both the left and right hands at different distances. Ten-fold cross-validation was used to evaluate the networks' performance and classification accuracy. The LSTM was able to classify the six basic gestures with an accuracy of at least 96%, but with the addition of the two rotational gestures the accuracy dropped to 47%. This result is acceptable since the basic gestures are more commonly used than rotational gestures. The CNN was able to classify all the gestures with an accuracy of at least 98%. Additionally, the LSTM network was able to classify left- and right-hand gestures separately with an accuracy of 80%, and the CNN with an accuracy of 83%. The study shows why the CNN is the most widely used algorithm for hand gesture recognition: it can consistently classify gestures of varying complexity. The study also shows that the LSTM network can classify hand gestures with a high degree of accuracy. More experimentation, however, is needed to increase the complexity of the gestures that can be recognised.
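To make the data flow described in the abstract concrete, the sketch below shows, in Python, how the three per-receiver Doppler channels could be combined with their three pairwise differences into the six-channel input and evaluated with ten-fold cross-validation. This is a minimal illustration rather than the thesis's actual code: the array names, shapes, number of recordings, and the use of NumPy and scikit-learn are assumptions, and the classifier itself (the Doppler biLSTM [1] or CNN [2]) is left as a placeholder.

import numpy as np
from sklearn.model_selection import KFold

def make_six_channel_input(f_rx):
    # f_rx: (T, 3) array of Doppler frequency shifts, one column per receiver.
    # Returns a (T, 6) array: the three measured channels followed by the
    # three pairwise differential channels (rx1-rx2, rx1-rx3, rx2-rx3).
    diffs = np.stack([f_rx[:, 0] - f_rx[:, 1],
                      f_rx[:, 0] - f_rx[:, 2],
                      f_rx[:, 1] - f_rx[:, 2]], axis=1)
    return np.concatenate([f_rx, diffs], axis=1)

# Hypothetical dataset: 800 recordings of 128 time steps x 3 receiver channels,
# with 8 gesture classes (6 basic + 2 rotational).
rng = np.random.default_rng(0)
recordings = rng.standard_normal((800, 128, 3))
labels = rng.integers(0, 8, size=800)

X = np.stack([make_six_channel_input(r) for r in recordings])  # (800, 128, 6)

# Ten-fold cross-validation, as used in the study to evaluate both networks.
kfold = KFold(n_splits=10, shuffle=True, random_state=0)
for fold, (train_idx, test_idx) in enumerate(kfold.split(X)):
    X_train, y_train = X[train_idx], labels[train_idx]
    X_test, y_test = X[test_idx], labels[test_idx]
    # A biLSTM or CNN classifier would be trained on (X_train, y_train)
    # and scored on (X_test, y_test) here.
    print(f"fold {fold}: {len(train_idx)} train / {len(test_idx)} test recordings")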
dc.identifier.apacitation: Lin, C. (2021). <i>Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning</i>. (MSc). University of Cape Town, Faculty of Engineering and the Built Environment, Department of Electrical Engineering. Retrieved from http://hdl.handle.net/11427/35900
dc.identifier.chicagocitation: Lin, Chiao-Shing. <i>"Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning."</i> MSc thesis, University of Cape Town, Faculty of Engineering and the Built Environment, Department of Electrical Engineering, 2021. http://hdl.handle.net/11427/35900
dc.identifier.citation: Lin, C. 2021. Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning. MSc thesis. University of Cape Town, Faculty of Engineering and the Built Environment, Department of Electrical Engineering. http://hdl.handle.net/11427/35900
dc.identifier.uri: http://hdl.handle.net/11427/35900
dc.identifier.vancouvercitation: Lin C. Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning [MSc thesis]. University of Cape Town, Faculty of Engineering and the Built Environment, Department of Electrical Engineering; 2021. Available from: http://hdl.handle.net/11427/35900
dc.language.rfc3066: eng
dc.publisher.department: Department of Electrical Engineering
dc.publisher.faculty: Faculty of Engineering and the Built Environment
dc.subject: Sonar
dc.subject: hand gesture recognition
dc.subject: LSTM
dc.subject: CNN
dc.subject: human-computer interaction
dc.title: Dynamic Hand Gesture Recognition Using Ultrasonic Sonar Sensors and Deep Learning
dc.type: Master Thesis
dc.type.qualificationlevel: Masters
dc.type.qualificationname: MSc
Files
Original bundle
Name: thesis_ebe_2021_lin chiao shing.pdf
Size: 11.84 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 0 B
Format: Item-specific license agreed upon to submission