Detailed Information

Cited 18 times in Web of Science · Cited 25 times in Scopus

Smart Healthcare Hand Gesture Recognition Using CNN-Based Detector and Deep Belief Network (open access)

Authors
Alonazi, Mohammed; Ansar, Hira; Mudawi, Naif Al; Alotaibi, Saud S.; Almujally, Nouf Abdullah; Alazeb, Abdulwahab; Jalal, Ahmad; Kim, Jaekwang; Min, Moohong
Issue Date
2023
Publisher
Institute of Electrical and Electronics Engineers Inc.
Keywords
Computational modeling; Convolution; Convolutional neural networks; Deep belief network; Evidence theory; Feature extraction; Fuzzy logic; Gesture recognition; Hand detection and tracking; Neural gas; Support vector machines; Thermal locomotion mapping; Videos
Citation
IEEE Access, v.11, pp.1 - 1
Indexed
SCIE
SCOPUS
Journal Title
IEEE Access
Volume
11
Start Page
1
End Page
1
URI
https://scholarx.skku.edu/handle/2021.sw.skku/107021
DOI
10.1109/ACCESS.2023.3289389
ISSN
2169-3536
Abstract
Gesture recognition in dynamic images is a challenging problem in computer vision, automation, and the medical field. Hand gesture tracking and recognition must maintain symmetry between human and computer in the real world. With advances in sensor technology, numerous researchers have recently proposed RGB gesture recognition techniques. In this paper, we introduce a reliable hand gesture tracking and recognition model that remains accurate even in complex environments and can track and recognise dynamic RGB gestures. First, videos are converted into frames. After light-intensity adjustment and noise removal, the images are passed through a CNN for hand detection and extraction. Features are then extracted from the full detected hand: neural gas and locomotion thermal mapping features are combined to form the feature vector. The feature vector is passed through a fuzzy optimiser to reduce uncertainty and fuzziness, and the optimised features are classified by a Deep Belief Network (DBN). The EgoGesture and Jester datasets are used to validate the proposed system. Experimental results over the EgoGesture and Jester datasets demonstrate overall accuracies of 90.73% and 89.33%, respectively. The experiments confirm the reliability of our system and the suitability of the proposed model compared with other state-of-the-art models.
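The abstract describes a multi-stage pipeline: frame extraction, intensity adjustment and denoising, CNN-based hand detection, feature extraction, fuzzy optimisation, and DBN classification. The skeleton below is a minimal illustrative sketch of that flow, not the paper's implementation: it assumes toy stand-ins (gamma correction for the preprocessing step, an intensity histogram in place of the neural-gas/thermal-mapping features, and a simple triangular membership function for the fuzzy optimiser); the CNN detector and DBN classifier are omitted entirely.

```python
import numpy as np

def adjust_intensity(frame, gamma=0.8):
    """Gamma correction standing in for the light-intensity adjustment step."""
    norm = frame.astype(np.float64) / 255.0
    return (np.power(norm, gamma) * 255.0).astype(np.uint8)

def extract_features(hand_patch, n_features=16):
    """Placeholder for the neural-gas / thermal-mapping feature stage:
    a coarse, normalised intensity histogram over the detected hand region."""
    hist, _ = np.histogram(hand_patch, bins=n_features, range=(0, 255))
    return hist / max(hist.sum(), 1)

def fuzzy_optimise(features, cut=0.1):
    """Toy fuzzy optimiser: triangular membership on min-max-normalised
    features; components below the membership cut are zeroed out."""
    f = (features - features.min()) / (np.ptp(features) + 1e-9)
    membership = 1.0 - np.abs(2.0 * f - 1.0)  # peaks at 0.5, zero at extremes
    return np.where(membership >= cut, features, 0.0)

# Toy "frame": a horizontal gradient standing in for one video frame.
frame = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
frame = adjust_intensity(frame)
feats = extract_features(frame)        # hand-detection (CNN) step omitted
optimised = fuzzy_optimise(feats)      # would then be fed to the classifier
print(optimised.shape)
```

The optimised feature vector would, in the full system, be passed to the trained DBN for gesture classification.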
Files in This Item
There are no files associated with this item.
Appears in
Collections
Computing and Informatics > Convergence > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher


KIM, JAEKWANG
Computing and Informatics (Convergence)
