[1]X. Huang, Q. Wang, S. Zang, J. Wan, G. Yang, Y. Huang and X. Ren, “Tracing the Motion of Finger Joints for Gesture Recognition via Sewing RGO-Coated Fibers Onto a Textile Glove,” IEEE Sensors Journal, vol. 19, no. 20, pp. 9504-9511, Oct. 15, 2019.
[2]M. Lee and J. Bae, “Deep Learning Based Real-Time Recognition of Dynamic Finger Gestures Using a Data Glove,” IEEE Access, vol. 8, pp. 219923-219933, 2020.
[3]P. Pławiak, T. Sośnicki, M. Niedźwiecki, Z. Tabor and K. Rzecki, “Hand Body Language Gesture Recognition Based on Signals from Specialized Glove and Machine Learning Algorithms,” IEEE Transactions on Industrial Informatics, vol. 12, no. 3, pp. 1104-1113, June 2016.
[4]S. Jiang, B. Lv, W. Guo, C. Zhang, H. Wang, X. Sheng and P.B. Shull, “Feasibility of Wrist-Worn, Real-Time Hand, and Surface Gesture Recognition via sEMG and IMU Sensing,” IEEE Transactions on Industrial Informatics, vol. 14, no. 8, pp. 3376-3385, Aug. 2018.
[5]C. Shen, Y. Chen, G. Yang and X. Guan, “Toward Hand-Dominated Activity Recognition Systems with Wristband-Interaction Behavior Analysis,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 50, no. 7, pp. 2501-2511, July 2020.
[6]Y. Zhang, B. Liu and Z. Liu, “Recognizing Hand Gestures with Pressure-Sensor-Based Motion Sensing,” IEEE Transactions on Biomedical Circuits and Systems, vol. 13, no. 6, pp. 1425-1436, Dec. 2019.
[7]T. Simões Dias, J. J. A. M. Júnior and S. F. Pichorim, “An Instrumented Glove for Recognition of Brazilian Sign Language Alphabet,” IEEE Sensors Journal, vol. 22, no. 3, pp. 2518-2529, Feb. 1, 2022.
[8]T. Zhao, J. Liu, Y. Wang, H. Liu and Y. Chen, “Towards Low-Cost Sign Language Gesture Recognition Leveraging Wearables,” IEEE Transactions on Mobile Computing, vol. 20, no. 4, pp. 1685-1701, April 1, 2021.
[9]H. Rewari, V. Dixit, D. Batra and N. Hema, “Automated Sign Language Interpreter,” Proc. 2018 Eleventh International Conference on Contemporary Computing (IC3), 2018, pp. 1-5.
[10]A. Sengupta, T. Mallick and A. Das, “A Cost Effective Design and Implementation of Arduino Based Sign Language Interpreter,” Proc. 2019 Devices for Integrated Circuit (DevIC), 2019, pp. 12-15.
[11]S. P. Y. Jane and S. Sasidhar, “Sign Language Interpreter: Classification of Forearm EMG and IMU Signals for Signing Exact English,” Proc. 2018 IEEE 14th International Conference on Control and Automation (ICCA), 2018, pp. 947-952.
[12]D. L. Arsenault and A. D. Whitehead, “Quaternion Based Gesture Recognition Using Worn Inertial Sensors in a Motion Tracking System,” Proc. 2014 IEEE Games Media Entertainment, 2014, pp. 1-7.
[13]Y. Liu, F. Jiang and M. Gowda, “Application Informed Motion Signal Processing for Finger Motion Tracking Using Wearable Sensors,” Proc. ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2020, pp. 8334-8338.
[14]P. Bellitti, A. De Angelis, M. Dionigi, E. Sardini, M. Serpelloni, A. Moschitta and P. Carbone, “A Wearable and Wirelessly Powered System for Multiple Finger Tracking,” IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 5, pp. 2542-2551, May 2020.
[15]Y. Chen, Y. Tsai, K. Huang and P. H. Chou, “MobiRing: A Finger-Worn Wireless Motion Tracker,” Proc. 2014 IEEE International Conference on Internet of Things (iThings), and IEEE Green Computing and Communications (GreenCom) and IEEE Cyber, Physical and Social Computing (CPSCom), 2014, pp. 316-323.
[16]H. S. Chudgar, S. Mukherjee and K. Sharma, “S Control: Accelerometer-Based Gesture Recognition for Media Control,” Proc. 2014 International Conference on Advances in Electronics Computers and Communications, 2014, pp. 1-6.
[17]J. Yao, H. Chen, Z. Xu, J. Huang, J. Li, J. Jia and H. Wu, “Development of a Wearable Electrical Impedance Tomographic Sensor for Gesture Recognition with Machine Learning,” IEEE Journal of Biomedical and Health Informatics, vol. 24, no. 6, pp. 1550-1556, June 2020.
[18]X. Liang, R. Ghannam and H. Heidari, “Wrist-Worn Gesture Sensing with Wearable Intelligence,” IEEE Sensors Journal, vol. 19, no. 3, pp. 1082-1090, Feb. 1, 2019.
[19]F.-T. Liu, Y.-T. Wang and H.-P. Ma, “Gesture Recognition with Wearable 9-Axis Sensors,” Proc. 2017 IEEE International Conference on Communications (ICC), 2017, pp. 1-6.
[20]K. S. Krishnan, A. Saha, S. Ramachandran and S. Kumar, “Recognition of Human Arm Gestures Using Myo Armband for the Game of Hand Cricket,” Proc. 2017 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS), 2017, pp. 389-394.
[21]Y.-T. Liu, Y.-A. Zhang and M. Zeng, “Novel Algorithm for Hand Gesture Recognition Utilizing a Wrist-Worn Inertial Sensor,” IEEE Sensors Journal, vol. 18, no. 24, pp. 10085-10095, Dec. 15, 2018.
[22]K. Yamagishi, L. Jing and Z. Cheng, “A System for Controlling Personal Computers by Hand Gestures Using a Wireless Sensor Device,” Proc. 2014 IEEE International Symposium on Independent Computing (ISIC), 2014, pp. 1-7.
[23]D. Moazen, S. A. Sajjadi and A. Nahapetian, “AirDraw: Leveraging Smart Watch Motion Sensors for Mobile Human Computer Interactions,” Proc. 2016 13th IEEE Annual Consumer Communications & Networking Conference (CCNC), 2016, pp. 442-446.
[24]X. Liu, J. Sacks, M. Zhang, A. G. Richardson, T. H. Lucas and J. Van der Spiegel, “The Virtual Trackpad: An Electromyography-Based, Wireless, Real-Time, Low-Power, Embedded Hand-Gesture-Recognition System Using an Event-Driven Artificial Neural Network,” IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 64, no. 11, pp. 1257-1261, Nov. 2017.
[25]S. R. Kurniawan and D. Pamungkas, “MYO Armband Sensors and Neural Network Algorithm for Controlling Hand Robot,” Proc. 2018 International Conference on Applied Engineering (ICAE), 2018, pp. 1-6.
[26]F. S. Sayin, S. Ozen and U. Baspinar, “Hand Gesture Recognition by Using sEMG Signals for Human Machine Interaction Applications,” Proc. 2018 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA), 2018, pp. 27-30.
[27]S. Hickman, A. S. Mirzakhani, J. Pabon and R. Alba-Flores, “A Case Study on Tuning Artificial Neural Networks to Recognize Signal Patterns of Hand Motions,” Proc. SoutheastCon 2015, 2015, pp. 1-4.
[28]U. Baspinar, H. Selcuk Varol and K. Yildiz, “Classification of Hand Movements by Using Artificial Neural Network,” Proc. 2012 International Symposium on Innovations in Intelligent Systems and Applications, 2012, pp. 1-4.
[29]D. Qu, Z. Huang, Z. Gao, Y. Zhao, X. Zhao and G. Song, “An Automatic System for Smile Recognition Based on CNN and Face Detection,” Proc. 2018 IEEE International Conference on Robotics and Biomimetics (ROBIO), 2018, pp. 243-247.
[30]Y. Zhou, H. Ni, F. Ren and X. Kang, “Face and Gender Recognition System Based on Convolutional Neural Networks,” Proc. 2019 IEEE International Conference on Mechatronics and Automation (ICMA), 2019, pp. 1091-1095.
[31]R. Szmurlo and S. Osowski, “Deep CNN Ensemble for Recognition of Face Images,” Proc. 2021 22nd International Conference on Computational Problems of Electrical Engineering (CPEE), 2021, pp. 1-4.
[32]A. Song, Q. Hu, X. Ding, X. Di and Z. Song, “Similar Face Recognition Using the IE-CNN Model,” IEEE Access, vol. 8, pp. 45244-45253, 2020.
[33]D. S. Breland, A. Dayal, A. Jha, P. K. Yalavarthy, O. J. Pandey and L. R. Cenkeramaddi, “Robust Hand Gestures Recognition Using a Deep CNN and Thermal Images,” IEEE Sensors Journal, vol. 21, no. 23, pp. 26602-26614, Dec. 1, 2021.
[34]G. Lingyun, Z. Lin and W. Zhaokui, “Hierarchical Attention-Based Astronaut Gesture Recognition: A Dataset and CNN Model,” IEEE Access, vol. 8, pp. 68787-68798, 2020.
[35]D. Fan, H. Lu, S. Xu and S. Cao, “Multi-Task and Multi-Modal Learning for RGB Dynamic Gesture Recognition,” IEEE Sensors Journal, vol. 21, no. 23, pp. 27026-27036, Dec. 1, 2021.
[36]F. Zhan, “Hand Gesture Recognition with Convolution Neural Networks,” Proc. 2019 IEEE 20th International Conference on Information Reuse and Integration for Data Science (IRI), 2019, pp. 295-298.
[37]Z. Liu, X. Wang, M. Su and K. Lu, “A Method to Recognize Sleeping Position Using an CNN Model Based on Human Body Pressure Image,” Proc. 2019 IEEE International Conference on Power, Intelligent Computing and Systems (ICPICS), 2019, pp. 219-224.
[38]M. Atikuzzaman, T. R. Rahman, E. Wazed, M. P. Hossain and M. Z. Islam, “Human Activity Recognition System from Different Poses with CNN,” Proc. 2020 2nd International Conference on Sustainable Technologies for Industry 4.0 (STI), 2020, pp. 1-5.
[39]M. Cui, J. Fang and Y. Zhao, “Emotion Recognition of Human Body's Posture in Open Environment,” Proc. 2020 Chinese Control and Decision Conference (CCDC), 2020, pp. 3294-3299.
[40]S. Mohsen, A. Elkaseer and S. G. Scholz, “Industry 4.0-Oriented Deep Learning Models for Human Activity Recognition,” IEEE Access, vol. 9, pp. 150508-150521, 2021.
[41]E. Martinez-Martin and M. Cazorla, “A Socially Assistive Robot for Elderly Exercise Promotion,” IEEE Access, vol. 7, pp. 75515-75529, 2019.
[42]Z. Yu and W. Q. Yan, “Human Action Recognition Using Deep Learning Methods,” Proc. 2020 35th International Conference on Image and Vision Computing New Zealand (IVCNZ), 2020, pp. 1-6.
[43]T. Nyajowi, N. Oyie and M. Ahuna, “CNN Real-Time Detection of Vandalism Using a Hybrid-LSTM Deep Learning Neural Networks,” Proc. 2021 IEEE AFRICON, 2021, pp. 1-6.
[44]D. Wang, J. Yang and Y. Zhou, “Human Action Recognition Based on Multi-Mode Spatial-Temporal Feature Fusion,” Proc. 2019 22nd International Conference on Information Fusion (FUSION), 2019, pp. 1-7.
[45]H. El-Ghaish, M. E. Hussien, A. Shoukry and R. Onai, “Human Action Recognition Based on Integrating Body Pose, Part Shape, and Motion,” IEEE Access, vol. 6, pp. 49040-49055, 2018.
[46]U. Haroon, A. Ullah, T. Hussain, W. Ullah, M. Sajjad, K. Muhammad, M.Y. Lee and S.W. Baik, “A Multi-Stream Sequence Learning Framework for Human Interaction Recognition,” IEEE Transactions on Human-Machine Systems, vol. 52, no. 3, pp. 435-444, June 2022.
[47]X. Weiyao, W. Muqing, Z. Min and X. Ting, “Fusion of Skeleton and RGB Features for RGB-D Human Action Recognition,” IEEE Sensors Journal, vol. 21, no. 17, pp. 19157-19164, Sept. 1, 2021.
[48]Y. Tang, Z. Wang, J. Lu, J. Feng and J. Zhou, “Multi-Stream Deep Neural Networks for RGB-D Egocentric Action Recognition,” IEEE Transactions on Circuits and Systems for Video Technology, vol. 29, no. 10, pp. 3001-3015, Oct. 2019.
[49]J. Yu, H. Gao, W. Yang, Y. Jiang, W. Chin, N. Kubota and Z. Ju, “A Discriminative Deep Model with Feature Fusion and Temporal Attention for Human Action Recognition,” IEEE Access, vol. 8, pp. 43243-43255, 2020.
[50]A. Kamel, B. Sheng, P. Yang, P. Li, R. Shen and D. D. Feng, “Deep Convolutional Neural Networks for Human Action Recognition Using Depth Maps and Postures,” IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 49, no. 9, pp. 1806-1819, Sept. 2019.
[51]J. He, H. Xia, C. Feng and Y. Chu, “CNN-Based Action Recognition Using Adaptive Multiscale Depth Motion Maps and Stable Joint Distance Maps,” Proc. 2018 IEEE Global Conference on Signal and Information Processing (GlobalSIP), 2018, pp. 439-443.
[52]顏承宇, “Design and Research of a Pronunciation Care System for People with Facial Abnormalities Using Video and Audio Sensing,” Master's thesis, Department of Electrical Engineering, National Formosa University, 2019.