{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,12,21]],"date-time":"2025-12-21T07:11:58Z","timestamp":1766301118497,"version":"3.41.2"},"reference-count":37,"publisher":"Cambridge University Press (CUP)","issue":"3","license":[{"start":{"date-parts":[[2025,1,21]],"date-time":"2025-01-21T00:00:00Z","timestamp":1737417600000},"content-version":"unspecified","delay-in-days":0,"URL":"https:\/\/www.cambridge.org\/core\/terms"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Robotica"],"published-print":{"date-parts":[[2025,3]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>Traditional bulky and complex control devices such as remote controllers and ground stations cannot meet the requirements of fast and flexible control of unmanned aerial vehicles (UAVs) in complex environments. Therefore, a data glove based on multi-sensor fusion is designed in this paper. To achieve gesture control of UAVs, the method accurately recognizes various gestures and converts them into corresponding UAV control commands. First, the wireless data glove fuses flexible fiber optic sensors and inertial sensors to construct a gesture dataset. Then, the trained neural network model is deployed to the STM32 microcontroller-based data glove for real-time gesture recognition, in which the convolutional neural network-attention mechanism (CNN-Attention) network is used for static gesture recognition, and the convolutional neural network-bidirectional long short-term memory (CNN-Bi-LSTM) network is used for dynamic gesture recognition. Finally, the gestures are converted into control commands and sent to the vehicle terminal to control the UAV. 
In UAV simulation tests on the simulation platform, the average recognition accuracy reaches 99.7% for 32 static gestures and 99.9% for 13 dynamic gestures, indicating that the system\u2019s gesture recognition performance is excellent. Task tests in a scene constructed in a real environment show that the UAV responds to gestures quickly and that the method proposed in this paper achieves real-time, stable control of the UAV on the terminal side.<\/jats:p>","DOI":"10.1017\/s0263574724002194","type":"journal-article","created":{"date-parts":[[2025,1,21]],"date-time":"2025-01-21T02:34:34Z","timestamp":1737426874000},"page":"816-850","source":"Crossref","is-referenced-by-count":2,"title":["Wearable gesture control design for unmanned aerial vehicle based on multi-sensor fusion"],"prefix":"10.1017","volume":"43","author":[{"ORCID":"https:\/\/orcid.org\/0009-0004-3348-4475","authenticated-orcid":false,"given":"Guang","family":"Liu","sequence":"first","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0009-0005-4491-9902","authenticated-orcid":false,"given":"Yang","family":"Liu","sequence":"additional","affiliation":[]},{"given":"Shurui","family":"Fan","sequence":"additional","affiliation":[]},{"given":"Weijia","family":"Cui","sequence":"additional","affiliation":[]},{"given":"Kewen","family":"Xia","sequence":"additional","affiliation":[]},{"given":"Li","family":"Wang","sequence":"additional","affiliation":[]}],"member":"56","published-online":{"date-parts":[[2025,1,21]]},"reference":[{"doi-asserted-by":"publisher","key":"S0263574724002194_ref11","DOI":"10.1109\/ACCESS.2021.3129650"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref10","DOI":"10.3390\/s20123571"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref9","DOI":"10.1109\/ACCESS.2019.2936034"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref23","DOI":"10.3390\/s23031672"},{"doi-asserted-by":"publisher","key":"S0263574724002
194_ref14","DOI":"10.1017\/S026357472000003X"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref3","DOI":"10.3390\/s18072208"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref13","DOI":"10.1109\/ACCESS.2020.3039401"},{"key":"S0263574724002194_ref28","first-page":"1","article-title":"An interactive astronaut-robot system with gesture control","volume":"2016","author":"Liu","year":"2016","journal-title":"Comput Intel Neurosc"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref1","DOI":"10.1017\/S0263574721000370"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref6","DOI":"10.1109\/JSEN.2021.3122236"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref5","DOI":"10.3390\/s24102980"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref12","DOI":"10.3390\/jimaging8040098"},{"key":"S0263574724002194_ref15","first-page":"1","volume-title":"IEEE Transactions On Neural Networks and Learning Systems","author":"Antillon","year":"2022"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref27","DOI":"10.1017\/S0263574723000309"},{"key":"S0263574724002194_ref29","first-page":"157","article-title":"A novel SE-CNN attention architecture for sEMG-based hand gesture recognition","volume":"134","author":"Xu","year":"2023","journal-title":"Comput Model Eng Sci"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref32","DOI":"10.1109\/ACCESS.2022.3142918"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref18","DOI":"10.3390\/electronics9060905"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref4","DOI":"10.3390\/drones7020089"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref35","DOI":"10.1109\/ACCESS.2020.2979735"},{"doi-asserted-by":"crossref","unstructured":"[33] Zhang, P. F., Xue, J. R., Lan, C. L., Zeng, W. J., Gao, Z. N. and Zheng, N. N., \u201cAdding Attentiveness to the Neurons in Recurrent Neural Networks,\u201d In: Computer Vision (2018) pp. 
136\u2013152.","key":"S0263574724002194_ref33","DOI":"10.1007\/978-3-030-01240-3_9"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref30","DOI":"10.1109\/ACCESS.2023.3254537"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref24","DOI":"10.3390\/s23031096"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref36","DOI":"10.3390\/s22218410"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref2","DOI":"10.1109\/MAES.2022.3220725"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref37","DOI":"10.1109\/TNNLS.2013.2277540"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref31","DOI":"10.3390\/electronics10141685"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref22","DOI":"10.3390\/s21216948"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref16","DOI":"10.1016\/j.neucom.2017.02.101"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref25","DOI":"10.3390\/s21124204"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref20","DOI":"10.3390\/mi12040362"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref19","DOI":"10.1109\/TNSRE.2022.3162416"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref8","DOI":"10.3390\/s20041074"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref26","DOI":"10.3390\/math10244753"},{"key":"S0263574724002194_ref34","article-title":"A gesture recognition method based on MIC-attention-LSTM","volume":"13","author":"Hu","year":"2023","journal-title":"Hum-Centric Comput Inform 
Sci"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref21","DOI":"10.1109\/JSEN.2017.2776262"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref7","DOI":"10.1109\/ACCESS.2022.3199358"},{"doi-asserted-by":"publisher","key":"S0263574724002194_ref17","DOI":"10.3390\/s141224483"}],"container-title":["Robotica"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.cambridge.org\/core\/services\/aop-cambridge-core\/content\/view\/S0263574724002194","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,7,28]],"date-time":"2025-07-28T07:50:57Z","timestamp":1753689057000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.cambridge.org\/core\/product\/identifier\/S0263574724002194\/type\/journal_article"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,1,21]]},"references-count":37,"journal-issue":{"issue":"3","published-print":{"date-parts":[[2025,3]]}},"alternative-id":["S0263574724002194"],"URL":"https:\/\/doi.org\/10.1017\/s0263574724002194","relation":{},"ISSN":["0263-5747","1469-8668"],"issn-type":[{"type":"print","value":"0263-5747"},{"type":"electronic","value":"1469-8668"}],"subject":[],"published":{"date-parts":[[2025,1,21]]}}}