{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,20]],"date-time":"2026-03-20T20:33:56Z","timestamp":1774038836992,"version":"3.50.1"},"reference-count":43,"publisher":"MDPI AG","issue":"4","license":[{"start":{"date-parts":[[2025,10,23]],"date-time":"2025-10-23T00:00:00Z","timestamp":1761177600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100011093","name":"University of Phayao","doi-asserted-by":"crossref","id":[{"id":"10.13039\/501100011093","id-type":"DOI","asserted-by":"crossref"}]},{"name":"Thailand Science Research and Innovation Fund"},{"name":"National Science, Research and Innovation Fund"},{"DOI":"10.13039\/501100007345","name":"King Mongkut\u2019s University of Technology North Bangkok","doi-asserted-by":"publisher","award":["KMUTNB-FF-68-B-02"],"award-info":[{"award-number":["KMUTNB-FF-68-B-02"]}],"id":[{"id":"10.13039\/501100007345","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Informatics"],"abstract":"<jats:p>This study focuses on human awareness, a critical component in human\u2013robot interaction, particularly within agricultural environments where interactions are enriched by complex contextual information. The main objective is identifying human activities occurring during collaborative harvesting tasks involving humans and robots. To achieve this, we propose a novel and lightweight deep learning model, named 1D-ResNeXt, designed explicitly for recognizing activities in agriculture-related human\u2013robot collaboration. The model is built as an end-to-end architecture incorporating feature fusion and a multi-kernel convolutional block strategy. It utilizes residual connections and a split\u2013transform\u2013merge mechanism to mitigate performance degradation and reduce model complexity by limiting the number of trainable parameters. Sensor data were collected from twenty individuals with five wearable devices placed on different body parts. Each sensor was embedded with tri-axial accelerometers, gyroscopes, and magnetometers. Under real field conditions, the participants performed several sub-tasks commonly associated with agricultural labor, such as lifting and carrying loads. Before classification, the raw sensor signals were pre-processed to eliminate noise. The cleaned time-series data were then input into the proposed deep learning network for sequential pattern recognition. Experimental results showed that the chest-mounted sensor achieved the highest F1-score of 99.86%, outperforming other sensor placements and combinations. An analysis of temporal window sizes (0.5, 1.0, 1.5, and 2.0 s) demonstrated that the 0.5 s window provided the best recognition performance, indicating that key activity features in agriculture can be captured over short intervals. Moreover, a comprehensive evaluation of sensor modalities revealed that multimodal fusion of accelerometer, gyroscope, and magnetometer data yielded the best accuracy at 99.92%. The combination of accelerometer and gyroscope data offered an optimal compromise, achieving 99.49% accuracy while maintaining lower system complexity. These findings highlight the importance of strategic sensor placement and data fusion in enhancing activity recognition performance while reducing the need for extensive data and computational resources. 
This work contributes to developing intelligent, efficient, and adaptive collaborative systems, offering promising applications in agriculture and beyond, with improved safety, cost-efficiency, and real-time operational capability.<\/jats:p>","DOI":"10.3390\/informatics12040115","type":"journal-article","created":{"date-parts":[[2025,10,23]],"date-time":"2025-10-23T09:44:30Z","timestamp":1761212670000},"page":"115","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":2,"title":["Efficient Wearable Sensor-Based Activity Recognition for Human\u2013Robot Collaboration in Agricultural Environments"],"prefix":"10.3390","volume":"12","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-3735-4262","authenticated-orcid":false,"given":"Sakorn","family":"Mekruksavanich","sequence":"first","affiliation":[{"name":"Department of Computer Engineering, School of Information and Communication Technology, University of Phayao, Phayao 56000, Thailand"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-5249-2786","authenticated-orcid":false,"given":"Anuchit","family":"Jitpattanakul","sequence":"additional","affiliation":[{"name":"Department of Mathematics, Faculty of Applied Science, King Mongkut\u2019s University of Technology North Bangkok, Bangkok 10800, Thailand"},{"name":"Intelligent and Nonlinear Dynamic Innovations Research Center, Science and Technology Research Institute, King Mongkut\u2019s University of Technology North Bangkok, Bangkok 10800, Thailand"}]}],"member":"1968","published-online":{"date-parts":[[2025,10,23]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"150","DOI":"10.3390\/agriengineering2010010","article-title":"An Extensive Review of Mobile Agricultural Robotics for Field Operations: Focus on Cotton Harvesting","volume":"2","author":"Fue","year":"2020","journal-title":"AgriEngineering"},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Tan, Y., Liu, X., Zhang, J., Wang, Y., and Hu, Y. (2025). A Review of Research on Fruit and Vegetable Picking Robots Based on Deep Learning. Sensors, 25.","DOI":"10.3390\/s25123677"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"1888","DOI":"10.1109\/TCYB.2019.2947532","article-title":"Physical Human\u2013Robot Collaboration: Robotic Systems, Learning Methods, Collaborative Strategies, Sensors, and Actuators","volume":"51","author":"Ogenyi","year":"2021","journal-title":"IEEE Trans. Cybern."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"125","DOI":"10.1109\/TAFE.2024.3366335","article-title":"Software Architecture for Agricultural Robots: Systems, Requirements, Challenges, Case Studies, and Future Perspectives","volume":"2","author":"Raja","year":"2024","journal-title":"IEEE Trans. AgriFood Electron."},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Han, J., and Conti, D. (2025). Recent Advances in Human\u2013Robot Interactions. Appl. Sci., 15.","DOI":"10.3390\/app15126850"},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Liu, H., Gamboa, H., and Schultz, T. (2024). Human Activity Recognition, Monitoring, and Analysis Facilitated by Novel and Widespread Applications of Sensors. Sensors, 24.","DOI":"10.3390\/s24165250"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"1965","DOI":"10.1007\/s11042-023-15443-5","article-title":"A review of vision-based indoor HAR: State-of-the-art, challenges, and future prospects","volume":"83","author":"Bhola","year":"2023","journal-title":"Multimed. 
Tools Appl."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"93450","DOI":"10.1109\/ACCESS.2024.3422831","article-title":"Simple to Complex, Single to Concurrent Sensor-Based Human Activity Recognition: Perception and Open Challenges","volume":"12","author":"Ankalaki","year":"2024","journal-title":"IEEE Access"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"2494","DOI":"10.3390\/agriengineering6030146","article-title":"Human\u2013Robot Interaction through Dynamic Movement Recognition for Agricultural Environments","volume":"6","author":"Moysiadis","year":"2024","journal-title":"AgriEngineering"},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"109363","DOI":"10.1016\/j.compag.2024.109363","article-title":"Advances in ground robotic technologies for site-specific weed management in precision agriculture: A review","volume":"225","author":"Upadhyay","year":"2024","journal-title":"Comput. Electron. Agric."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Moysiadis, V., Katikaridis, D., Benos, L., Busato, P., Anagnostis, A., Kateris, D., Pearson, S., and Bochtis, D. (2022). An Integrated Real-Time Hand Gesture Recognition Framework for Human\u2013Robot Interaction in Agriculture. Appl. Sci., 12.","DOI":"10.3390\/app12168160"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"104567","DOI":"10.1016\/j.robot.2023.104567","article-title":"A novel end-to-end vision-based architecture for agricultural human\u2013robot collaboration in fruit picking operations","volume":"172","author":"Pal","year":"2024","journal-title":"Robot. Auton. Syst."},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Thottempudi, P., Acharya, B., and Moreira, F. (2024). High-Performance Real-Time Human Activity Recognition Using Machine Learning. Mathematics, 12.","DOI":"10.3390\/math12223622"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Mekruksavanich, S., and Jitpattanakul, A. (2023). A Deep Learning Network with Aggregation Residual Transformation for Human Activity Recognition Using Inertial and Stretch Sensors. Computers, 12.","DOI":"10.3390\/computers12070141"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Aguileta, A.A., Brena, R.F., Mayora, O., Molino-Minero-Re, E., and Trejo, L.A. (2019). Multi-Sensor Fusion for Activity Recognition\u2014A Survey. Sensors, 19.","DOI":"10.3390\/s19173808"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"339","DOI":"10.1109\/TASE.2018.2874487","article-title":"A Sensor Fusion Approach to Indoor Human Localization Based on Environmental and Wearable Sensors","volume":"16","author":"Pham","year":"2019","journal-title":"IEEE Trans. Autom. Sci. Eng."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Mekruksavanich, S., Jitpattanakul, A., Youplao, P., and Yupapin, P. (2020). Enhanced Hand-Oriented Activity Recognition Based on Smartwatch Sensor Data Using LSTMs. Symmetry, 12.","DOI":"10.3390\/sym12091570"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Mekruksavanich, S., Jantawong, P., and Jitpattanakul, A. (2023, January 22\u201325). Deep Learning Approaches for HAR of Daily Living Activities Using IMU Sensors in Smart Glasses. 
Proceedings of the 2023 Joint International Conference on Digital Arts, Media and Technology with ECTI Northern Section Conference on Electrical, Electronics, Computer and Telecommunications Engineering (ECTI DAMT & NCON), Phuket, Thailand.","DOI":"10.1109\/ECTIDAMTNCON57770.2023.10139685"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Ye, X., Sakurai, K., Nair, N.K.C., and Wang, K.I.K. (2024). Machine Learning Techniques for Sensor-Based Human Activity Recognition with Data Heterogeneity\u2014A Review. Sensors, 24.","DOI":"10.3390\/s24247975"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"842","DOI":"10.3390\/make6020040","article-title":"A Comprehensive Survey on Deep Learning Methods in Human Activity Recognition","volume":"6","author":"Kaseris","year":"2024","journal-title":"Mach. Learn. Knowl. Extr."},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Lai, Y.C., Kan, Y.C., Hsu, K.C., and Lin, H.C. (2024). Multiple inputs modeling of hybrid convolutional neural networks for human activity recognition. Biomed. Signal Process. Control, 92.","DOI":"10.1016\/j.bspc.2024.106034"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Sassi Hidri, M., Hidri, A., Alsaif, S.A., Alahmari, M., and AlShehri, E. (2025). Enhancing Sensor-Based Human Physical Activity Recognition Using Deep Neural Networks. J. Sens. Actuator Netw., 14.","DOI":"10.3390\/jsan14020042"},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Pan, J., Hu, Z., Yin, S., and Li, M. (2022). GRU with Dual Attentions for Sensor-Based Human Activity Recognition. Electronics, 11.","DOI":"10.3390\/electronics11111797"},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Xie, S., Girshick, R., Doll\u00e1r, P., Tu, Z., and He, K. (2017, January 21\u201326). Aggregated Residual Transformations for Deep Neural Networks. Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA.","DOI":"10.1109\/CVPR.2017.634"},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"6171","DOI":"10.3390\/s90806171","article-title":"Monitoring System for Farming Operations with Wearable Devices Utilized Sensor Networks","volume":"9","author":"Fukatsu","year":"2009","journal-title":"Sensors"},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Yerebakan, M.O., and Hu, B. (2024). Wearable Sensors Assess the Effects of Human\u2013Robot Collaboration in Simulated Pollination. Sensors, 24.","DOI":"10.3390\/s24020577"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"123143","DOI":"10.1016\/j.eswa.2024.123143","article-title":"Human activity recognition with smartphone-integrated sensors: A survey","volume":"246","author":"Dentamaro","year":"2024","journal-title":"Expert Syst. Appl."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Uddin, M.Z., and Soylu, A. (2021). Human activity recognition using wearable sensors, discriminant analysis, and long short-term memory-based neural structured learning. Sci. Rep., 11.","DOI":"10.1038\/s41598-021-95947-y"},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"106637","DOI":"10.1016\/j.compag.2021.106637","article-title":"Worker safety in agriculture 4.0: A new approach for mapping operator\u2019s vibration risk through Machine Learning activity recognition","volume":"193","author":"Aiello","year":"2022","journal-title":"Comput. Electron. 
Agric."},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Tagarakis, A.C., Benos, L., Aivazidou, E., Anagnostis, A., Kateris, D., and Bochtis, D. (2021). Wearable Sensors for Identifying Activity Signatures in Human-Robot Collaborative Agricultural Environments. Eng. Proc., 9.","DOI":"10.3390\/engproc2021009005"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Anagnostis, A., Benos, L., Tsaopoulos, D., Tagarakis, A., Tsolakis, N., and Bochtis, D. (2021). Human Activity Recognition Through Recurrent Neural Networks for Human\u2013Robot Interaction in Agriculture. Appl. Sci., 11.","DOI":"10.3390\/app11052188"},{"key":"ref_32","first-page":"77","article-title":"Deep Learning for Sensor-based Human Activity Recognition: Overview, Challenges, and Opportunities","volume":"54","author":"Chen","year":"2021","journal-title":"ACM Comput. Surv."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"5671","DOI":"10.3934\/mbe.2022265","article-title":"RNN-based deep learning for physical activity recognition using smartwatch sensors: A case study of simple and complex activity recognition","volume":"19","author":"Mekruksavanich","year":"2022","journal-title":"Math. Biosci. Eng."},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Ord\u00f3\u00f1ez, F.J., and Roggen, D. (2016). Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition. Sensors, 16.","DOI":"10.3390\/s16010115"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Imran, H.A., Hamza, K., and Mehmood, Z. (2022, January 24\u201326). HARResNext: An efficient ResNext inspired network for human activity recognition with inertial sensors. Proceedings of the 2022 2nd International Conference on Digital Futures and Transformative Technologies (ICoDT2), Rawalpindi, Pakistan.","DOI":"10.1109\/ICoDT255437.2022.9787447"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Benos, L., Tsaopoulos, D., Tagarakis, A.C., Kateris, D., and Bochtis, D. (2024). Optimal Sensor Placement and Multimodal Fusion for Human Activity Recognition in Agricultural Tasks. Appl. Sci., 14.","DOI":"10.3390\/app14188520"},{"key":"ref_37","doi-asserted-by":"crossref","unstructured":"Slattery, P., Cofr\u00e9 Lizama, L.E., Wheat, J., Gastin, P., Dascombe, B., and Middleton, K. (2024). The Agreement between Wearable Sensors and Force Plates for the Analysis of Stride Time Variability. Sensors, 24.","DOI":"10.3390\/s24113378"},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/j.gaitpost.2024.04.006","article-title":"A novel method for accurate division of the gait cycle into seven phases using shank angular velocity","volume":"111","author":"Salminen","year":"2024","journal-title":"Gait Posture"},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"1936","DOI":"10.1007\/s10618-020-00710-y","article-title":"InceptionTime: Finding AlexNet for time series classification","volume":"34","author":"Lucas","year":"2020","journal-title":"Data Min. Knowl. Discov."},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Bragan\u00e7a, H., Colonna, J.G., Oliveira, H.A.B.F., and Souto, E. (2022). How Validation Methodology Influences Human Activity Recognition Mobile Systems. 
Sensors, 22.","DOI":"10.3390\/s22062360"},{"key":"ref_41","doi-asserted-by":"crossref","first-page":"24224","DOI":"10.1109\/JSEN.2024.3412736","article-title":"RepMobile: A MobileNet-Like Network With Structural Reparameterization for Sensor-Based Human Activity Recognition","volume":"24","author":"Yu","year":"2024","journal-title":"IEEE Sens. J."},{"key":"ref_42","doi-asserted-by":"crossref","unstructured":"Silpa, A.S., Benifa, J.B., Anu, K., and Vijayakumar, A. (2023, January 14\u201315). Human Activity Recognition Using Efficientnet-B0 Deep Learning Model. Proceedings of the 2023 Intelligent Computing and Control for Engineering and Business Systems (ICCEBS), Chennai, India.","DOI":"10.1109\/ICCEBS58601.2023.10448623"},{"key":"ref_43","doi-asserted-by":"crossref","unstructured":"Zhou, H., Zhao, Y., Liu, Y., Lu, S., An, X., and Liu, Q. (2023). Multi-Sensor Data Fusion and CNN-LSTM Model for Human Activity Recognition System. Sensors, 23.","DOI":"10.3390\/s23104750"}],"container-title":["Informatics"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2227-9709\/12\/4\/115\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,25]],"date-time":"2025-10-25T04:23:26Z","timestamp":1761366206000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2227-9709\/12\/4\/115"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,10,23]]},"references-count":43,"journal-issue":{"issue":"4","published-online":{"date-parts":[[2025,12]]}},"alternative-id":["informatics12040115"],"URL":"https:\/\/doi.org\/10.3390\/informatics12040115","relation":{},"ISSN":["2227-9709"],"issn-type":[{"value":"2227-9709","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,10,23]]}}}
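
The abstract in the record above describes 0.5 s sliding-window segmentation of wearable IMU signals, channel-level fusion of accelerometer, gyroscope, and magnetometer data, and a lightweight 1D-ResNeXt built from residual, split-transform-merge blocks. The sketch below is a minimal illustration of those ideas, not the authors' implementation: the 50 Hz sampling rate, the 9-channel single-sensor layout, the 50% window overlap, and every name and layer size here (`sliding_windows`, `ResNeXtBlock1D`, channel counts, cardinality) are assumptions chosen only for demonstration, since the record does not specify them.

```python
# Illustrative sketch only -- NOT the authors' released code.
# Assumptions (not stated in the record above): a 50 Hz sampling rate,
# one wearable unit streaming 9 channels (tri-axial accelerometer,
# gyroscope, magnetometer), 50% window overlap, and a grouped 1-D
# convolution as a stand-in for the split-transform-merge block.

import numpy as np
import torch
import torch.nn as nn

SAMPLING_HZ = 50                              # assumed sampling rate
WINDOW_S = 0.5                                # best-performing window length per the abstract
WINDOW_LEN = int(SAMPLING_HZ * WINDOW_S)      # 25 samples per window


def sliding_windows(signal: np.ndarray,
                    win: int = WINDOW_LEN,
                    stride: int = WINDOW_LEN // 2) -> np.ndarray:
    """Segment a (timesteps, channels) array into overlapping windows.

    Returns an array of shape (num_windows, channels, win) ready for Conv1d.
    """
    starts = range(0, signal.shape[0] - win + 1, stride)
    windows = np.stack([signal[s:s + win] for s in starts])  # (N, win, C)
    return windows.transpose(0, 2, 1)                        # (N, C, win)


class ResNeXtBlock1D(nn.Module):
    """Minimal 1-D residual block with grouped convolutions.

    The `groups` argument realises the split-transform-merge idea: input
    channels are split into `cardinality` groups, each group is transformed
    independently, the results are merged, and a residual connection adds
    the block input back to the merged output.
    """

    def __init__(self, channels: int = 64, cardinality: int = 8, kernel_size: int = 5):
        super().__init__()
        pad = kernel_size // 2
        self.body = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size, padding=pad, groups=cardinality),
            nn.BatchNorm1d(channels),
            nn.ReLU(inplace=True),
            nn.Conv1d(channels, channels, kernel_size, padding=pad, groups=cardinality),
            nn.BatchNorm1d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(x + self.body(x))      # residual connection


if __name__ == "__main__":
    # Fake 10 s of 9-channel IMU data (accel + gyro + mag), fused by channel concatenation.
    imu = np.random.randn(10 * SAMPLING_HZ, 9).astype(np.float32)
    batch = torch.from_numpy(sliding_windows(imu))            # (N, 9, 25)

    stem = nn.Conv1d(9, 64, kernel_size=3, padding=1)         # lift 9 channels to 64
    block = ResNeXtBlock1D(channels=64, cardinality=8)
    features = block(stem(batch))
    print(batch.shape, features.shape)                        # (N, 9, 25) -> (N, 64, 25)
```

Using a grouped convolution here is a standard, behaviour-equivalent way to express ResNeXt-style aggregated transformations; whether the paper's 1D-ResNeXt uses this exact formulation, or explicit parallel branches and multiple kernel sizes, is not determinable from the record above.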