{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,19]],"date-time":"2026-01-19T08:30:40Z","timestamp":1768811440683,"version":"3.49.0"},"reference-count":37,"publisher":"MDPI AG","issue":"7","license":[{"start":{"date-parts":[[2023,3,28]],"date-time":"2023-03-28T00:00:00Z","timestamp":1679961600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"European Union\u2019s Horizon 2020 research and innovation program","award":["857202"],"award-info":[{"award-number":["857202"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Applied Sciences"],"abstract":"<jats:p>Currently, Unmanned Aerial Vehicles (UAVs) are considered in the development of various applications in agriculture, which has led to the expansion of the agricultural UAV market. However, Nano Aerial Vehicles (NAVs) are still underutilised in agriculture. NAVs are characterised by a maximum wing length of 15 centimetres and a weight of fewer than 50 g. Due to their physical characteristics, NAVs have the advantage of being able to approach and perform tasks with more precision than conventional UAVs, making them suitable for precision agriculture. This work aims to contribute to an open-source solution known as Nano Aerial Bee (NAB) to enable further research and development on the use of NAVs in an agricultural context. The purpose of NAB is to mimic and assist bees in the context of pollination. We designed this open-source solution by taking into account the existing state-of-the-art solution and the requirements of pollination activities. This paper presents the relevant background and work carried out in this area by analysing papers on the topic of NAVs. The development of this prototype is rather complex given the interactions between the different hardware components and the need to achieve autonomous flight capable of pollination. 
We adequately describe and discuss these challenges in this work. Besides the open-source NAB solution, we train three different versions of YOLO (YOLOv5, YOLOv7, and YOLOR) on an original dataset (Flower Detection Dataset) containing 206 images of a group of eight flowers and on a public dataset (TensorFlow Flower Dataset), which we annotated to produce the TensorFlow Flower Detection Dataset. The models trained on the Flower Detection Dataset achieve satisfactory results, with YOLOv7 and YOLOR performing best at 98% precision, 99% recall, and 98% F1 score. The performance of these models is then evaluated on the TensorFlow Flower Detection Dataset to test their robustness. The three YOLO models are also trained on the TensorFlow Flower Detection Dataset to better understand the results. In this case, YOLOR obtains the most promising results, with 84% precision, 80% recall, and 82% F1 score. The results obtained using the Flower Detection Dataset are used for NAB guidance, detecting the relative position in an image, which defines the command the NAB executes.<\/jats:p>","DOI":"10.3390\/app13074265","type":"journal-article","created":{"date-parts":[[2023,3,28]],"date-time":"2023-03-28T02:11:40Z","timestamp":1679969500000},"page":"4265","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":10,"title":["Nano Aerial Vehicles for Tree Pollination"],"prefix":"10.3390","volume":"13","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-5387-0980","authenticated-orcid":false,"given":"Isabel","family":"Pinheiro","sequence":"first","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"},{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes e Alto Douro, 5000-801 Vila Real, 
Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-6909-0209","authenticated-orcid":false,"given":"Andr\u00e9","family":"Aguiar","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-3343-385X","authenticated-orcid":false,"given":"Andr\u00e9","family":"Figueiredo","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-1403-7221","authenticated-orcid":false,"given":"Tatiana","family":"Pinho","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-5798-1298","authenticated-orcid":false,"given":"Ant\u00f3nio","family":"Valente","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"},{"name":"School of Science and Technology, University of Tr\u00e1s-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8486-6113","authenticated-orcid":false,"given":"Filipe","family":"Santos","sequence":"additional","affiliation":[{"name":"INESC Technology and Science (INESC TEC), 4200-465 Porto, Portugal"}]}],"member":"1968","published-online":{"date-parts":[[2023,3,28]]},"reference":[{"key":"ref_1","unstructured":"EFSA (2021, December 27). Bee Health. 
Available online: https:\/\/www.efsa.europa.eu\/en\/topics\/topic\/bee-health."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"75","DOI":"10.1007\/s41055-016-0003-z","article-title":"Pollinators and global food security: The need for holistic global stewardship","volume":"1","author":"Vaage","year":"2016","journal-title":"Food Ethics"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"1","DOI":"10.17221\/7240-VETMED","article-title":"The foraging behaviour of honey bees, Apis mellifera: A review","volume":"59","year":"2014","journal-title":"Vet. Med."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"99","DOI":"10.1016\/j.paerosci.2017.04.003","article-title":"Classifications, applications, and design challenges of drones: A review","volume":"91","author":"Hassanalian","year":"2017","journal-title":"Prog. Aerosp. Sci."},{"key":"ref_5","unstructured":"Franklin, E. (2022, February 17). Robot Bees vs. Real Bees\u2014Why Tiny Drones Can\u2019t Compete with the Real Thing. Available online: https:\/\/theconversation.com\/robot-bees-vs-real-bees-why-tiny-drones-cant-compete-with-the-real-thing-72769."},{"key":"ref_6","unstructured":"Institute, W. (2022, February 17). RoboBees: Autonomous Flying Microrobots. Available online: https:\/\/wyss.harvard.edu\/technology\/robobees-autonomous-flying-microrobots\/."},{"key":"ref_7","unstructured":"Potenza, A. (2022, February 17). Bee Optimistic: This Drone Can Still Pollinate Plants even If All the Bees Die. Available online: https:\/\/www.theverge.com\/2017\/2\/9\/14549786\/drone-bees-artificial-pollinators-colony-collapse-disorder."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Redmon, J., Divvala, S., Girshick, R., and Farhadi, A. (2016, January 1\u201326). You only look once: Unified, real-time object detection. 
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.91"},{"key":"ref_9","unstructured":"Wang, C.Y., Bochkovskiy, A., and Liao, H.Y.M. (2022). YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. arXiv."},{"key":"ref_10","unstructured":"Wang, C.Y., Yeh, I.H., and Liao, H.Y.M. (2021). You only learn one representation: Unified network for multiple tasks. arXiv."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Wang, D., Liu, Z., Gu, X., Wu, W., Chen, Y., and Wang, L. (2022). Automatic detection of pothole distress in asphalt pavement using improved convolutional neural networks. Remote Sens., 14.","DOI":"10.3390\/rs14163892"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"104698","DOI":"10.1016\/j.autcon.2022.104698","article-title":"Automatic recognition of pavement cracks from combined GPR B-scan and C-scan images using multiscale feature fusion deep neural networks","volume":"146","author":"Liu","year":"2023","journal-title":"Autom. Constr."},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Bu\u010dko, B., Lieskovsk\u00e1, E., Z\u00e1bovsk\u00e1, K., and Z\u00e1bovsk\u1ef3, M. (2022). Computer Vision Based Pothole Detection under Challenging Conditions. Sensors, 22.","DOI":"10.3390\/s22228878"},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"3491","DOI":"10.3934\/mbe.2021175","article-title":"Infusion port level detection for intravenous infusion based on Yolo v3 neural network","volume":"18","author":"Huang","year":"2021","journal-title":"Math. Biosci. Eng."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Gallo, I., Rehman, A.U., Dehkordi, R.H., Landro, N., La Grassa, R., and Boschetti, M. (2023). Deep Object Detection of Crop Weeds: Performance of YOLOv7 on a Real Case Dataset from UAV Images. 
Remote Sens., 15.","DOI":"10.3390\/rs15020539"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"107576","DOI":"10.1016\/j.compag.2022.107576","article-title":"Small unopened cotton boll counting by detection with MRF-YOLO in the wild","volume":"204","author":"Liu","year":"2023","journal-title":"Comput. Electron. Agric."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Ahmad, I., Yang, Y., Yue, Y., Ye, C., Hassan, M., Cheng, X., Wu, Y., and Zhang, Y. (2022). Deep learning based detector YOLOv5 for identifying insect pests. Appl. Sci., 12.","DOI":"10.3390\/app121910167"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"29","DOI":"10.1016\/j.biosystemseng.2022.07.014","article-title":"Identification of the operating position and orientation of a robotic kiwifruit pollinator","volume":"222","author":"Li","year":"2022","journal-title":"Biosyst. Eng."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"417","DOI":"10.3233\/JBR-160138","article-title":"Kiwifruit pollination: The interaction between pollen quality, pollination systems and flowering stage","volume":"6","author":"Tacconi","year":"2016","journal-title":"J. Berry Res."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"107114","DOI":"10.1016\/j.compag.2022.107114","article-title":"Design of a lightweight robotic arm for kiwifruit pollination","volume":"198","author":"Li","year":"2022","journal-title":"Comput. Electron. Agric."},{"key":"ref_21","unstructured":"Lim, J., Ahn, H.S., Nejati, M., Bell, J., Williams, H., and MacDonald, B.A. (2020). Deep neural network based real-time kiwi fruit flower detection in an orchard environment. arXiv."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"100151","DOI":"10.1016\/j.atech.2022.100151","article-title":"Mask R-CNN based apple flower detection and king flower identification for precision pollination","volume":"4","author":"Mu","year":"2023","journal-title":"Smart Agric. 
Technol."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"107342","DOI":"10.1016\/j.compag.2022.107342","article-title":"Multi-class detection of kiwifruit flower and its distribution identification in orchard based on YOLOv5l and Euclidean distance","volume":"201","author":"Li","year":"2022","journal-title":"Comput. Electron. Agric."},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Hulens, D., Van Ranst, W., Cao, Y., and Goedem\u00e9, T. (2022). Autonomous Visual Navigation for a Flower Pollination Drone. Machines, 10.","DOI":"10.3390\/machines10050364"},{"key":"ref_25","first-page":"012086","article-title":"Simulation of multiple unmanned aerial vehicles for compensatory pollination in facility agriculture","volume":"Volume 2005","author":"Fan","year":"2021","journal-title":"Proceedings of the Journal of Physics: Conference Series"},{"key":"ref_26","unstructured":"Braithwaite, A., Alhinai, T., Haas-Heger, M., McFarlane, E., and Kova\u010d, M. (2018). Robotics Research, Springer."},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Burgu\u00e9s, J., Hern\u00e1ndez, V., Lilienthal, A.J., and Marco, S. (2019). Smelling nano aerial vehicle for gas source localization and mapping. Sensors, 19.","DOI":"10.3390\/s19030478"},{"key":"ref_28","first-page":"42","article-title":"Mixed reality and remote sensing application of unmanned aerial vehicle in fire and smoke detection","volume":"15","author":"Esfahlani","year":"2019","journal-title":"J. Ind. Inf. Integr."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Kang, K., Belkhale, S., Kahn, G., Abbeel, P., and Levine, S. (2019, January 20\u201324). Generalization through simulation: Integrating simulated and real data into deep reinforcement learning for vision-based autonomous flight. 
Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), IEEE, Montreal, QC, Canada.","DOI":"10.1109\/ICRA.2019.8793735"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Neumann, P.P., Hirschberger, P., Baurzhan, Z., Tiebe, C., Hofmann, M., H\u00fcllmann, D., and Bartholmai, M. (2019, January 26\u201329). Indoor air quality monitoring using flying nanobots: Design and experimental study. Proceedings of the 2019 IEEE International Symposium on Olfaction and Electronic Nose (ISOEN), IEEE, Fukuoka, Japan.","DOI":"10.1109\/ISOEN.2019.8823496"},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"470","DOI":"10.1016\/j.matpr.2019.03.151","article-title":"Concept of a gas-sensitive nano aerial robot swarm for indoor air quality monitoring","volume":"12","author":"Neumann","year":"2019","journal-title":"Mater. Today Proc."},{"key":"ref_32","unstructured":"Palossi, D., Gomez, A., Draskovic, S., Keller, K., Benini, L., and Thiele, L. (2017). Proceedings of the Computing Frontiers Conference, Association for Computing Machinery. CF\u201917."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"8357","DOI":"10.1109\/JIOT.2019.2917066","article-title":"A 64-mw dnn-based visual navigation engine for autonomous nano-drones","volume":"6","author":"Palossi","year":"2019","journal-title":"IEEE Internet Things J."},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Padilla, R., Passos, W.L., Dias, T.L., Netto, S.L., and Da Silva, E.A. (2021). A comparative analysis of object detection metrics with a companion open-source toolkit. Electronics, 10.","DOI":"10.3390\/electronics10030279"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Wu, S., Liu, J., Lei, X., Zhao, S., Lu, J., Jiang, Y., Xie, B., and Wang, M. (2022). Research Progress on Efficient Pollination Technology of Crops. 
Agronomy, 12.","DOI":"10.3390\/agronomy12112872"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Deng, L., Chu, H.H., Shi, P., Wang, W., and Kong, X. (2020). Region-based CNN method with deformable modules for visually classifying concrete cracks. Appl. Sci., 10.","DOI":"10.3390\/app10072528"},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"1914","DOI":"10.1111\/mice.12881","article-title":"Tiny-Crack-Net: A multiscale feature fusion network with attention mechanisms for segmentation of tiny cracks","volume":"37","author":"Chu","year":"2022","journal-title":"Comput.-Aided Civ. Infrastruct. Eng."}],"container-title":["Applied Sciences"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2076-3417\/13\/7\/4265\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T19:04:33Z","timestamp":1760123073000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2076-3417\/13\/7\/4265"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,3,28]]},"references-count":37,"journal-issue":{"issue":"7","published-online":{"date-parts":[[2023,4]]}},"alternative-id":["app13074265"],"URL":"https:\/\/doi.org\/10.3390\/app13074265","relation":{},"ISSN":["2076-3417"],"issn-type":[{"value":"2076-3417","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,3,28]]}}}