{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,6]],"date-time":"2026-04-06T14:49:31Z","timestamp":1775486971708,"version":"3.50.1"},"reference-count":26,"publisher":"Springer Science and Business Media LLC","issue":"4","license":[{"start":{"date-parts":[[2021,9,28]],"date-time":"2021-09-28T00:00:00Z","timestamp":1632787200000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2021,9,28]],"date-time":"2021-09-28T00:00:00Z","timestamp":1632787200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"name":"National Key Research and Development Program of China","award":["2017YFD0701502"],"award-info":[{"award-number":["2017YFD0701502"]}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Complex Intell. Syst."],"published-print":{"date-parts":[[2022,8]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>For automating the harvesting of bunches of tomatoes in a greenhouse, the end-effector needs to reach the exact cutting point and adaptively adjust the pose of peduncles. In this paper, a method is proposed for peduncle cutting point localization and pose estimation. Images captured in real time at a fixed long-distance are detected using the YOLOv4-Tiny detector with a precision of 92.7% and a detection speed of 0.0091\u00a0s per frame, then the YOLACT\u2009+\u2009\u2009+\u2009Network with mAP of 73.1 and a time speed of 0.109\u00a0s per frame is used to segment the close-up distance. The segmented peduncle mask is fitted to the curve using least squares and three key points on the curve are found. Finally, a geometric model is established to estimate the pose of the peduncle with an average error of 4.98\u00b0 in yaw angle and 4.75\u00b0 in pitch angle over the 30 sets of tests.<\/jats:p>","DOI":"10.1007\/s40747-021-00522-7","type":"journal-article","created":{"date-parts":[[2021,9,28]],"date-time":"2021-09-28T07:08:59Z","timestamp":1632812939000},"page":"2955-2969","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":46,"title":["A peduncle detection method of tomato for autonomous harvesting"],"prefix":"10.1007","volume":"8","author":[{"given":"Jiacheng","family":"Rong","sequence":"first","affiliation":[]},{"given":"Guanglin","family":"Dai","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0003-3828-3053","authenticated-orcid":false,"given":"Pengbo","family":"Wang","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2021,9,28]]},"reference":[{"issue":"4","key":"522_CR1","first-page":"52","volume":"8","author":"L Fu","year":"2015","unstructured":"Fu L, Wang B, Cui Y, Su S, Gejima Y, Kobayashi T (2015) Kiwifruit recognition at nighttime using artificial lighting based on machine vision. Int J Agric Biol Eng 8(4):52\u201359","journal-title":"Int J Agric Biol Eng"},{"key":"522_CR2","doi-asserted-by":"publisher","first-page":"311","DOI":"10.1016\/j.compag.2016.06.022","volume":"127","author":"Y Zhao","year":"2016","unstructured":"Zhao Y, Gong L, Huang Y, Liu C (2016) A review of key techniques of vision-based control for harvesting robot. 
Comput Electron Agric 127:311\u2013323","journal-title":"Comput Electron Agric"},{"issue":"6","key":"522_CR3","doi-asserted-by":"publisher","first-page":"888","DOI":"10.1002\/rob.21525","volume":"31","author":"CW Bac","year":"2014","unstructured":"Bac CW, Henten EJ, Hemming J, Edan Y (2014) Harvesting robots for high-value crops: state-of-the-art review and challenges ahead. J Field Robot 31(6):888\u2013911","journal-title":"J Field Robot"},{"key":"522_CR4","doi-asserted-by":"publisher","first-page":"70","DOI":"10.1016\/j.compag.2018.02.016","volume":"147","author":"A Kamilaris","year":"2018","unstructured":"Kamilaris A, Prenafeta-Bold\u00fa FX (2018) Deep learning in agriculture: a survey. Comput Electron Agric 147:70\u201390","journal-title":"Comput Electron Agric"},{"issue":"6","key":"522_CR5","doi-asserted-by":"publisher","first-page":"1107","DOI":"10.1007\/s11119-019-09642-0","volume":"20","author":"A Koirala","year":"2019","unstructured":"Koirala A, Walsh KB, Wang Z, McCarthy C (2019) Deep learning for real-time fruit detection and orchard fruit load estimation: benchmarking of \u201cMangoYOLO.\u201d Precis Agric 20(6):1107\u20131135","journal-title":"Precis Agric"},{"key":"522_CR6","doi-asserted-by":"publisher","first-page":"417","DOI":"10.1016\/j.compag.2019.01.012","volume":"157","author":"Y Tian","year":"2019","unstructured":"Tian Y, Yang G, Wang Z, Wang H, Li E, Liang Z (2019) Apple detection during different growth stages in orchards using the improved YOLO-V3 model. Comput Electron Agr 157:417\u2013426","journal-title":"Comput Electron Agr"},{"issue":"7","key":"522_CR7","doi-asserted-by":"publisher","first-page":"2145","DOI":"10.3390\/s20072145","volume":"20","author":"G Liu","year":"2020","unstructured":"Liu G, Nouaze JC, Mbouembe PLT, Kim JH (2020) YOLO-Tomato: a robust algorithm for tomato detection based on YOLOv3. Sensors (Basel) 20(7):2145","journal-title":"Sensors (Basel)"},{"issue":"2","key":"522_CR8","doi-asserted-by":"publisher","first-page":"225","DOI":"10.1002\/rob.21888","volume":"37","author":"S Birrell","year":"2020","unstructured":"Birrell S, Hughes J, Cai J, Iida F (2020) A field-tested robotic harvesting system for iceberg lettuce. J Field Robot 37(2):225\u2013245","journal-title":"J Field Robot"},{"key":"522_CR9","doi-asserted-by":"publisher","first-page":"105736","DOI":"10.1016\/j.compag.2020.105736","volume":"178","author":"I Perez-Borrero","year":"2020","unstructured":"Perez-Borrero I, Marin-Santos D, Gegundez-Arias ME, Cortes-Ancos E (2020) A fast and accurate deep learning method for strawberry instance segmentation. Comput Electron Agric 178:105736","journal-title":"Comput Electron Agric"},{"key":"522_CR10","doi-asserted-by":"publisher","first-page":"105933","DOI":"10.1016\/j.compag.2020.105933","volume":"181","author":"Z Song","year":"2021","unstructured":"Song Z, Zhou Z, Wang W, Gao F, Fu L, Li R, Cui Y (2021) Canopy segmentation and wire reconstruction for kiwifruit robotic harvesting. Comput Electron Agric 181:105933","journal-title":"Comput Electron Agric"},{"key":"522_CR11","first-page":"13","volume":"2020","author":"W Chen","year":"2020","unstructured":"Chen W, Lu S, Liu B, Li G, Qian T (2020) Detecting citrus in orchard environment by using improved YOLOv4. 
Sci Program Neth 2020:13","journal-title":"Sci Program Neth"},{"issue":"2","key":"522_CR12","doi-asserted-by":"publisher","first-page":"765","DOI":"10.1109\/LRA.2017.2651952","volume":"2","author":"I Sa","year":"2017","unstructured":"Sa I, Lehnert C, English A, McCool C, Dayoub F, Upcroft B, Perez T (2017) Peduncle detection of sweet pepper for autonomous crop harvesting-combined color and 3-D information. IEEE Robot Autom Let 2(2):765\u2013772","journal-title":"IEEE Robot Autom Let"},{"key":"522_CR13","doi-asserted-by":"publisher","first-page":"130","DOI":"10.1016\/j.compind.2018.03.017","volume":"99","author":"L Luo","year":"2018","unstructured":"Luo L, Tang Y, Lu Q, Chen X, Zhang P, Zou X (2018) A vision methodology for harvesting robot to detect cutting points on peduncles of double overlapping grape clusters in a vineyard. Comput Ind 99:130\u2013139","journal-title":"Comput Ind"},{"issue":"2","key":"522_CR14","doi-asserted-by":"publisher","first-page":"180","DOI":"10.20965\/jrm.2018.p0180","volume":"30","author":"T Yoshida","year":"2018","unstructured":"Yoshida T, Fukao T, Hasegawa T (2018) Fast detection of tomato peduncle using point cloud with a harvesting robot. J Robot Mechatron 30(2):180\u2013186","journal-title":"J Robot Mechatron"},{"issue":"2","key":"522_CR15","doi-asserted-by":"publisher","first-page":"437","DOI":"10.20965\/jrm.2020.p0437","volume":"32","author":"T Yoshida","year":"2020","unstructured":"Yoshida T, Fukao T, Hasegawa T (2020) Cutting point detection using a robot with point clouds for tomato harvesting. J Robot Mechatron 32(2):437\u2013444","journal-title":"J Robot Mechatron"},{"key":"522_CR16","doi-asserted-by":"publisher","first-page":"26","DOI":"10.1016\/j.biosystemseng.2019.04.006","volume":"183","author":"R Barth","year":"2019","unstructured":"Barth R, Hemming J, Van Henten EJ (2019) Angle estimation between plant parts for grasp optimisation in harvest robots. Biosyst Eng 183:26\u201346","journal-title":"Biosyst Eng"},{"key":"522_CR17","doi-asserted-by":"publisher","first-page":"116556","DOI":"10.1109\/ACCESS.2020.3003034","volume":"8","author":"Y Yu","year":"2020","unstructured":"Yu Y, Zhang K, Liu H, Yang L, Zhang D (2020) Real-time visual localization of the picking points for a ridge-planting strawberry harvesting robot. IEEE Access 8:116556\u2013116568","journal-title":"IEEE Access"},{"key":"522_CR18","doi-asserted-by":"publisher","first-page":"105192","DOI":"10.1016\/j.compag.2019.105192","volume":"169","author":"CX Liang","year":"2020","unstructured":"Liang CX, Xiong JT, Zheng ZH, Zhong Z, Li ZH, Chen SM, Yang ZG (2020) A visual detection method for nighttime litchi fruits and fruiting stems. Comput Electron Agr 169:105192","journal-title":"Comput Electron Agr"},{"issue":"5","key":"522_CR19","doi-asserted-by":"publisher","first-page":"23","DOI":"10.1364\/NOMA.2018.NoW2J.3","volume":"18","author":"JZ Liu","year":"2018","unstructured":"Liu JZ, Yuan Y, Zhou Y, Zhu XX, Syed TN (2018) Experiments and analysis of close-shot identification of on-branch citrus fruit with RealSense. Sensors (Basel) 18(5):23","journal-title":"Sensors (Basel)"},{"key":"522_CR20","doi-asserted-by":"crossref","unstructured":"Redmon J, Divvala S, Girshick R, Farhadi A (2016) You only look once: unified, real-time object detection. 
In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp 779\u2013788","DOI":"10.1109\/CVPR.2016.91"},{"key":"522_CR21","doi-asserted-by":"crossref","unstructured":"Redmon J, Farhadi A (2017) YOLO9000: better, faster, stronger. In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp 7263\u20137271","DOI":"10.1109\/CVPR.2017.690"},{"key":"522_CR22","unstructured":"Redmon J, Farhadi A (2018) YOLOv3: an incremental improvement. 1804.02767"},{"key":"522_CR23","unstructured":"Bochkovskiy A, Wang C, Liao H (2020) Yolov4: Optimal speed and accuracy of object detection."},{"key":"522_CR24","doi-asserted-by":"crossref","unstructured":"Bolya D, Zhou C, Xiao F, Lee Y (2019) Yolact: real-time instance segmentation. In: Proceedings of the IEEE\/CVF international conference on computer vision, pp 9157\u20139166","DOI":"10.1109\/ICCV.2019.00925"},{"key":"522_CR25","doi-asserted-by":"crossref","unstructured":"Bolya D, Zhou C, Xiao F, et al (2019) YOLACT++: better real-time instance segmentation [J]","DOI":"10.1109\/ICCV.2019.00925"},{"key":"522_CR26","doi-asserted-by":"crossref","unstructured":"Neubeck A, Van Gool L (2006) Efficient non-maximum suppression. In: Eighteenth international conference on pattern recognition (ICPR\u201906), pp 850\u2013855","DOI":"10.1109\/ICPR.2006.479"}],"container-title":["Complex &amp; Intelligent Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s40747-021-00522-7.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s40747-021-00522-7\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s40747-021-00522-7.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2022,8,3]],"date-time":"2022-08-03T10:25:16Z","timestamp":1659522316000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s40747-021-00522-7"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,9,28]]},"references-count":26,"journal-issue":{"issue":"4","published-print":{"date-parts":[[2022,8]]}},"alternative-id":["522"],"URL":"https:\/\/doi.org\/10.1007\/s40747-021-00522-7","relation":{},"ISSN":["2199-4536","2198-6053"],"issn-type":[{"value":"2199-4536","type":"print"},{"value":"2198-6053","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,9,28]]},"assertion":[{"value":"29 March 2021","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"31 August 2021","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"28 September 2021","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors declare no conflict of interest.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interest"}},{"value":"Not applicable.","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Availability of data and materials"}},{"value":"Not applicable.","order":4,"name":"Ethics","group":{"name":"EthicsHeading","label":"Code availability"}},{"value":"Not 
applicable.","order":5,"name":"Ethics","group":{"name":"EthicsHeading","label":"Humans and\/or animal rights"}},{"value":"Not applicable.","order":6,"name":"Ethics","group":{"name":"EthicsHeading","label":"Ethics approval"}},{"value":"Not applicable.","order":7,"name":"Ethics","group":{"name":"EthicsHeading","label":"Consent to participate"}},{"value":"Not applicable.","order":8,"name":"Ethics","group":{"name":"EthicsHeading","label":"Consent for publication"}}]}}