{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,4]],"date-time":"2026-03-04T08:50:08Z","timestamp":1772614208699,"version":"3.50.1"},"reference-count":47,"publisher":"MDPI AG","issue":"24","license":[{"start":{"date-parts":[[2020,12,10]],"date-time":"2020-12-10T00:00:00Z","timestamp":1607558400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100003329","name":"Ministerio de Econom\u00eda y Competitividad","doi-asserted-by":"publisher","award":["AGL2013-48297-C2-2-R"],"award-info":[{"award-number":["AGL2013-48297-C2-2-R"]}],"id":[{"id":"10.13039\/501100003329","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/100014440","name":"Ministerio de Ciencia, Innovaci\u00f3n y Universidades","doi-asserted-by":"publisher","award":["RTI2018-094222-B-I00"],"award-info":[{"award-number":["RTI2018-094222-B-I00"]}],"id":[{"id":"10.13039\/100014440","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>The use of 3D sensors combined with appropriate data processing and analysis has provided tools to optimise agricultural management through the application of precision agriculture. The recent development of low-cost RGB-Depth cameras has presented an opportunity to introduce 3D sensors into the agricultural community. However, due to the sensitivity of these sensors to highly illuminated environments, it is necessary to know under which conditions RGB-D sensors are capable of operating. This work presents a methodology to evaluate the performance of RGB-D sensors under different lighting and distance conditions, considering both geometrical and spectral (colour and NIR) features. The methodology was applied to evaluate the performance of the Microsoft Kinect v2 sensor in an apple orchard. 
The results show that sensor resolution and precision decreased significantly under moderate to high ambient illuminance (&gt;2000 lx). However, this effect was minimised when measurements were conducted closer to the target. In contrast, illuminance levels below 50 lx affected the quality of colour data and may require the use of artificial lighting. The methodology was useful for characterizing sensor performance throughout the full range of ambient conditions in commercial orchards. Although Kinect v2 was originally developed for indoor conditions, it performed well under a range of outdoor conditions.<\/jats:p>","DOI":"10.3390\/s20247072","type":"journal-article","created":{"date-parts":[[2020,12,10]],"date-time":"2020-12-10T08:59:34Z","timestamp":1607590774000},"page":"7072","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":35,"title":["Assessing the Performance of RGB-D Sensors for 3D Fruit Crop Canopy Characterization under Different Operating and Lighting Conditions"],"prefix":"10.3390","volume":"20","author":[{"ORCID":"https:\/\/orcid.org\/0000-0003-3915-3584","authenticated-orcid":false,"given":"Jordi","family":"Gen\u00e9-Mola","sequence":"first","affiliation":[{"name":"Research Group in AgroICT &amp; Precision Agriculture, Department of Agricultural and Forest Engineering, Universitat de Lleida (UdL)\u2013Agrotecnio Centre, Lleida, 25198 Catalonia, Spain"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-4625-1860","authenticated-orcid":false,"given":"Jordi","family":"Llorens","sequence":"additional","affiliation":[{"name":"Research Group in AgroICT &amp; Precision Agriculture, Department of Agricultural and Forest Engineering, Universitat de Lleida (UdL)\u2013Agrotecnio Centre, Lleida, 25198 Catalonia, Spain"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-6746-2830","authenticated-orcid":false,"given":"Joan R.","family":"Rosell-Polo","sequence":"additional","affiliation":[{"name":"Research Group in 
AgroICT &amp; Precision Agriculture, Department of Agricultural and Forest Engineering, Universitat de Lleida (UdL)\u2013Agrotecnio Centre, Lleida, 25198 Catalonia, Spain"}]},{"given":"Eduard","family":"Gregorio","sequence":"additional","affiliation":[{"name":"Research Group in AgroICT &amp; Precision Agriculture, Department of Agricultural and Forest Engineering, Universitat de Lleida (UdL)\u2013Agrotecnio Centre, Lleida, 25198 Catalonia, Spain"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-1179-8794","authenticated-orcid":false,"given":"Jaume","family":"Arn\u00f3","sequence":"additional","affiliation":[{"name":"Research Group in AgroICT &amp; Precision Agriculture, Department of Agricultural and Forest Engineering, Universitat de Lleida (UdL)\u2013Agrotecnio Centre, Lleida, 25198 Catalonia, Spain"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-6611-4009","authenticated-orcid":false,"given":"Francesc","family":"Solanelles","sequence":"additional","affiliation":[{"name":"Department of Agriculture, Livestock, Fisheries and Food, Generalitat de Catalunya, Lleida, 25198 Catalunya, Spain"}]},{"given":"Jos\u00e9 A.","family":"Mart\u00ednez-Casasnovas","sequence":"additional","affiliation":[{"name":"Research Group in AgroICT &amp; Precision Agriculture, Department of Environmental and Soil Sciences, Universitat de Lleida (UdL)\u2013Agrotecnio Centre, Lleida, 25198 Catalonia, Spain"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-9775-5471","authenticated-orcid":false,"given":"Alexandre","family":"Escol\u00e0","sequence":"additional","affiliation":[{"name":"Research Group in AgroICT &amp; Precision Agriculture, Department of Agricultural and Forest Engineering, Universitat de Lleida (UdL)\u2013Agrotecnio Centre, Lleida, 25198 Catalonia, Spain"}]}],"member":"1968","published-online":{"date-parts":[[2020,12,10]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"647","DOI":"10.1177\/0278364911434148","article-title":"RGB-D mapping: Using Kinect-style depth 
cameras for dense 3D modeling of indoor environments","volume":"31","author":"Henry","year":"2012","journal-title":"Int. J. Rob. Res."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"229","DOI":"10.1016\/j.agrformet.2018.06.017","article-title":"LIDAR and non-LIDAR-based canopy parameters to estimate the leaf area in fruit trees and vineyard","volume":"260","author":"Sanz","year":"2018","journal-title":"Agric. For. Meteorol."},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Gen\u00e9-Mola, J., Sanz-Cortiella, R., Rosell-Polo, J.R., Morros, J.-R., Ruiz-Hidalgo, J., Vilaplana, V., and Gregorio, E. (2020). Fruit detection and 3D location using instance segmentation neural networks and structure-from-motion photogrammetry. Comput. Electron. Agric., 169.","DOI":"10.1016\/j.compag.2019.105165"},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Sarbolandi, H., Lefloch, D., and Kolb, A. (2015). Kinect range sensing: Structured-light versus Time-of-Flight Kinect. Comput. Vis. Image Underst.","DOI":"10.1016\/j.cviu.2015.05.006"},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Dal Mutto, C., Zanuttigh, P., and Cortelazzo, G. (2012). Time-of-Flight Cameras and Microsoft KinectTM, Springer Science & Business Media.","DOI":"10.1007\/978-1-4614-3807-6"},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Giancola, S., Valenti, M., and Sala, R. (2018). A survey on 3D cameras: Metrological comparison of time-of-flight, structured-light and active stereoscopy technologies. Springer Briefs in Computer Science, Springer.","DOI":"10.1007\/978-3-319-91761-0"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"689","DOI":"10.1016\/j.compag.2019.05.016","article-title":"Multi-modal deep learning for Fuji apple detection using RGB-D cameras and their radiometric capabilities","volume":"162","author":"Vilaplana","year":"2019","journal-title":"Comput. Electron. 
Agric."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"33","DOI":"10.1016\/j.biosystemseng.2016.01.007","article-title":"Detection of red and bicoloured apples on tree with an RGB-D camera","volume":"146","author":"Nguyen","year":"2016","journal-title":"Biosyst. Eng."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"104","DOI":"10.1016\/j.compag.2015.02.001","article-title":"Obstacle detection in a greenhouse environment using the Kinect sensor","volume":"113","author":"Nissimov","year":"2015","journal-title":"Comput. Electron. Agric."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"392","DOI":"10.1016\/j.compag.2019.01.009","article-title":"Development and field evaluation of a strawberry harvesting robot with a cable-driven gripper","volume":"157","author":"Xiong","year":"2019","journal-title":"Comput. Electron. Agric."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"And\u00fajar, D., Dorado, J., Fern\u00e1ndez-Quintanilla, C., and Ribeiro, A. (2016). An approach to the use of depth cameras for weed volume estimation. Sensors, 16.","DOI":"10.3390\/s16070972"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"35","DOI":"10.1002\/rob.21897","article-title":"Automated crop plant detection based on the fusion of color and depth images for robotic weed control","volume":"37","author":"Gai","year":"2020","journal-title":"J. F. Robot."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"122","DOI":"10.1016\/j.compag.2011.12.007","article-title":"On the use of depth camera for 3D phenotyping of entire plants","volume":"82","author":"Rousseau","year":"2012","journal-title":"Comput. Electron. 
Agric."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"20463","DOI":"10.3390\/s150820463","article-title":"In situ 3D segmentation of individual plant leaves using a RGB-D camera for agricultural automation","volume":"15","author":"Xia","year":"2015","journal-title":"Sensors"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"4019","DOI":"10.3390\/s150204019","article-title":"Digitization and visualization of greenhouse tomato plants in indoor environments","volume":"15","author":"Li","year":"2015","journal-title":"Sensors"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"16216","DOI":"10.3390\/s131216216","article-title":"Assessing the potential of low-cost 3D cameras for the rapid measurement of plant woody structure","volume":"13","author":"Nock","year":"2013","journal-title":"Sensors"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"3001","DOI":"10.3390\/s140203001","article-title":"Low-cost 3D systems: Suitable tools for plant phenotyping","volume":"14","author":"Paulus","year":"2014","journal-title":"Sensors"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"2384","DOI":"10.3390\/s130202384","article-title":"Rapid characterization of vegetation structure with a microsoft kinect sensor","volume":"13","author":"Azzari","year":"2013","journal-title":"Sensors"},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"71","DOI":"10.1016\/bs.agron.2015.05.002","article-title":"Advances in Structured Light Sensors Applications in Precision Agriculture and Livestock Farming","volume":"133","author":"Cheein","year":"2015","journal-title":"Adv. 
Agron."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"12999","DOI":"10.3390\/s150612999","article-title":"Matching the best viewing angle in depth cameras for biomass estimation based on poplar seedling geometry","volume":"15","author":"Dorado","year":"2015","journal-title":"Sensors"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"67","DOI":"10.1016\/j.compag.2016.01.018","article-title":"Using depth cameras to extract structural parameters to assess the growth state and yield of cauliflower crops","volume":"122","author":"Ribeiro","year":"2016","journal-title":"Comput. Electron. Agric."},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"V\u00e1zquez-arellano, M., Griepentrog, H.W., Reiser, D., and Paraforos, D.S. (2016). 3-D Imaging Systems for Agricultural Applications\u2014A Review. Sensors, 16.","DOI":"10.3390\/s16050618"},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"H\u00e4mmerle, M., and H\u00f6fle, B. (2016). Direct derivation of maize plant and crop height from low-cost time-of-flight camera measurements. Plant Methods, 12.","DOI":"10.1186\/s13007-016-0150-6"},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"276","DOI":"10.1016\/j.compag.2018.09.006","article-title":"Determination of stem position and height of reconstructed maize plants using a time-of-flight camera","volume":"154","author":"Paraforos","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"86","DOI":"10.1016\/j.biosystemseng.2018.11.005","article-title":"Field-based architectural traits characterisation of maize plant using time-of-flight 3D imaging","volume":"178","author":"Bao","year":"2019","journal-title":"Biosyst. 
Eng."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"235","DOI":"10.1016\/j.compag.2018.01.002","article-title":"3-D reconstruction of maize plants using a time-of- fl ight camera","volume":"145","author":"Reiser","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"2420","DOI":"10.1109\/TMECH.2017.2663436","article-title":"Kinect v2 Sensor-based Mobile Terrestrial Laser Scanner for Agricultural Outdoor Applications","volume":"22","author":"Gregorio","year":"2017","journal-title":"IEEE\/ASME Trans. Mechatron."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Bengochea-Guevara, J.M., And\u00fajar, D., Sanchez-Sardana, F.L., Cantu\u00f1a, K., and Ribeiro, A. (2018). A low-cost approach to automatically obtain accurate 3D models of woody crops. Sensors, 18.","DOI":"10.3390\/s18010030"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"And\u00fajar, D., Dorado, J., Bengochea-Guevara, J.M., Conesa-Mu\u00f1oz, J., Fern\u00e1ndez-Quintanilla, C., and Ribeiro, \u00c1. (2017). Influence of Wind Speed on RGB-D Images in Tree Plantations. Sensors, 17.","DOI":"10.3390\/s17040914"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"386","DOI":"10.1016\/j.compag.2018.10.029","article-title":"Branch detection for apple trees trained in fruiting wall architecture using depth features and Regions-Convolutional Neural Network (R-CNN)","volume":"155","author":"Zhang","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"293","DOI":"10.1016\/j.compag.2018.11.026","article-title":"In-field high throughput grapevine phenotyping with a consumer-grade depth camera","volume":"156","author":"Milella","year":"2019","journal-title":"Comput. Electron. 
Agric."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"97","DOI":"10.1002\/rob.21876","article-title":"Semantic Mapping for Orchard Environments by Merging Two-Sides Reconstructions of Tree Rows","volume":"37","author":"Dong","year":"2018","journal-title":"J. F. Robot."},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Vit, A., and Shani, G. (2018). Comparing RGB-D Sensors for Close Range Outdoor Agricultural Phenotyping. Sensors, 18.","DOI":"10.20944\/preprints201810.0664.v1"},{"key":"ref_34","unstructured":"Gen\u00e9-Mola, J., Llorens, J., Rosell-Polo, J.R., Gregorio, E., Arn\u00f3, J., Solanelles-Batlle, F., Martinez-Casasnovas, J.A., and Escol\u00e0, A. (2020). KEvOr dataset. Zenodo."},{"key":"ref_35","unstructured":"Gen\u00e9-Mola, J., Llorens, J., Rosell-Polo, J.R., Gregorio, E., Arn\u00f3, J., Solanelles-Batlle, F., Martinez-Casasnovas, J.A., and Escol\u00e0, A. (2020). Matlab implementation to evaluate RGB-D sensor performance in orchard environments. GitHub Repos., in press."},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"171","DOI":"10.1016\/j.biosystemseng.2019.08.017","article-title":"Fruit detection in an apple orchard using a mobile terrestrial laser scanner","volume":"187","author":"Gregorio","year":"2019","journal-title":"Biosyst. Eng."},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"1975","DOI":"10.1109\/JSEN.2015.2508802","article-title":"Low-Cost Reflectance-Based Method for the Radiometric Calibration of Kinect 2","volume":"16","year":"2016","journal-title":"IEEE Sens. J."},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"124","DOI":"10.1016\/j.compag.2011.09.007","article-title":"A review of methods and applications of the geometric characterization of tree crops in agricultural activities","volume":"81","author":"Rosell","year":"2012","journal-title":"Comput. Electron. 
Agric."},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"104","DOI":"10.1016\/j.compag.2018.01.022","article-title":"Mechatronic terrestrial LiDAR for canopy porosity and crown surface estimation","volume":"146","author":"Pfeiffer","year":"2018","journal-title":"Comput. Electron. Agric."},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"231","DOI":"10.1016\/j.compag.2017.05.014","article-title":"Flexible system of multiple RGB-D sensors for measuring and classifying fruits in agri-food Industry","volume":"139","author":"Cheein","year":"2017","journal-title":"Comput. Electron. Agric."},{"key":"ref_41","doi-asserted-by":"crossref","first-page":"18587","DOI":"10.3390\/s150818587","article-title":"Structured light-based 3D reconstruction system for plants","volume":"15","author":"Nguyen","year":"2015","journal-title":"Sensors"},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"160","DOI":"10.1016\/j.compag.2013.11.011","article-title":"Estimating mango crop yield using image analysis using fruit at \u201cstone hardening\u201d stage and night time imaging","volume":"100","author":"Payne","year":"2014","journal-title":"Comput. Electron. Agric."},{"key":"ref_43","doi-asserted-by":"crossref","unstructured":"Li, N., Zhang, X., Zhang, C., Ge, L., He, Y., and Wu, X. (2019, January 6\u20138). Review of machine-vision-based plant detection technologies for robotic weeding. Proceedings of the 2019 IEEE International Conference on Robotics and Biomimetics (ROBIO), Dali, China.","DOI":"10.1109\/ROBIO49542.2019.8961381"},{"key":"ref_44","doi-asserted-by":"crossref","first-page":"1027","DOI":"10.1002\/rob.21937","article-title":"Development of a sweet pepper harvesting robot","volume":"37","author":"Arad","year":"2020","journal-title":"J. F. 
Robot."},{"key":"ref_45","doi-asserted-by":"crossref","first-page":"26","DOI":"10.1016\/j.compag.2015.10.022","article-title":"Apple crop-load estimation with over-the-row machine vision system","volume":"120","author":"Gongal","year":"2016","journal-title":"Comput. Electron. Agric."},{"key":"ref_46","doi-asserted-by":"crossref","first-page":"67","DOI":"10.1109\/MRA.2018.2852795","article-title":"An empirical evaluation of ten depth cameras: Bias, precision, lateral noise, different lighting conditions and materials, and multiple sensor setups in indoor environments","volume":"26","author":"Suchi","year":"2019","journal-title":"IEEE Robot. Autom. Mag."},{"key":"ref_47","doi-asserted-by":"crossref","first-page":"8741","DOI":"10.1109\/JSEN.2019.2920976","article-title":"Comparative study of intel R200, Kinect v2, and primesense RGB-D sensors performance outdoors","volume":"19","author":"Kuan","year":"2019","journal-title":"IEEE Sens. J."}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/20\/24\/7072\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T10:43:16Z","timestamp":1760179396000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/20\/24\/7072"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2020,12,10]]},"references-count":47,"journal-issue":{"issue":"24","published-online":{"date-parts":[[2020,12]]}},"alternative-id":["s20247072"],"URL":"https:\/\/doi.org\/10.3390\/s20247072","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2020,12,10]]}}}