{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,31]],"date-time":"2026-03-31T07:27:06Z","timestamp":1774942026420,"version":"3.50.1"},"reference-count":37,"publisher":"Springer Science and Business Media LLC","issue":"1","license":[{"start":{"date-parts":[[2022,8,30]],"date-time":"2022-08-30T00:00:00Z","timestamp":1661817600000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2022,8,30]],"date-time":"2022-08-30T00:00:00Z","timestamp":1661817600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"name":"Ministry of Sciences and Technology the People\u2019s Republic of China","award":["2019QZKK0301"],"award-info":[{"award-number":["2019QZKK0301"]}]},{"name":"Ministry of Sciences and Technology the People\u2019s Republic of China","award":["2021FY10070503"],"award-info":[{"award-number":["2021FY10070503"]}]},{"DOI":"10.13039\/501100002367","name":"Chinese Academy of Sciences","doi-asserted-by":"crossref","award":["KFJ-SW-YW043-4"],"award-info":[{"award-number":["KFJ-SW-YW043-4"]}],"id":[{"id":"10.13039\/501100002367","id-type":"DOI","asserted-by":"crossref"}]},{"DOI":"10.13039\/501100002367","name":"Chinese Academy of Sciences","doi-asserted-by":"crossref","award":["KFJ-SW-YW037-01"],"award-info":[{"award-number":["KFJ-SW-YW037-01"]}],"id":[{"id":"10.13039\/501100002367","id-type":"DOI","asserted-by":"crossref"}]},{"DOI":"10.13039\/501100002367","name":"Chinese Academy of Sciences","doi-asserted-by":"crossref","award":["XDA19050402"],"award-info":[{"award-number":["XDA19050402"]}],"id":[{"id":"10.13039\/501100002367","id-type":"DOI","asserted-by":"crossref"}]},{"DOI":"10.13039\/501100002367","name":"Chinese Academy of 
Sciences","doi-asserted-by":"crossref","award":["XDA26010101-4"],"award-info":[{"award-number":["XDA26010101-4"]}],"id":[{"id":"10.13039\/501100002367","id-type":"DOI","asserted-by":"crossref"}]},{"DOI":"10.13039\/501100002367","name":"Chinese Academy of Sciences","doi-asserted-by":"crossref","award":["XDA19020301"],"award-info":[{"award-number":["XDA19020301"]}],"id":[{"id":"10.13039\/501100002367","id-type":"DOI","asserted-by":"crossref"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["BMC Bioinformatics"],"abstract":"<jats:title>Abstract<\/jats:title><jats:sec><jats:title>Background<\/jats:title><jats:p>Fractional vegetation coverage (FVC) is a crucial parameter in determining vegetation structure. Automatic measurement of FVC using digital images captured by mobile smart devices is a potential direction for future research on field survey methods in plant ecology, and the choice of algorithm is crucial for accurate FVC measurement. However, there is a lack of insight into the influence of illumination on the accuracy of FVC measurements. Therefore, the main objective of this research is to assess the adaptiveness and performance of different algorithms under varying light conditions for FVC measurements and to deepen our understanding of the influence of illumination on FVC measurement.<\/jats:p><\/jats:sec><jats:sec><jats:title>Methods and results<\/jats:title><jats:p>Based on a literature survey, we selected four algorithms that have been reported to have high accuracy in automatic FVC measurements. 
The first algorithm (Fun01) identifies green plants based on the combination of<jats:inline-formula><jats:alternatives><jats:tex-math>$$R\/G$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mrow><mml:mi>R<\/mml:mi><mml:mo>\/<\/mml:mo><mml:mi>G<\/mml:mi><\/mml:mrow><\/mml:math><\/jats:alternatives><\/jats:inline-formula>,<jats:inline-formula><jats:alternatives><jats:tex-math>$$B\/G$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mrow><mml:mi>B<\/mml:mi><mml:mo>\/<\/mml:mo><mml:mi>G<\/mml:mi><\/mml:mrow><\/mml:math><\/jats:alternatives><\/jats:inline-formula>, and<jats:inline-formula><jats:alternatives><jats:tex-math>$$ExG$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mrow><mml:mi>ExG<\/mml:mi><\/mml:mrow><\/mml:math><\/jats:alternatives><\/jats:inline-formula>(<jats:inline-formula><jats:alternatives><jats:tex-math>$$R$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mi>R<\/mml:mi><\/mml:math><\/jats:alternatives><\/jats:inline-formula>,<jats:inline-formula><jats:alternatives><jats:tex-math>$$G$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mi>G<\/mml:mi><\/mml:math><\/jats:alternatives><\/jats:inline-formula>, and<jats:inline-formula><jats:alternatives><jats:tex-math>$$B$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mi>B<\/mml:mi><\/mml:math><\/jats:alternatives><\/jats:inline-formula>are the actual pixel digital numbers from the images based on each RGB channel,<jats:inline-formula><jats:alternatives><jats:tex-math>$$ExG$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mrow><mml:mi>ExG<\/mml:mi><\/mml:mrow><\/mml:math><\/jats:alternatives><\/jats:inline-formula>is the abbreviation of the Excess Green index), the second algorithm (Fun02) is a decision tree that uses color properties to 
discriminate plants from the background, the third algorithm (Fun03) uses<jats:inline-formula><jats:alternatives><jats:tex-math>$$ExG-ExR$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mrow><mml:mi>E<\/mml:mi><mml:mi>x<\/mml:mi><mml:mi>G<\/mml:mi><mml:mo>-<\/mml:mo><mml:mi>E<\/mml:mi><mml:mi>x<\/mml:mi><mml:mi>R<\/mml:mi><\/mml:mrow><\/mml:math><\/jats:alternatives><\/jats:inline-formula>(<jats:inline-formula><jats:alternatives><jats:tex-math>$$ExR$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mrow><mml:mi>ExR<\/mml:mi><\/mml:mrow><\/mml:math><\/jats:alternatives><\/jats:inline-formula>is the abbreviation of the Excess Red index) to recognize plants in the image, and the fourth algorithm (Fun04) uses<jats:inline-formula><jats:alternatives><jats:tex-math>$$ExG$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mrow><mml:mi>ExG<\/mml:mi><\/mml:mrow><\/mml:math><\/jats:alternatives><\/jats:inline-formula>and<jats:inline-formula><jats:alternatives><jats:tex-math>$$O{\\text{tsu}}$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mrow><mml:mi>O<\/mml:mi><mml:mtext>tsu<\/mml:mtext><\/mml:mrow><\/mml:math><\/jats:alternatives><\/jats:inline-formula>to separate the plants from the background.<jats:inline-formula><jats:alternatives><jats:tex-math>$$Otsu$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mrow><mml:mi>Otsu<\/mml:mi><\/mml:mrow><\/mml:math><\/jats:alternatives><\/jats:inline-formula>is an algorithm used to determine a threshold to transform the image into a binary image for the vegetation and background. We measured the FVC of several surveyed quadrats using these four algorithms under three scenarios, namely overcast sky, solar forenoon, and solar noon. 
FVC values obtained using the Photoshop-assisted manual identification method were used as a reference to assess the accuracy of the four algorithms selected. Results indicate that under the overcast sky scenario, Fun01 was more accurate than the other algorithms, and the MAPE (mean absolute percentage error), BIAS, relBIAS (relative BIAS), RMSE (root mean square error), and relRMSE (relative RMSE) were 8.68%, 1.3, 3.97, 3.13, and 12.33%, respectively. Under the solar forenoon scenario, Fun02 (decision tree) was more accurate than the other algorithms, and the MAPE, BIAS, relBIAS, RMSE, and relRMSE were 22.70%, \u2212\u20092.86, \u2212\u20097.70, 5.00, and 41.23%, respectively. Under the solar noon scenario, Fun02 was also more accurate than the other algorithms, and the MAPE, BIAS, relBIAS, RMSE, and relRMSE were 20.60%, \u2212\u20096.39, \u2212\u200920.67, 7.30, and 24.49%, respectively.<\/jats:p><\/jats:sec><jats:sec><jats:title>Conclusions<\/jats:title><jats:p>Given that each algorithm has its own optimal application scenario, among the four algorithms selected, Fun01 (the combination of<jats:inline-formula><jats:alternatives><jats:tex-math>$$R\/G$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mrow><mml:mi>R<\/mml:mi><mml:mo>\/<\/mml:mo><mml:mi>G<\/mml:mi><\/mml:mrow><\/mml:math><\/jats:alternatives><\/jats:inline-formula>,<jats:inline-formula><jats:alternatives><jats:tex-math>$$B\/G$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mrow><mml:mi>B<\/mml:mi><mml:mo>\/<\/mml:mo><mml:mi>G<\/mml:mi><\/mml:mrow><\/mml:math><\/jats:alternatives><\/jats:inline-formula>, and<jats:inline-formula><jats:alternatives><jats:tex-math>$$ExG$$<\/jats:tex-math><mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\"><mml:mrow><mml:mi>ExG<\/mml:mi><\/mml:mrow><\/mml:math><\/jats:alternatives><\/jats:inline-formula>) can be recommended for measuring FVC on cloudy days. 
Fun02 (decision tree) is more suitable for measuring the FVC on sunny days. However, it considerably underestimates the FVC in most cases. We expect the findings of this study to serve as a useful reference for automatic vegetation cover measurements.<\/jats:p><\/jats:sec>","DOI":"10.1186\/s12859-022-04886-6","type":"journal-article","created":{"date-parts":[[2022,8,30]],"date-time":"2022-08-30T12:02:52Z","timestamp":1661860972000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":5,"title":["Adaptiveness of RGB-image derived algorithms in the measurement of fractional vegetation coverage"],"prefix":"10.1186","volume":"23","author":[{"given":"Chuangye","family":"Song","sequence":"first","affiliation":[]},{"given":"Jiawen","family":"Sang","sequence":"additional","affiliation":[]},{"given":"Lin","family":"Zhang","sequence":"additional","affiliation":[]},{"given":"Huiming","family":"Liu","sequence":"additional","affiliation":[]},{"given":"Dongxiu","family":"Wu","sequence":"additional","affiliation":[]},{"given":"Weiying","family":"Yuan","sequence":"additional","affiliation":[]},{"given":"Chong","family":"Huang","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2022,8,30]]},"reference":[{"key":"4886_CR1","doi-asserted-by":"publisher","first-page":"3519","DOI":"10.1080\/014311698213795","volume":"19","author":"T Purevdorj","year":"2010","unstructured":"Purevdorj T, Tateishi R, Ishiyama T, Honda Y. Relationships between percent vegetation cover and vegetation indices. Int J Remote Sens. 2010;19:3519\u201335.","journal-title":"Int J Remote Sens"},{"key":"4886_CR2","volume-title":"Vegetation ecology","author":"YC Song","year":"2001","unstructured":"Song YC. Vegetation ecology. 
Shanghai: Huadong Normal University Press; 2001."},{"key":"4886_CR3","doi-asserted-by":"publisher","first-page":"437","DOI":"10.2307\/2259726","volume":"71","author":"JM Sykes","year":"1983","unstructured":"Sykes JM, Horril AD, Mountford MD. Use of visual cover assessments as quantitative estimators of British woodland taxa. J Ecol. 1983;71:437\u201350.","journal-title":"J Ecol"},{"issue":"6","key":"4886_CR4","first-page":"20","volume":"23","author":"ZG Chen","year":"2014","unstructured":"Chen ZG, Batunacun XZY, Hu YF. Measuring grassland cover using digital camera images. Acta Prataculturae Sinica. 2014;23(6):20\u20137.","journal-title":"Acta Prataculturae Sinica."},{"key":"4886_CR5","doi-asserted-by":"publisher","first-page":"29","DOI":"10.1556\/ComEc.4.2003.1.3","volume":"4","author":"I Hahn","year":"2003","unstructured":"Hahn I, Scheuring I. The effect of measurement scales on estimating vegetation cover: a computer assisted experiment. Community Ecol. 2003;4:29\u201333.","journal-title":"Community Ecol"},{"issue":"8","key":"4886_CR6","doi-asserted-by":"publisher","first-page":"10425","DOI":"10.3390\/rs70810425","volume":"7","author":"WJ Song","year":"2015","unstructured":"Song WJ, Mu XH, Yan GJ, Huang S. Extracting the green fractional vegetation cover from digital images using a Shadow-Resistant Algorithm (SHAR-LABFVC). Remote Sens. 2015;7(8):10425\u201310425.","journal-title":"Remote Sens"},{"issue":"4","key":"4886_CR7","first-page":"220","volume":"22","author":"CB Zhang","year":"2013","unstructured":"Zhang CB, Li JL, Zhang Y, Zhou W, Qian YR, Yang F. A quantitative analysis method for measuring grassland coverage based on RGB model. Acta Pratacul Sin. 2013;22(4):220\u20136.","journal-title":"Acta Pratacul Sin"},{"issue":"6","key":"4886_CR8","doi-asserted-by":"publisher","first-page":"2312","DOI":"10.2134\/agronj15.0150","volume":"107","author":"A Patrignani","year":"2015","unstructured":"Patrignani A, Ochsner TE. 
Canopeo: a powerful new tool for measuring fractional green canopy cover. Agron J. 2015;107(6):2312\u201320.","journal-title":"Agron J"},{"key":"4886_CR9","doi-asserted-by":"publisher","first-page":"238","DOI":"10.1016\/j.envexpbot.2008.09.013","volume":"65","author":"EA Graham","year":"2009","unstructured":"Graham EA, Yuen EM, Robertson GF, Kaiser WJ, Hamil Ton MP, Rundel PW. Budburst and leaf area expansion measured with a novel mobile camera system and simple color thresholding. Environ Exp Bot. 2009;65:238\u201344.","journal-title":"Environ Exp Bot."},{"key":"4886_CR10","doi-asserted-by":"publisher","first-page":"730","DOI":"10.2135\/cropsci2009.04.0233","volume":"50","author":"MD Richardson","year":"2010","unstructured":"Richardson MD, Karcher DE, Patton AJ, McCalla JH. Measurement of golf ball lie in various turfgrasses using digital image analysis. Crop Sci. 2010;50:730\u20136.","journal-title":"Crop Sci"},{"key":"4886_CR11","doi-asserted-by":"publisher","first-page":"406","DOI":"10.1111\/j.1654-1103.2011.01373.x","volume":"23","author":"YK Liu","year":"2012","unstructured":"Liu YK, Mu XH, Wang HX, Yan GJ. A novel method for extracting green fractional vegetation cover from digital images. J Veg Sci. 2012;23:406\u201318.","journal-title":"J Veg Sci"},{"key":"4886_CR12","doi-asserted-by":"crossref","unstructured":"Kataoka T, Kaneko T, Okamoto H, Hata S. Crop growth estimation system using machine vision. In: Proceedings of the IEEE\/ASME international conference on advanced intelligent mechatronics, Kobe, Japan, 20\u201324 July 2003; pp. b1079\u2013b1083.","DOI":"10.1109\/AIM.2003.1225492"},{"key":"4886_CR13","doi-asserted-by":"publisher","first-page":"49","DOI":"10.1016\/j.compag.2007.06.003","volume":"60","author":"C G\u00e9e","year":"2008","unstructured":"G\u00e9e C, Bossu J, Jones G, Truchetet F. Crop\/weed discrimination in perspective agronomic images. Comput Electron Agric. 
2008;60:49\u201359.","journal-title":"Comput Electron Agric"},{"key":"4886_CR14","doi-asserted-by":"publisher","first-page":"66","DOI":"10.1016\/j.compag.2005.11.002","volume":"51","author":"JC Neto","year":"2006","unstructured":"Neto JC, Meyer GE, Jones DD. Individual leaf extractions from young canopy images using Gustafson-Kessel clustering and a genetic algorithm. Comput Electron Agric. 2006;51:66\u201385.","journal-title":"Comput Electron Agric"},{"key":"4886_CR15","doi-asserted-by":"crossref","unstructured":"Kirci M, Gunes EO, Cakir Y, Senturk S. Vegetation measurement using image processing methods. In: Proceedings of the IEEE third international conference on agro-geoinformatics, Beijing, China, 11\u201314 August 2014; pp. 1\u20135.","DOI":"10.1109\/Agro-Geoinformatics.2014.6910608"},{"key":"4886_CR16","doi-asserted-by":"publisher","first-page":"21","DOI":"10.1016\/j.compag.2013.08.022","volume":"99","author":"XD Bai","year":"2013","unstructured":"Bai XD, Li CN, Zhang XF, Wang Y, Cao ZG, Yu ZH. Crop segmentation from images by morphology modeling in the CIE L*a*b* color space. Comput Electron Agric. 2013;99:21\u201334.","journal-title":"Comput Electron Agric"},{"issue":"17","key":"4886_CR17","doi-asserted-by":"publisher","first-page":"3457","DOI":"10.1080\/01431160010004504","volume":"22","author":"Q Zhou","year":"2001","unstructured":"Zhou Q, Robson M. Automated rangeland vegetation cover and density estimation using ground digital images and a spectral-contextual classifier. Remote Sensing. 2001;22(17):3457\u201370.","journal-title":"Remote Sensing"},{"issue":"6","key":"4886_CR18","doi-asserted-by":"publisher","first-page":"1179","DOI":"10.1002\/rob.21727","volume":"34","author":"O Bawden","year":"2017","unstructured":"Bawden O, Kulk J, Russell R, McCool C, English A, Dayoub F, Perez T. Robot for plant species specific weed management. J Field Robot. 
2017;34(6):1179\u201399.","journal-title":"J Field Robot"},{"issue":"1","key":"4886_CR19","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1016\/j.jaridenv.2006.08.016","volume":"69","author":"AS Laliberte","year":"2007","unstructured":"Laliberte AS, Rango A, Herrick JE, Fredrickson EL, Burkett L. An object-based image analysis approach for determining fractional cover of senescent and green vegetation with digital plot photography. J Arid Environ. 2007;69(1):1\u201314.","journal-title":"J Arid Environ"},{"issue":"1","key":"4886_CR20","doi-asserted-by":"publisher","first-page":"103","DOI":"10.1186\/s13007-017-0253-8","volume":"13","author":"P Sadeghi-Tehran","year":"2017","unstructured":"Sadeghi-Tehran P, Virlet N, Sabermanesh K, Malcolm JH. Multi-feature machine learning model for automatic segmentation of green fractional vegetation cover for high-throughput field phenotyping. Plant Methods. 2017;13(1):103.","journal-title":"Plant Methods"},{"issue":"7","key":"4886_CR21","doi-asserted-by":"publisher","first-page":"474","DOI":"10.3390\/rs8070474","volume":"8","author":"A Coy","year":"2016","unstructured":"Coy A, Dale R, Michael T, David N, Jane C. Increasing the accuracy and automation of fractional vegetation cover estimation from digital photographs. Remote Sens. 2016;8(7):474.","journal-title":"Remote Sens"},{"key":"4886_CR22","volume-title":"Introduction to remote sensing","author":"JB Campbell","year":"1996","unstructured":"Campbell JB. Introduction to remote sensing. New York: Guilford Press; 1996."},{"key":"4886_CR23","doi-asserted-by":"publisher","first-page":"276","DOI":"10.1614\/WS-D-10-00054.1","volume":"59","author":"RN Lati","year":"2011","unstructured":"Lati RN, Filin S, Eizenberg H. Robust methods for measurement of leaf-cover area and biomass from image data. Weed Sci. 
2011;59:276\u201384.","journal-title":"Weed Sci"},{"key":"4886_CR24","doi-asserted-by":"publisher","first-page":"341","DOI":"10.1080\/01904169909365631","volume":"22","author":"EV Lukina","year":"1999","unstructured":"Lukina EV, Stone ML, Rann WR. Estimating vegetation coverage in wheat using digital images. J Plant Nutr. 1999;22:341\u201350.","journal-title":"J Plant Nutr"},{"issue":"4","key":"4886_CR25","doi-asserted-by":"publisher","first-page":"519","DOI":"10.13031\/2013.16482","volume":"20","author":"GE Meyer","year":"2004","unstructured":"Meyer GE, Hindman TW, Jones DD, Mortensen DA. Digital camera operation and fuzzy logic classification of plant, soil, and residue color images. Appl Eng Agric. 2004;20(4):519\u201329.","journal-title":"Appl Eng Agric"},{"key":"4886_CR26","doi-asserted-by":"publisher","first-page":"675","DOI":"10.2111\/1551-5028(2004)057[0675:TNLCSF]2.0.CO;2","volume":"57","author":"DT Booth","year":"2004","unstructured":"Booth DT, Cox SE, Louhaichi M, Johnson DE. Technical note: lightweight camera stand for close-to-earth remote sensing. Rangel Ecol Manage. 2004;57:675\u20138.","journal-title":"Rangel Ecol Manage"},{"key":"4886_CR27","doi-asserted-by":"publisher","first-page":"190","DOI":"10.2307\/4003281","volume":"53","author":"JM Paruelo","year":"2000","unstructured":"Paruelo JM, Lauenroth WK, Roset PA. Technical note: estimating aboveground plant biomass using a photo-graphic technique. J Range Manage. 2000;53:190\u20133.","journal-title":"J Range Manage"},{"key":"4886_CR28","doi-asserted-by":"publisher","first-page":"282","DOI":"10.1016\/j.compag.2008.03.009","volume":"63","author":"GE Meyer","year":"2008","unstructured":"Meyer GE, Neto JC. Verification of color vegetation indices for automated crop imaging applications. Comput Electron Agric. 
2008;63:282\u201393.","journal-title":"Comput Electron Agric"},{"key":"4886_CR29","doi-asserted-by":"publisher","first-page":"323","DOI":"10.1007\/s00442-006-0657-z","volume":"152","author":"AD Richardson","year":"2007","unstructured":"Richardson AD, Jenkins JP, Braswell BH, Hollinger DY, Ollinger SV, Smith ML. Use of digital webcam images to track spring green-up in a deciduous broadleaf forest. Oecologia. 2007;152:323\u201334.","journal-title":"Oecologia"},{"issue":"1","key":"4886_CR30","doi-asserted-by":"publisher","first-page":"62","DOI":"10.1109\/TSMC.1979.4310076","volume":"9","author":"N Otsu","year":"1979","unstructured":"Otsu N. A threshold selection method from gray-level histograms. IEEE Trans Syst Man Cybern. 1979;9(1):62\u20136.","journal-title":"IEEE Trans Syst Man Cybern"},{"key":"4886_CR31","doi-asserted-by":"publisher","first-page":"405","DOI":"10.1111\/j.2041-210X.2011.00151.x","volume":"3","author":"C Macfarlane","year":"2012","unstructured":"Macfarlane C, Ogden GN. Automated estimation of foliage cover in forest understory from digital nadir images. Methods Ecol Evol. 2012;3:405\u201315.","journal-title":"Methods Ecol Evol"},{"key":"4886_CR32","doi-asserted-by":"publisher","first-page":"872","DOI":"10.1016\/j.isprsjprs.2011.08.005","volume":"66","author":"T Sakamoto","year":"2011","unstructured":"Sakamoto T, Shibayama M, Kimura A, Takada E. Assessment of digital camera-derived vegetation indices in quantitative monitoring of seasonal rice growth. ISPRS J Photogramm Remote Sens. 2011;66:872\u201382.","journal-title":"ISPRS J Photogramm Remote Sens"},{"issue":"6","key":"4886_CR33","doi-asserted-by":"publisher","first-page":"904","DOI":"10.3390\/electronics11060904","volume":"11","author":"TA Kumar","year":"2022","unstructured":"Kumar TA, Rajmohan R, Pavithra M, Ajagbe SA, Hodhod R, Gaber T. Automatic face mask detection system in public transportation in smart cities using IoT and deep learning. Electronics. 
2022;11(6):904.","journal-title":"Electronics"},{"issue":"53","key":"4886_CR34","doi-asserted-by":"publisher","first-page":"51","DOI":"10.19101\/IJACR.2021.1152001","volume":"11","author":"SA Ajagbe","year":"2021","unstructured":"Ajagbe SA, Amuda KA, Oladipupo MA, Afe O, Okesola K. Multi-classification of Alzheimer Disease on magnetic resonance images (MRI) using deep convolution neural network approaches. Int J Adv Comput Res. 2021;11(53):51\u201360.","journal-title":"Int J Adv Comput Res"},{"key":"4886_CR35","doi-asserted-by":"crossref","unstructured":"Awotunde JB, Ajagbe SA, Oladipupo MO, Awokola JA, Afolabi OS, Timothy MO, Oguns YJ. An Improved machine learnings diagnosis technique for COVID-19 pandemic using chest X-ray images. In: Florez H, Pollo-Cattaneo MF, editors. Applied informatics. ICAI 2021. Communications in computer and information science. 2021; 1455. Springer, Cham.","DOI":"10.1007\/978-3-030-89654-6_23"},{"key":"4886_CR36","doi-asserted-by":"publisher","first-page":"67","DOI":"10.1186\/s13007-021-00748-z","volume":"17","author":"CY Song","year":"2021","unstructured":"Song CY, Yang B, Zhang L, Wu DX. A handheld device for measuring the diameter at breast height of individual trees using laser ranging and deep-learning based image recognition. Plant Methods. 2021;17:67.","journal-title":"Plant Methods."},{"issue":"3","key":"4886_CR37","doi-asserted-by":"publisher","first-page":"211","DOI":"10.1007\/s11263-015-0816-y","volume":"115","author":"O Russakovsky","year":"2015","unstructured":"Russakovsky O, Deng J, Su H, Krause J, Satheesh S, Ma S, Huang ZH, Karpathy A, Khosla A, Bernstein M, Berg AC, Fei-Fei L. ImageNet large scale visual recognition challenge. Int J Comput Vis. 
2015;115(3):211\u201352.","journal-title":"Int J Comput Vis."}],"container-title":["BMC Bioinformatics"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1186\/s12859-022-04886-6.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1186\/s12859-022-04886-6\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1186\/s12859-022-04886-6.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,10,2]],"date-time":"2024-10-02T20:53:44Z","timestamp":1727902424000},"score":1,"resource":{"primary":{"URL":"https:\/\/bmcbioinformatics.biomedcentral.com\/articles\/10.1186\/s12859-022-04886-6"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,8,30]]},"references-count":37,"journal-issue":{"issue":"1","published-online":{"date-parts":[[2022,12]]}},"alternative-id":["4886"],"URL":"https:\/\/doi.org\/10.1186\/s12859-022-04886-6","relation":{},"ISSN":["1471-2105"],"issn-type":[{"value":"1471-2105","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,8,30]]},"assertion":[{"value":"6 May 2022","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"8 August 2022","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"30 August 2022","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The performance of this experiment was approved by the management institution of Beijing Botanical Garden, Chinese Academy of 
Sciences.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Ethics approval and consent to participate"}},{"value":"Not applicable.","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Consent for publication"}},{"value":"The authors declare no conflict of interest.","order":4,"name":"Ethics","group":{"name":"EthicsHeading","label":"Competing interests"}}],"article-number":"358"}}