{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,6]],"date-time":"2026-04-06T03:55:24Z","timestamp":1775447724487,"version":"3.50.1"},"reference-count":25,"publisher":"MDPI AG","issue":"9","license":[{"start":{"date-parts":[[2018,9,5]],"date-time":"2018-09-05T00:00:00Z","timestamp":1536105600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100000923","name":"Australian Research Council","doi-asserted-by":"publisher","award":["IH120100053"],"award-info":[{"award-number":["IH120100053"]}],"id":[{"id":"10.13039\/501100000923","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],
"abstract":"<jats:p>In sensory evaluation, there have been many attempts to obtain responses from the autonomic nervous system (ANS) by analyzing heart rate, body temperature, and facial expressions. However, the methods involved tend to be intrusive, which interferes with consumers\u2019 responses because they become more aware of the measurements. Furthermore, the existing methods to measure different ANS responses are not synchronized with each other, as each response is measured independently. This paper discusses the development of an integrated camera system, paired with an Android PC application, to assess sensory evaluation and biometric responses simultaneously in the Cloud, including heart rate, blood pressure, facial expressions, and skin-temperature changes, using video and thermal images acquired by the integrated system and analyzed through computer vision algorithms written in Matlab\u00ae and FaceReader\u2122. All results can be analyzed through customized codes for multivariate data analysis based on principal component analysis and cluster analysis. Data collected can also be used for machine-learning modeling, with biometrics as inputs and self-reported data as targets. Based on previous studies using this integrated camera and analysis system, it has been shown to be a reliable, accurate, and convenient technique to complement the traditional sensory analysis of both food and nonfood products to obtain more information from consumers and\/or trained panelists.<\/jats:p>",
"DOI":"10.3390\/s18092958","type":"journal-article","created":{"date-parts":[[2018,9,6]],"date-time":"2018-09-06T02:55:07Z","timestamp":1536202507000},"page":"2958","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":53,"title":["Development of a Biosensory Computer Application to Assess Physiological and Emotional Responses from Sensory Panelists"],"prefix":"10.3390","volume":"18",
"author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-0377-5085","authenticated-orcid":false,"given":"Sigfredo","family":"Fuentes","sequence":"first","affiliation":[{"name":"Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville, VIC 3010, Australia"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-9207-9307","authenticated-orcid":false,"given":"Claudia","family":"Gonzalez Viejo","sequence":"additional","affiliation":[{"name":"Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville, VIC 3010, Australia"}]},{"given":"Damir D.","family":"Torrico","sequence":"additional","affiliation":[{"name":"Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville, VIC 3010, Australia"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-3998-1240","authenticated-orcid":false,"given":"Frank R.","family":"Dunshea","sequence":"additional","affiliation":[{"name":"Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville, VIC 3010, Australia"}]}],"member":"1968","published-online":{"date-parts":[[2018,9,5]]},
"reference":[{"key":"ref_1","unstructured":"Moya, F.I., and Angulo, Y.B. (2001). An\u00e1lisis Sensorial de Alimentos: M\u00e9todos y Aplicaciones, Springer Iberica."},
{"key":"ref_2","doi-asserted-by":"crossref","first-page":"19","DOI":"10.1111\/j.1745-459X.1999.tb00102.x","article-title":"Note on computerized data collection in consumer sensory evaluation","volume":"14","author":"Plemmons","year":"1999","journal-title":"J. Sens. Stud."},
{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Gonzalez Viejo, C., Fuentes, S., Howell, K., Torrico, D.D., and Dunshea, F.R. (2018). Integration of non-invasive biometrics with sensory analysis techniques to assess acceptability of beer by consumers. Physiol. Behav.","DOI":"10.1016\/j.physbeh.2018.02.051"},
{"key":"ref_4","unstructured":"Stone, H., Bleibaum, R., and Thomas, H.A. (2012). Sensory Evaluation Practices, Elsevier\/Academic Press."},
{"key":"ref_5","unstructured":"Kemp, S., Hollowood, T., and Hort, J. (2011). Sensory Evaluation: A Practical Handbook, Wiley."},
{"key":"ref_6","doi-asserted-by":"crossref","first-page":"394","DOI":"10.1016\/j.biopsycho.2010.03.010","article-title":"Autonomic nervous system activity in emotion: A review","volume":"84","author":"Kreibig","year":"2010","journal-title":"Biol. Psychol."},
{"key":"ref_7","doi-asserted-by":"crossref","first-page":"72","DOI":"10.1016\/j.foodcont.2018.04.037","article-title":"Robotics and computer vision techniques combined with non-invasive consumer biometrics to assess quality traits from beer foamability using machine learning: A potential for artificial intelligence applications","volume":"92","author":"Viejo","year":"2018","journal-title":"Food Control"},
{"key":"ref_8","doi-asserted-by":"crossref","first-page":"60","DOI":"10.1016\/j.foodqual.2017.11.010","article-title":"Images and chocolate stimuli affect physiological and affective responses of consumers: A cross-cultural study","volume":"65","author":"Torrico","year":"2018","journal-title":"Food Qual. Preference"},
{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"De Wijk, R.A., He, W., Mensink, M.G., Verhoeven, R.H., and de Graaf, C. (2014). ANS responses and facial expressions differentiate between the taste of commercial breakfast drinks. PLoS ONE, 9.","DOI":"10.1371\/journal.pone.0093823"},
{"key":"ref_10","doi-asserted-by":"crossref","first-page":"196","DOI":"10.1016\/j.foodqual.2012.04.015","article-title":"Autonomic nervous system responses on and facial expressions to the sight, smell, and taste of liked and disliked foods","volume":"26","author":"Kooijman","year":"2012","journal-title":"Food Qual. Preference"},
{"key":"ref_11","doi-asserted-by":"crossref","first-page":"68","DOI":"10.1016\/j.foodqual.2017.02.006","article-title":"A comparison of self-reported emotional and implicit responses to aromas in beer","volume":"59","author":"Beyts","year":"2017","journal-title":"Food Qual. Preference"},
{"key":"ref_12","doi-asserted-by":"crossref","first-page":"478","DOI":"10.1016\/j.physbeh.2017.07.025","article-title":"Beyond expectations: The responses of the autonomic nervous system to visual food cues","volume":"179","year":"2017","journal-title":"Physiol. Behav."},
{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Taamneh, S., Dcosta, M., Kwon, K.-A., and Pavlidis, I. (2016, January 7\u201312). SubjectBook: Hypothesis-Driven Ubiquitous Visualization for Affective Studies. Proceedings of the CHI Conference Extended Abstracts on Human Factors in Computing Systems, San Jose, CA, USA.","DOI":"10.1145\/2851581.2892338"},
{"key":"ref_14","doi-asserted-by":"crossref","first-page":"170110","DOI":"10.1038\/sdata.2017.110","article-title":"A multimodal dataset for various forms of distracted driving","volume":"4","author":"Taamneh","year":"2017","journal-title":"Sci. Data"},
{"key":"ref_15","doi-asserted-by":"crossref","first-page":"21","DOI":"10.1186\/1743-0003-9-21","article-title":"A review of wearable sensors and systems with application in rehabilitation","volume":"9","author":"Patel","year":"2012","journal-title":"J. Neuroeng. Rehabil."},
{"key":"ref_16","doi-asserted-by":"crossref","first-page":"186","DOI":"10.1016\/j.measurement.2016.11.039","article-title":"Evaluation of psychological effects on human postural stability","volume":"98","author":"Frelih","year":"2017","journal-title":"Measurement"},
{"key":"ref_17","doi-asserted-by":"crossref","first-page":"150","DOI":"10.1016\/j.cviu.2006.11.018","article-title":"Interacting with human physiology","volume":"108","author":"Pavlidis","year":"2007","journal-title":"Comput. Vis. Image Underst."},
{"key":"ref_18","doi-asserted-by":"crossref","first-page":"167","DOI":"10.1016\/j.foodqual.2013.01.004","article-title":"Make a face! Implicit and explicit measurement of facial expressions elicited by orange juices using face reading technology","volume":"32","author":"Danner","year":"2014","journal-title":"Food Qual. Preference"},
{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Viejo, C.G., Fuentes, S., Torrico, D.D., and Dunshea, F.R. (2018). Non-Contact Heart Rate and Blood Pressure Estimations from Video Analysis and Machine Learning Modelling Applied to Food Sensory Responses: A Case Study for Chocolate. Sensors, 18.","DOI":"10.3390\/s18061802"},
{"key":"ref_20","unstructured":"Viola, P., and Jones, M. (2001, January 8\u201314). Rapid object detection using a boosted cascade of simple features. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Kauai, HI, USA."},
{"key":"ref_21","doi-asserted-by":"crossref","first-page":"468","DOI":"10.3758\/s13428-011-0064-1","article-title":"The Geneva affective picture database (GAPED): A new 730-picture database focusing on valence and normative significance","volume":"43","author":"Scherer","year":"2011","journal-title":"Behav. Res. Methods"},
{"key":"ref_22","unstructured":"Gunaratne, N.M., Gonzalez Viejo, C., Gunaratne, T.M., Torrico, D.D., Ashman, H., and Dunshea, F.R. Image stimuli affect self-reported emotions and biometric facial expression responses. J. Sens. Stud., submitted."},
{"key":"ref_23","doi-asserted-by":"crossref","first-page":"475","DOI":"10.1016\/j.lwt.2017.10.048","article-title":"Analysis of thermochromic label elements and colour transitions using sensory acceptability and eye tracking techniques","volume":"89","author":"Torrico","year":"2018","journal-title":"LWT"},
{"key":"ref_24","doi-asserted-by":"crossref","first-page":"1381","DOI":"10.1111\/1750-3841.14114","article-title":"Assessment of Beer Quality Based on a Robotic Pourer, Computer Vision, and Machine Learning Algorithms Using Commercial Beers","volume":"83","author":"Fuentes","year":"2018","journal-title":"J. Food Sci."},
{"key":"ref_25","doi-asserted-by":"crossref","first-page":"504","DOI":"10.1016\/j.foodres.2016.08.045","article-title":"Development of a robotic pourer constructed with ubiquitous materials, open hardware and sensors to assess beer foam quality using computer vision and pattern recognition algorithms: RoboBEER","volume":"89","author":"Fuentes","year":"2016","journal-title":"Food Res. Int."}],
"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/18\/9\/2958\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T15:18:59Z","timestamp":1760195939000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/18\/9\/2958"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2018,9,5]]},"references-count":25,"journal-issue":{"issue":"9","published-online":{"date-parts":[[2018,9]]}},"alternative-id":["s18092958"],"URL":"https:\/\/doi.org\/10.3390\/s18092958","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2018,9,5]]}}}