{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,18]],"date-time":"2026-03-18T21:43:58Z","timestamp":1773870238962,"version":"3.50.1"},"reference-count":34,"publisher":"MDPI AG","issue":"2","license":[{"start":{"date-parts":[[2025,2,10]],"date-time":"2025-02-10T00:00:00Z","timestamp":1739145600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Computers"],
"abstract":"<jats:p>Gaze tracking and estimation are essential for understanding human behavior and enhancing human\u2013computer interactions. This study introduces an innovative, cost-effective solution for real-time gaze tracking using a standard webcam, providing a practical alternative to conventional methods that rely on expensive infrared (IR) cameras. Traditional approaches, such as Pupil Center Corneal Reflection (PCCR), require IR cameras to capture corneal reflections and iris glints, demanding high-resolution images and controlled environments. In contrast, the proposed method utilizes a convolutional neural network (CNN) trained on webcam-captured images to achieve precise gaze estimation. The developed deep learning model achieves a mean squared error (MSE) of 0.0112 and an accuracy of 90.98% through a novel trajectory-based accuracy evaluation system. This system involves an animation of a ball moving across the screen, with the user\u2019s gaze following the ball\u2019s motion. Accuracy is determined by calculating the proportion of gaze points falling within a predefined threshold based on the ball\u2019s radius, ensuring a comprehensive evaluation of the system\u2019s performance across all screen regions. Data collection is both simplified and effective, capturing images of the user\u2019s right eye while they focus on the screen. Additionally, the system includes advanced gaze analysis tools, such as heat maps, gaze fixation tracking, and blink rate monitoring, which are all integrated into an intuitive user interface. The robustness of this approach is further enhanced by incorporating Google\u2019s Mediapipe model for facial landmark detection, improving accuracy and reliability. The evaluation results demonstrate that the proposed method delivers high-accuracy gaze prediction without the need for expensive equipment, making it a practical and accessible solution for diverse applications in human\u2013computer interactions and behavioral research.<\/jats:p>",
"DOI":"10.3390\/computers14020057","type":"journal-article","created":{"date-parts":[[2025,2,12]],"date-time":"2025-02-12T04:37:04Z","timestamp":1739335024000},"page":"57","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":5,"title":["Real-Time Gaze Estimation Using Webcam-Based CNN Models for Human\u2013Computer Interactions"],"prefix":"10.3390","volume":"14",
"author":[{"given":"Visal","family":"Vidhya","sequence":"first","affiliation":[{"name":"School of Physics, Engineering and Computer Science, University of Hertfordshire, College Lane, Hertfordshire, Hatfield AL10 9AB, UK"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-2771-1713","authenticated-orcid":false,"given":"Diego","family":"Resende Faria","sequence":"additional","affiliation":[{"name":"School of Physics, Engineering and Computer Science, University of Hertfordshire, College Lane, Hertfordshire, Hatfield AL10 9AB, UK"}]}],"member":"1968","published-online":{"date-parts":[[2025,2,10]]},
"reference":[{"key":"ref_1","doi-asserted-by":"crossref","unstructured":"Clay, V., K\u00f6nig, P., and K\u00f6nig, S.U. (2019). Eye tracking in virtual reality. J. Eye Mov. Res., 12.","DOI":"10.16910\/jemr.12.1.3"},
{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Suzuki, Y., Shirahada, K., Kosaka, M., and Maki, A. (2012, January 2\u20134). A new marketing methodology by integrating brain measurement, eye tracking, and questionnaire analysis. Proceedings of the ICSSSM12, Shanghai, China.","DOI":"10.1109\/ICSSSM.2012.6252344"},
{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Rotariu, C., Costin, H., Bozomitu, R.G., Petroiu-Andruseac, G., Ursache, T.I., and Cojocaru, C.D. (2019, January 21\u201323). New assistive technology for communicating with disabled people based on gaze interaction. Proceedings of the 2019 E-Health and Bioengineering Conference (EHB), Iasi, Romania.","DOI":"10.1109\/EHB47216.2019.8969981"},
{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"De Silva, S., Dayarathna, S., Ariyarathne, G., Meedeniya, D., Jayarathna, S., Michalek, A.M.P., and Jayawardena, G. (2019, January 3\u20135). A Rule-Based System for ADHD Identification using Eye Movement Data. Proceedings of the 2019 Moratuwa Engineering Research Conference (MERCon), Moratuwa, Sri Lanka.","DOI":"10.1109\/MERCon.2019.8818865"},
{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Outram, B., Pai, Y.S., Person, T., Minamizawa, K., and Kunze, K. (2018, January 14\u201317). Anyorbit. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, Warsaw, Poland.","DOI":"10.1145\/3204493.3209579"},
{"key":"ref_6","first-page":"e34512","article-title":"Examining Visual Attention to Tobacco Marketing Materials Among Young Adult Smokers: Protocol for a Remote Webcam-Based Eye-Tracking Experiment","volume":"12","author":"Elhabashy","year":"2023","journal-title":"JMIR Res. Protoc."},
{"key":"ref_7","unstructured":"Zhu, Z., and Ji, Q. (2005, January 20\u201325). Eye Gaze Tracking under Natural Head Movements. Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR\u201905), San Diego, CA, USA."},
{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Macinnes, J.J., Iqbal, S., Pearson, J., and Johnson, E.N. (2018). Wearable Eye-tracking for Research: Automated dynamic gaze mapping and accuracy\/precision comparisons across devices. bioRxiv.","DOI":"10.1101\/299925"},
{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Wood, E., Baltrusaitis, T., Morency, L.-P., Robinson, P.N., and Bulling, A. (2016, January 14\u201317). Learning an appearance-based gaze estimator from one million synthesised images. Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, Charleston, SC, USA.","DOI":"10.1145\/2857491.2857492"},
{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Krafka, K., Khosla, A., Kellnhofer, P., Kannan, H., Bhandarkar, S., Matusik, W., and Torralba, A. (2016, January 27\u201330). Eye Tracking for Everyone. Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.239"},
{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Deng, H., and Zhu, W. (2017, January 22\u201329). Monocular Free-Head 3D Gaze Tracking with Deep Learning and Geometry Constraints. Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy.","DOI":"10.1109\/ICCV.2017.341"},
{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Jigang, L., Lee, S., and Rajan, D. (2019, January 11\u201313). Free-Head Appearance-Based Eye Gaze Estimation on Mobile Devices. Proceedings of the 2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), Okinawa, Japan.","DOI":"10.1109\/ICAIIC.2019.8669057"},
{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Chen, H.-H., Hwang, B.-J., Wu, J.-S., and Liu, P.-T. (2020). The Effect of Different Deep Network Architectures upon CNN-Based Gaze Tracking. Algorithms, 13.","DOI":"10.3390\/a13050127"},
{"key":"ref_14","doi-asserted-by":"crossref","first-page":"36","DOI":"10.24018\/ejece.2021.5.2.314","article-title":"Convolutional Neural Networks(CNN) based Eye-Gaze Tracking System using Machine Learning Algorithm","volume":"5","author":"Kanade","year":"2021","journal-title":"Eur. J. Electr. Eng. Comput. Sci."},
{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Ansari, M.F., Kasprowski, P., and Obetkal, M. (2021). Gaze Tracking Using an Unmodified Web Camera and Convolutional Neural Network. Appl. Sci., 11.","DOI":"10.3390\/app11199068"},
{"key":"ref_16","doi-asserted-by":"crossref","first-page":"409","DOI":"10.1080\/10494820.2022.2088561","article-title":"A robust, real-time camera-based eye gaze tracking system to analyze users\u2019 visual attention using deep learning","volume":"32","author":"Singh","year":"2022","journal-title":"Interact. Learn. Environ."},
{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Darapaneni, N., Prakash, M.D., Sau, B., Madineni, M., Jangwan, R., Paduri, A.R., Jairajan, K.P., Belsare, M., and Madhavankutty, P. (2022, January 24\u201326). Eye Tracking Analysis Using Convolutional Neural Network. Proceedings of the 2022 Interdisciplinary Research in Technology and Management (IRTM), Kolkata, India.","DOI":"10.1109\/IRTM54583.2022.9791826"},
{"key":"ref_18","doi-asserted-by":"crossref","first-page":"39103","DOI":"10.1007\/s11042-022-13085-7","article-title":"A CNN based real-time eye tracker for web mining applications","volume":"81","author":"Donuk","year":"2022","journal-title":"Multimed. Tools Appl."},
{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Zhang, X., Fan, C.-T., Yuan, S.M., and Peng, Z.-Y. (2015, January 19\u201321). An Advertisement Video Analysis System Based on Eye-Tracking. Proceedings of the 2015 IEEE International Conference on Smart City, Chengdu, China.","DOI":"10.1109\/SmartCity.2015.118"},
{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Lee, I., Cha, J., Seo, J., and Kwon, O. (2015, January 1\u20133). User interest visualizing and analysing system using eye gaze. Proceedings of the 2015 17th International Conference on Advanced Communication Technology (ICACT), PyeongChang, Republic of Korea.","DOI":"10.1109\/ICACT.2015.7224802"},
{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Okano, M., and Asakawa, M. (2017, January 8\u201310). Eye tracking analysis of consumer\u2019s attention to the product message of web advertisements and TV commercials. Proceedings of the 2017 5th International Conference on Cyber and IT Service Management (CITSM), Denpasar, Indonesia.","DOI":"10.1109\/CITSM.2017.8089270"},
{"key":"ref_22","doi-asserted-by":"crossref","first-page":"10699","DOI":"10.1109\/ACCESS.2018.2802206","article-title":"An Eye Tracking Analysis for Video Advertising: Relationship Between Advertisement Elements and Effectiveness","volume":"6","author":"Zhang","year":"2018","journal-title":"IEEE Access"},
{"key":"ref_23","first-page":"97","article-title":"Discovering prominent themes of the application of eye tracking technology in marketing research","volume":"22","year":"2022","journal-title":"Cuad. Gesti\u00f3n"},
{"key":"ref_24","doi-asserted-by":"crossref","first-page":"1489","DOI":"10.1007\/s10055-022-00642-6","article-title":"Real-time camera-based eye gaze tracking using convolutional neural network: A case study on social media website","volume":"26","author":"Modi","year":"2022","journal-title":"Virtual Real."},
{"key":"ref_25","doi-asserted-by":"crossref","first-page":"109","DOI":"10.1159\/000529114","article-title":"An Eye Tracking Investigation of Young People\u2019s Gaze Behaviour to Gambling and Non-Gambling Moving Adverts","volume":"29","author":"Onwuegbusi","year":"2023","journal-title":"Eur. Addict. Res."},
{"key":"ref_26","doi-asserted-by":"crossref","first-page":"397","DOI":"10.1080\/00913367.2023.2258388","article-title":"Understanding Consumers\u2019 Visual Attention in Mobile Advertisements: An Ambulatory Eye-Tracking Study with Machine Learning Techniques","volume":"53","author":"Xie","year":"2023","journal-title":"J. Advert."},
{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Tsubouchi, K., Taoka, K., Ikematsu, K., Yamanaka, S., Narumi, K., and Kawahara, Y. (2024, January 11\u201315). Eye-tracking AD: Cutting-Edge Web Advertising on Smartphone Aligned with User\u2019s Gaze. Proceedings of the 2024 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops), Biarritz, France.","DOI":"10.1109\/PerComWorkshops59983.2024.10502602"},
{"key":"ref_28","unstructured":"Soukupova, T., and Cech, J. (2016, January 3\u20135). Eye blink detection using facial landmarks. Proceedings of the 21st Computer Vision Winter Workshop, Rimske Toplice, Slovenia."},
{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Atchison, D.A. (2023). Optics of the Human Eye, CRC Press.","DOI":"10.1201\/9781003128601"},
{"key":"ref_30","unstructured":"Encyclopedia Britannica (2025, February 01). Human Eye\u2014Extraocular Muscles. Available online: https:\/\/www.britannica.com\/science\/human-eye\/Extraocular-muscles#\/media\/1\/1688997\/3421."},
{"key":"ref_31","doi-asserted-by":"crossref","first-page":"2037","DOI":"10.1007\/s11042-012-1220-z","article-title":"Gaze direction estimation using support vector machine with active appearance model","volume":"70","author":"Wu","year":"2012","journal-title":"Multimed. Tools Appl."},
{"key":"ref_32","doi-asserted-by":"crossref","first-page":"19581","DOI":"10.1109\/ACCESS.2017.2754299","article-title":"Webcam-Based Eye Movement Analysis Using CNN","volume":"5","author":"Meng","year":"2017","journal-title":"IEEE Access"},
{"key":"ref_33","doi-asserted-by":"crossref","first-page":"369","DOI":"10.1016\/j.neucom.2020.01.028","article-title":"Deep Gaze Pooling: Inferring and Visually Decoding Search Intents From Human Gaze Fixations","volume":"387","author":"Sattar","year":"2020","journal-title":"Neurocomputing"},
{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Ou, W.-L., Kuo, T.-L., Chang, C.-C., and Fan, C.-P. (2021). Deep-Learning-Based Pupil Center Detection and Tracking Technology for Visible-Light Wearable Gaze Tracking Devices. Appl. Sci., 11.","DOI":"10.3390\/app11020851"}],
"container-title":["Computers"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2073-431X\/14\/2\/57\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,9]],"date-time":"2025-10-09T16:30:16Z","timestamp":1760027416000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2073-431X\/14\/2\/57"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,2,10]]},"references-count":34,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2025,2]]}},"alternative-id":["computers14020057"],"URL":"https:\/\/doi.org\/10.3390\/computers14020057","relation":{},"ISSN":["2073-431X"],"issn-type":[{"value":"2073-431X","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,2,10]]}}}