{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,18]],"date-time":"2026-01-18T05:38:35Z","timestamp":1768714715475,"version":"3.49.0"},"reference-count":26,"publisher":"MDPI AG","issue":"4","license":[{"start":{"date-parts":[[2024,2,15]],"date-time":"2024-02-15T00:00:00Z","timestamp":1707955200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>In this paper, we present and evaluate a calibration-free mobile eye-tracking system. The system\u2019s mobile device consists of three cameras: an IR eye camera, an RGB eye camera, and a front-scene RGB camera. The three cameras build a reliable corneal imaging system that is used to estimate the user\u2019s point of gaze continuously and reliably. The system auto-calibrates the device unobtrusively. Since the user is not required to follow any special instructions to calibrate the system, they can simply put on the eye tracker and start moving around using it. Deep learning algorithms together with 3D geometric computations were used to auto-calibrate the system per user. Once the model is built, a point-to-point transformation from the eye camera to the front camera is computed automatically by matching corneal and scene images, which allows the gaze point in the scene image to be estimated. The system was evaluated by users in real-life scenarios, indoors and outdoors. 
The average gaze error was 1.6\u2218 indoors and 1.69\u2218 outdoors, which is considered very good compared to state-of-the-art approaches.<\/jats:p>","DOI":"10.3390\/s24041237","type":"journal-article","created":{"date-parts":[[2024,2,15]],"date-time":"2024-02-15T05:55:59Z","timestamp":1707976559000},"page":"1237","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":2,"title":["Calibration-Free Mobile Eye-Tracking Using Corneal Imaging"],"prefix":"10.3390","volume":"24","author":[{"given":"Moayad","family":"Mokatren","sequence":"first","affiliation":[{"name":"The Department of Information Systems, University of Haifa, Haifa 3498838, Israel"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-0096-4240","authenticated-orcid":false,"given":"Tsvi","family":"Kuflik","sequence":"additional","affiliation":[{"name":"The Department of Information Systems, University of Haifa, Haifa 3498838, Israel"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-5276-0242","authenticated-orcid":false,"given":"Ilan","family":"Shimshoni","sequence":"additional","affiliation":[{"name":"The Department of Information Systems, University of Haifa, Haifa 3498838, Israel"}]}],"member":"1968","published-online":{"date-parts":[[2024,2,15]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"57","DOI":"10.1016\/j.compbiomed.2019.03.025","article-title":"Video-oculography eye tracking towards clinical applications: A review","volume":"108","author":"Larrazabal","year":"2019","journal-title":"Comput. Biol. Med."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"192","DOI":"10.1007\/s10916-020-01656-w","article-title":"Integrating eye-tracking to augmented reality system for surgical training","volume":"44","author":"Lu","year":"2020","journal-title":"J. Med. 
Syst."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"445","DOI":"10.1016\/j.jbusres.2017.09.028","article-title":"Combining virtual reality and mobile eye tracking to provide a naturalistic experimental environment for shopper research","volume":"100","author":"Pfeiffer","year":"2019","journal-title":"J. Bus. Res."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"5","DOI":"10.1186\/s41235-019-0156-5","article-title":"How does navigation system behavior influence human behavior?","volume":"4","author":"Richter","year":"2019","journal-title":"Cogn. Res. Princ. Implic."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"103842","DOI":"10.1016\/j.jesp.2019.103842","article-title":"Understanding cognitive and affective mechanisms in social psychology through eye-tracking","volume":"85","author":"Rahal","year":"2019","journal-title":"J. Exp. Soc. Psychol."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Lim, J.Z., Mountstephens, J., and Teo, J. (2020). Emotion recognition using eye-tracking: Taxonomy, review and current challenges. Sensors, 20.","DOI":"10.3390\/s20082384"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"528","DOI":"10.1016\/j.future.2017.07.007","article-title":"Exploring the potential of a mobile eye tracker as an intuitive indoor pointing device: A case study in cultural heritage","volume":"81","author":"Mokatren","year":"2018","journal-title":"Future Gener. Comput. Syst."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Kassner, M., Patera, W., and Bulling, A. (2014, January 13\u201317). Pupil: An open source platform for pervasive eye tracking and mobile gaze-based interaction. 
Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication, Seattle, WA, USA.","DOI":"10.1145\/2638728.2641695"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"478","DOI":"10.1109\/TPAMI.2009.30","article-title":"In the eye of the beholder: A survey of models for eyes and gaze","volume":"32","author":"Hansen","year":"2009","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"223","DOI":"10.1007\/s11263-017-1014-x","article-title":"Auto-calibrated gaze estimation using human gaze patterns","volume":"124","author":"Alnajar","year":"2017","journal-title":"Int. J. Comput. Vis."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"750","DOI":"10.1109\/THMS.2015.2400434","article-title":"Appearance-based gaze estimation with online calibration from mouse operations","volume":"45","author":"Sugano","year":"2015","journal-title":"IEEE Trans. Hum.-Mach. Syst."},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"B\u00e2ce, M., Staal, S., and S\u00f6r\u00f6s, G. (2018, January 14\u201317). Wearable eye tracker calibration at your fingertips. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, Warsaw, Poland.","DOI":"10.1145\/3204493.3204592"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"104207","DOI":"10.1109\/ACCESS.2020.2999633","article-title":"3D gaze estimation for head-mounted eye-tracking system with auto-calibration method","volume":"8","author":"Liu","year":"2020","journal-title":"IEEE Access"},{"key":"ref_14","unstructured":"Nishino, K., and Nayar, S.K. (July, January 27). The World in an Eye. Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Washington, DC, USA."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Mokatren, M., Kuflik, T., and Shimshoni, I. (2022). 3D Gaze Estimation Using RGB-IR Cameras. 
Sensors, 23.","DOI":"10.3390\/s23010381"},{"key":"ref_16","unstructured":"Tan, K.H., Kriegman, D.J., and Ahuja, N. (2002, January 4). Appearance-based eye gaze estimation. Proceedings of the Sixth IEEE Workshop on Applications of Computer Vision, 2002 (WACV 2002), Orlando, FL, USA."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"2033","DOI":"10.1109\/TPAMI.2014.2313123","article-title":"Adaptive linear regression for appearance-based gaze estimation","volume":"36","author":"Lu","year":"2014","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"503645","DOI":"10.1155\/2014\/503645","article-title":"Variations in eyeball diameters of the healthy adults","volume":"2014","author":"Bekerman","year":"2014","journal-title":"J. Ophthalmol."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"1330","DOI":"10.1109\/34.888718","article-title":"A flexible new technique for camera calibration","volume":"22","author":"Zhang","year":"2000","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Geyer, C., and Daniilidis, K. (1999, January 20\u201325). Catadioptric camera calibration. Proceedings of the Seventh IEEE International Conference on Computer Vision, Corfu, Greece.","DOI":"10.1109\/ICCV.1999.791248"},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"L\u00e9braly, P., Deymier, C., Ait-Aider, O., Royer, E., and Dhome, M. (2010, January 18\u201322). Flexible extrinsic calibration of non-overlapping cameras using a planar mirror: Application to vision-based robotics. Proceedings of the 2010 IEEE\/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan.","DOI":"10.1109\/IROS.2010.5651552"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Lowe, D.G. (1999, January 20\u201327). Object recognition from local scale-invariant features. 
Proceedings of the Seventh IEEE International Conference on Computer Vision, Kerkyra, Greece.","DOI":"10.1109\/ICCV.1999.790410"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"404","DOI":"10.1007\/11744023_32","article-title":"Surf: Speeded up robust features","volume":"3951","author":"Bay","year":"2006","journal-title":"Lect. Notes Comput. Sci."},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Rublee, E., Rabaud, V., Konolige, K., and Bradski, G. (2011, January 6\u201313). ORB: An efficient alternative to SIFT or SURF. Proceedings of the 2011 International Conference on Computer Vision, Barcelona, Spain.","DOI":"10.1109\/ICCV.2011.6126544"},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"381","DOI":"10.1145\/358669.358692","article-title":"Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography","volume":"24","author":"Fischler","year":"1981","journal-title":"Commun. ACM"},{"key":"ref_26","unstructured":"Abadi, M., Agarwal, A., Barham, P., Brevdo, E., Chen, Z., Citro, C., Corrado, G.S., Davis, A., Dean, J., and Devin, M. (2016). Tensorflow: Large-scale machine learning on heterogeneous distributed systems. 
arXiv."}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/24\/4\/1237\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T14:00:02Z","timestamp":1760104802000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/24\/4\/1237"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,2,15]]},"references-count":26,"journal-issue":{"issue":"4","published-online":{"date-parts":[[2024,2]]}},"alternative-id":["s24041237"],"URL":"https:\/\/doi.org\/10.3390\/s24041237","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,2,15]]}}}