{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,22]],"date-time":"2026-04-22T20:32:17Z","timestamp":1776889937356,"version":"3.51.2"},"reference-count":39,"publisher":"MDPI AG","issue":"18","license":[{"start":{"date-parts":[[2021,9,18]],"date-time":"2021-09-18T00:00:00Z","timestamp":1631923200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>Monitoring driver attention through gaze estimation is a typical approach in road scenes. This indicator is of great importance for safe driving, especially in Level 3 and Level 4 automation systems, where the take-over request control strategy could be based on the driver\u2019s gaze estimation. However, state-of-the-art gaze estimation techniques are intrusive and costly, which limits their use in real vehicles. To test this kind of application, some databases focus on critical situations in simulation, but they do not show real accidents because of the complexity and danger of recording them. Within this context, this paper presents a low-cost, non-intrusive, camera-based gaze mapping system that integrates the open-source state-of-the-art OpenFace 2.0 Toolkit to visualize, through a heat map, the driver focalization on a database of recorded real traffic scenes. A NARMAX (Nonlinear AutoRegressive Moving Average model with eXogenous inputs) model establishes the correspondence between the OpenFace 2.0 parameters and the screen region the user is looking at. This proposal improves on our previous work, which was based on a linear approximation using a projection matrix. 
The proposal has been validated on the recent and challenging public database DADA2000, which contains 2000 video sequences of annotated driving scenarios based on real accidents. We compare our proposal with our previous one and with an expensive desktop-mounted eye-tracker, obtaining comparable results. We show that this method can be used to record driver attention databases.<\/jats:p>","DOI":"10.3390\/s21186262","type":"journal-article","created":{"date-parts":[[2021,9,21]],"date-time":"2021-09-21T22:35:20Z","timestamp":1632263720000},"page":"6262","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":12,"title":["Gaze Focalization System for Driving Applications Using OpenFace 2.0 Toolkit with NARMAX Algorithm in Accidental Scenarios"],"prefix":"10.3390","volume":"21","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-5101-0485","authenticated-orcid":false,"given":"Javier","family":"Araluce","sequence":"first","affiliation":[{"name":"Electronics Department, University of Alcal\u00e1, 28801 Alcal\u00e1 de Henares, Spain"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-0087-3077","authenticated-orcid":false,"given":"Luis M.","family":"Bergasa","sequence":"additional","affiliation":[{"name":"Electronics Department, University of Alcal\u00e1, 28801 Alcal\u00e1 de Henares, Spain"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8875-1866","authenticated-orcid":false,"given":"Manuel","family":"Oca\u00f1a","sequence":"additional","affiliation":[{"name":"Electronics Department, University of Alcal\u00e1, 28801 Alcal\u00e1 de Henares, Spain"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8145-9045","authenticated-orcid":false,"given":"Elena","family":"L\u00f3pez-Guill\u00e9n","sequence":"additional","affiliation":[{"name":"Electronics Department, University of Alcal\u00e1, 28801 Alcal\u00e1 de Henares, Spain"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-2550-5972","authenticated-orcid":false,"given":"Pedro 
A.","family":"Revenga","sequence":"additional","affiliation":[{"name":"Electronics Department, University of Alcal\u00e1, 28801 Alcal\u00e1 de Henares, Spain"}]},{"given":"J. Felipe","family":"Arango","sequence":"additional","affiliation":[{"name":"Electronics Department, University of Alcal\u00e1, 28801 Alcal\u00e1 de Henares, Spain"}]},{"given":"Oscar","family":"P\u00e9rez","sequence":"additional","affiliation":[{"name":"Electronics Department, University of Alcal\u00e1, 28801 Alcal\u00e1 de Henares, Spain"}]}],"member":"1968","published-online":{"date-parts":[[2021,9,18]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","unstructured":"Baltrusaitis, T., Zadeh, A., Lim, Y.C., and Morency, L.P. (2018, January 15\u201319). OpenFace 2.0: Facial behavior analysis toolkit. Proceedings of the 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018), Xi\u2019an, China.","DOI":"10.1109\/FG.2018.00019"},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Araluce, J., Bergasa, L.M., G\u00f3mez-Hu\u00e9lamo, C., Barea, R., L\u00f3pez-Guill\u00e9n, E., Arango, F., and P\u00e9rez-Gil, \u00d3. (2020). Integrating OpenFace 2.0 Toolkit for Driver Attention Estimation in Challenging Accidental Scenarios. Workshop of Physical Agents, Springer.","DOI":"10.1007\/978-3-030-62579-5_19"},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Fang, J., Yan, D., Qiao, J., Xue, J., Wang, H., and Li, S. (2019, January 27\u201330). DADA-2000: Can Driving Accident be Predicted by Driver Attention? Analyzed by A Benchmark. Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand.","DOI":"10.1109\/ITSC.2019.8917218"},{"key":"ref_4","unstructured":"SAE On-Road Automated Vehicle Standards Committee (2014). Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems J3016_201401. SAE Stand. J., 3016, 1\u201316."},{"key":"ref_5","unstructured":"Jimenez, F. 
(2017). Intelligent Vehicles: Enabling Technologies and Future Developments, Butterworth-Heinemann."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"4318","DOI":"10.1109\/TITS.2019.2939676","article-title":"A Dual-Cameras-Based Driver Gaze Mapping System With an Application on Non-Driving Activities Monitoring","volume":"21","author":"Yang","year":"2019","journal-title":"IEEE Trans. Intell. Transp. Syst."},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Dalmaijer, E., Math\u00f4t, S., and Stigchel, S. (2013). PyGaze: An open-source, cross-platform toolbox for minimal-effort programming of eyetracking experiments. Behav. Res. Methods, 46.","DOI":"10.3758\/s13428-013-0422-2"},{"key":"ref_8","first-page":"2055668318773991","article-title":"Head-mounted eye gaze tracking devices: An overview of modern devices and recent advances","volume":"5","author":"Cognolato","year":"2018","journal-title":"J. Rehabil. Assist. Technol. Eng."},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Shen, J., Zafeiriou, S., Chrysos, G.G., Kossaifi, J., Tzimiropoulos, G., and Pantic, M. (2015, January 7\u201313). The First Facial Landmark Tracking in-the-Wild Challenge: Benchmark and Results. Proceedings of the 2015 IEEE International Conference on Computer Vision Workshop (ICCVW), Santiago, Chile.","DOI":"10.1109\/ICCVW.2015.132"},{"key":"ref_10","unstructured":"Xia, Y., Zhang, D., Kim, J., Nakayama, K., Zipser, K., and Whitney, D. (2018). Predicting driver attention in critical situations. Asian Conference on Computer Vision, Springer."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Mizuno, N., Yoshizawa, A., Hayashi, A., and Ishikawa, T. (2017, January 26\u201328). Detecting driver\u2019s visual attention area by using vehicle-mounted device. 
Proceedings of the 2017 IEEE 16th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC), Oxford, UK.","DOI":"10.1109\/ICCI-CC.2017.8109772"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"2014","DOI":"10.1109\/TITS.2015.2396031","article-title":"Driver gaze tracking and eyes off the road detection system","volume":"16","author":"Vicente","year":"2015","journal-title":"IEEE Trans. Intell. Transp. Syst."},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Naqvi, R.A., Arsalan, M., Batchuluun, G., Yoon, H.S., and Park, K.R. (2018). Deep learning-based gaze detection system for automobile drivers using a NIR camera sensor. Sensors, 18.","DOI":"10.3390\/s18020456"},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"1167","DOI":"10.1109\/TITS.2012.2187517","article-title":"Gaze fixation system for the evaluation of driver distractions induced by IVIS","volume":"13","author":"Bergasa","year":"2012","journal-title":"IEEE Trans. Intell. Transp. Syst."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Khan, M.Q., and Lee, S. (2019). Gaze and Eye Tracking: Techniques and Applications in ADAS. Sensors, 19.","DOI":"10.3390\/s19245540"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"83","DOI":"10.1016\/j.cviu.2004.07.008","article-title":"Estimating the eye gaze from one eye","volume":"98","author":"Wang","year":"2005","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"663","DOI":"10.1016\/j.imavis.2005.06.001","article-title":"Eye tracking: Pupil orientation geometrical modeling","volume":"24","author":"Villanueva","year":"2006","journal-title":"Image Vis. Comput."},{"key":"ref_18","unstructured":"Beymer, D., and Flickner, M. (2003, January 18\u201320). Eye gaze tracking using an active stereo head. 
Proceedings of the 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Madison, WI, USA."},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Ohno, T., and Mukawa, N. (2004, January 26\u201328). A free-head, simple calibration, gaze tracking system that enables gaze-based interaction. Proceedings of the 2004 Symposium on Eye Tracking Research & Applications, Safety Harbor, FL, USA.","DOI":"10.1145\/968363.968387"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Meyer, A., B\u00f6hme, M., Martinetz, T., and Barth, E. (2006, January 19\u201321). A single-camera remote eye tracker. Proceedings of the International Tutorial and Research Workshop on Perception and Interactive Technologies for Speech-Based Systems, Kloster Irsee, Germany.","DOI":"10.1007\/11768029_25"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"155","DOI":"10.1016\/j.cviu.2004.07.013","article-title":"Eye tracking in the wild","volume":"98","author":"Hansen","year":"2005","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_22","unstructured":"Hansen, D.W., Hansen, J.P., Nielsen, M., Johansen, A.S., and Stegmann, M.B. (2002, January 3\u20134). Eye typing using Markov and active appearance models. Proceedings of the Sixth IEEE Workshop on Applications of Computer Vision, Orlando, FL, USA."},{"key":"ref_23","unstructured":"Brolly, X.L., and Mulligan, J.B. (July, January 27). Implicit calibration of a remote gaze tracker. Proceedings of the 2004 Conference on Computer Vision and Pattern Recognition Workshop, Washington, DC, USA."},{"key":"ref_24","unstructured":"Ebisawa, Y., and Satoh, S.I. (1993, January 28\u201331). Effectiveness of pupil area detection technique using two light sources and image difference method. 
Proceedings of the 15th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA."},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Bin Suhaimi, M.S.A., Matsushita, K., Sasaki, M., and Njeri, W. (2019). 24-Gaze-Point Calibration Method for Improving the Precision of AC-EOG Gaze Estimation. Sensors, 19.","DOI":"10.3390\/s19173650"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"357","DOI":"10.1006\/rtim.2002.0279","article-title":"Real-time eye, gaze, and face pose tracking for monitoring driver vigilance","volume":"8","author":"Ji","year":"2002","journal-title":"Real-Time Imaging"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"4","DOI":"10.1016\/j.cviu.2004.07.010","article-title":"Eye gaze tracking techniques for interactive applications","volume":"98","author":"Morimoto","year":"2005","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_28","first-page":"1","article-title":"Human eye tracking and related issues: A review","volume":"2","author":"Singh","year":"2012","journal-title":"Int. J. Sci. Res. Publ."},{"key":"ref_29","unstructured":"Papoutsaki, A., Sangkloy, P., Laskey, J., Daskalova, N., Huang, J., and Hays, J. (2016, January 9\u201315). WebGazer: Scalable Webcam Eye Tracking Using User Interactions. Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI 2016), New York, NY, USA."},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Wood, E., and Bulling, A. (2014, January 26\u201328). Eyetab: Model-based gaze estimation on unmodified tablet computers. Proceedings of the Symposium on Eye Tracking Research and Applications, Safety Harbor, FL, USA.","DOI":"10.1145\/2578153.2578185"},{"key":"ref_31","unstructured":"(2021, August 21). OKAO\u2122 Vision: Technology. 
Available online: https:\/\/plus-sensing.omron.com\/technology\/."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"1720","DOI":"10.1109\/TPAMI.2018.2845370","article-title":"Predicting the Driver\u2019s Focus of Attention: The DR(eye)VE Project","volume":"41","author":"Palazzi","year":"2018","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_33","unstructured":"Quigley, M., Conley, K., Gerkey, B., Faust, J., Foote, T., Leibs, J., Wheeler, R., and Ng, A.Y. (2009, January 12\u201316). ROS: An open-source Robot Operating System. Proceedings of the ICRA Workshop on Open Source Software, Kobe, Japan."},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"162","DOI":"10.1109\/TPAMI.2017.2778103","article-title":"Mpiigaze: Real-world dataset and deep appearance-based gaze estimation","volume":"41","author":"Zhang","year":"2017","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"1013","DOI":"10.1080\/00207178908559683","article-title":"Representations of non-linear systems: The NARMAX model","volume":"49","author":"Chen","year":"1989","journal-title":"Int. J. Control"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Billings, S.A. (2013). Nonlinear System Identification: NARMAX Methods in the Time, Frequency, and Spatio-Temporal Domains, John Wiley & Sons.","DOI":"10.1002\/9781118535561"},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"1559","DOI":"10.1080\/00207728808964057","article-title":"Identification of non-linear output-affine systems using an orthogonal least-squares algorithm","volume":"19","author":"Billings","year":"1988","journal-title":"Int. J. Syst. Sci."},{"key":"ref_38","unstructured":"Alletto, S., Palazzi, A., Solera, F., Calderara, S., and Cucchiara, R. (July, January 26). DR(eye)VE: A dataset for attention-based tasks with applications to autonomous and assisted driving. 
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, Las Vegas, NV, USA."},{"key":"ref_39","unstructured":"Dosovitskiy, A., Ros, G., Codevilla, F., Lopez, A., and Koltun, V. (2017, January 13\u201315). CARLA: An Open Urban Driving Simulator. Proceedings of the 1st Annual Conference on Robot Learning, Mountain View, CA, USA."}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/21\/18\/6262\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T07:01:50Z","timestamp":1760166110000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/21\/18\/6262"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,9,18]]},"references-count":39,"journal-issue":{"issue":"18","published-online":{"date-parts":[[2021,9]]}},"alternative-id":["s21186262"],"URL":"https:\/\/doi.org\/10.3390\/s21186262","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,9,18]]}}}