{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,1]],"date-time":"2026-02-01T05:12:23Z","timestamp":1769922743065,"version":"3.49.0"},"reference-count":35,"publisher":"MDPI AG","issue":"6","license":[{"start":{"date-parts":[[2022,3,17]],"date-time":"2022-03-17T00:00:00Z","timestamp":1647475200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"Research Project of China Disabled Persons\u2019 Federation - on Assistive Technology","award":["2021CDPFAT-09"],"award-info":[{"award-number":["2021CDPFAT-09"]}]},{"DOI":"10.13039\/501100018617","name":"Liaoning Revitalization Talents Program","doi-asserted-by":"publisher","award":["XLYC1908007"],"award-info":[{"award-number":["XLYC1908007"]}],"id":[{"id":"10.13039\/501100018617","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100017683","name":"Dalian Science and Technology Innovation Fund","doi-asserted-by":"publisher","award":["2019J11CY001"],"award-info":[{"award-number":["2019J11CY001"]}],"id":[{"id":"10.13039\/501100017683","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100017683","name":"Dalian Science and Technology Innovation Fund","doi-asserted-by":"publisher","award":["2021JJ12GX028"],"award-info":[{"award-number":["2021JJ12GX028"]}],"id":[{"id":"10.13039\/501100017683","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>The human eye gaze plays a vital role in monitoring people\u2019s attention, and various efforts have been made to improve in-vehicle driver gaze tracking systems. Most of them build the specific gaze estimation model by pre-annotated data training in an offline way. 
These systems usually generalize poorly during online gaze prediction because of the estimation bias between the training domain and the deployment domain, which shifts the predicted gaze points away from their correct locations. To solve this problem, a novel driver\u2019s eye gaze tracking method with non-linear gaze point refinement is proposed for a monitoring system using two cameras; it eliminates the estimation bias and implicitly fine-tunes the gaze points. Supported by a two-stage gaze point clustering algorithm, the non-linear gaze point refinement method gradually extracts the representative gaze points of the forward and mirror gaze zones and establishes a non-linear gaze point re-mapping relationship. In addition, an Unscented Kalman filter is utilized to track the driver\u2019s continuous status features. Experimental results show that the non-linear gaze point refinement method outperforms several previous gaze calibration and gaze mapping methods, and improves gaze estimation accuracy even in cross-subject evaluation. 
The system can be used for predicting the driver\u2019s attention.<\/jats:p>","DOI":"10.3390\/s22062326","type":"journal-article","created":{"date-parts":[[2022,3,20]],"date-time":"2022-03-20T21:37:17Z","timestamp":1647812237000},"page":"2326","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":14,"title":["Dual-Cameras-Based Driver\u2019s Eye Gaze Tracking System with Non-Linear Gaze Point Refinement"],"prefix":"10.3390","volume":"22","author":[{"given":"Yafei","family":"Wang","sequence":"first","affiliation":[{"name":"School of Information Science and Technology, Dalian Maritime University, Dalian 116026, China"}]},{"given":"Xueyan","family":"Ding","sequence":"additional","affiliation":[{"name":"School of Information Science and Technology, Dalian Maritime University, Dalian 116026, China"}]},{"given":"Guoliang","family":"Yuan","sequence":"additional","affiliation":[{"name":"School of Information Science and Technology, Dalian Maritime University, Dalian 116026, China"}]},{"given":"Xianping","family":"Fu","sequence":"additional","affiliation":[{"name":"School of Information Science and Technology, Dalian Maritime University, Dalian 116026, China"}]}],"member":"1968","published-online":{"date-parts":[[2022,3,17]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"596","DOI":"10.1109\/TITS.2010.2092770","article-title":"Driver inattention monitoring system for intelligent vehicles: A review","volume":"12","author":"Dong","year":"2010","journal-title":"IEEE Trans. Intell. Transp. Syst."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"3017","DOI":"10.1109\/TITS.2015.2462084","article-title":"Driver behavior analysis for safe driving: A survey","volume":"16","author":"Kaplan","year":"2015","journal-title":"IEEE Trans. Intell. Transp. 
Syst."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"2339","DOI":"10.1109\/TITS.2018.2868499","article-title":"Driver fatigue detection systems: A review","volume":"20","author":"Sikander","year":"2018","journal-title":"IEEE Trans. Intell. Transp. Syst."},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Khan, M.Q., and Lee, S. (2019). Gaze and eye tracking: Techniques and applications in ADAS. Sensors, 19.","DOI":"10.3390\/s19245540"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"4318","DOI":"10.1109\/TITS.2019.2939676","article-title":"A dual-cameras-based driver gaze mapping system with an application on non-driving activities monitoring","volume":"21","author":"Yang","year":"2019","journal-title":"IEEE Trans. Intell. Transp. Syst."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Dua, I., John, T.A., Gupta, R., and Jawahar, C. (January, January 24). DGAZE: Driver Gaze Mapping on Road. Proceedings of the 2020 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA.","DOI":"10.1109\/IROS45743.2020.9341782"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"303","DOI":"10.1109\/TITS.2012.2217377","article-title":"Automatic calibration method for driver\u2019s head orientation in natural driving environment","volume":"14","author":"Fu","year":"2012","journal-title":"IEEE Trans. Intell. Transp. Syst."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Yamashiro, K., Deguchi, D., Takahashi, T., Ide, I., Murase, H., Higuchi, K., and Naito, T. (2009, January 3\u20135). Automatic calibration of an in-vehicle gaze tracking system using driver\u2019s typical gaze behavior. 
Proceedings of the 2009 IEEE Intelligent Vehicles Symposium (IV), Xi\u2019an, China.","DOI":"10.1109\/IVS.2009.5164417"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"107630","DOI":"10.1016\/j.knosys.2021.107630","article-title":"Self-calibrated driver gaze estimation via gaze pattern learning","volume":"235","author":"Yuan","year":"2022","journal-title":"Knowl.-Based Syst."},{"key":"ref_10","unstructured":"Wang, J., Chai, W., Venkatachalapathy, A., Tan, K.L., Haghighat, A., Velipasalar, S., Adu-Gyamfi, Y., and Sharma, A. (2021). A Survey on Driver Behavior Analysis from In-Vehicle Cameras. IEEE Trans. Intell. Transp. Syst., 1\u201324."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Shehu, I.S., Wang, Y., Athuman, A.M., and Fu, X. (2021). Remote Eye Gaze Tracking Research: A Comparative Evaluation on Past and Recent Progress. Electronics, 10.","DOI":"10.37247\/PAELEC.1.22.12"},{"key":"ref_12","unstructured":"Wang, Y., Zhao, T., Ding, X., Bian, J., and Fu, X. (2017, January 13\u201316). Head pose-free eye gaze prediction for driver attention study. Proceedings of the 2017 IEEE International Conference on Big Data and Smart Computing (BigComp), Jeju, Korea."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"293","DOI":"10.1016\/j.knosys.2016.07.038","article-title":"Appearance-based gaze estimation using deep features and random forest regression","volume":"110","author":"Wang","year":"2016","journal-title":"Knowl.-Based Syst."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"41","DOI":"10.1016\/j.knosys.2017.10.010","article-title":"Learning a gaze estimator with neighbor selection from large-scale synthetic eye images","volume":"139","author":"Wang","year":"2018","journal-title":"Knowl.-Based Syst."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Tawari, A., and Trivedi, M.M. (2014, January 8\u201311). Robust and continuous estimation of driver gaze zone by dynamic analysis of multiple face videos. 
Proceedings of the 2014 IEEE Intelligent Vehicles Symposium (IV), Dearborn, MI, USA.","DOI":"10.1109\/IVS.2014.6856607"},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Jha, S., and Busso, C. (2016, January 1\u20134). Analyzing the relationship between head pose and gaze to model driver visual attention. Proceedings of the 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro, Brazil.","DOI":"10.1109\/ITSC.2016.7795905"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"818","DOI":"10.1109\/TITS.2014.2300870","article-title":"Continuous head movement estimator for driver assistance: Issues, algorithms, and on-road evaluations","volume":"15","author":"Tawari","year":"2014","journal-title":"IEEE Trans. Intell. Transp. Syst."},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Jha, S., and Busso, C. (2017, January 16\u201319). Probabilistic estimation of the driver\u2019s gaze from head orientation and position. Proceedings of the 2017 IEEE 20th International Conference on Intelligent Transportation Systems (ITSC), Yokohama, Japan.","DOI":"10.1109\/ITSC.2017.8317841"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Tawari, A., Chen, K.H., and Trivedi, M.M. (2014, January 8\u201311). Where is the driver looking: Analysis of head, eye and iris for robust gaze zone estimation. Proceedings of the 17th International IEEE Conference on Intelligent Transportation Systems (ITSC), Qingdao, China.","DOI":"10.1109\/ITSC.2014.6957817"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"2014","DOI":"10.1109\/TITS.2015.2396031","article-title":"Driver gaze tracking and eyes off the road detection system","volume":"16","author":"Vicente","year":"2015","journal-title":"IEEE Trans. Intell. Transp. Syst."},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Wang, Y., Yuan, G., Mi, Z., Peng, J., Ding, X., Liang, Z., and Fu, X. (2019). 
Continuous driver\u2019s gaze zone estimation using rgb-d camera. Sensors, 19.","DOI":"10.3390\/s19061287"},{"key":"ref_22","unstructured":"Jha, S., and Busso, C. (2020). Estimation of Driver\u2019s Gaze Region from Head Position and Orientation Using Probabilistic Confidence Regions. arXiv."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"2739","DOI":"10.1109\/TITS.2016.2526050","article-title":"Driver-gaze zone estimation using Bayesian filtering and Gaussian processes","volume":"17","author":"Lundgren","year":"2016","journal-title":"IEEE Trans. Intell. Transp. Syst."},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Yu, Z., Huang, X., Zhang, X., Shen, H., Li, Q., Deng, W., Tang, J., Yang, Y., and Ye, J. (2020, January 25\u201329). A Multi-Modal Approach for Driver Gaze Prediction to Remove Identity Bias. Proceedings of the 2020 International Conference on Multimodal Interaction, Online.","DOI":"10.1145\/3382507.3417961"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Lyu, K., Wang, M., and Meng, L. (2020, January 25\u201329). Extract the Gaze Multi-dimensional Information Analysis Driver Behavior. Proceedings of the 2020 International Conference on Multimodal Interaction, Virtual Event, The Netherlands.","DOI":"10.1145\/3382507.3417972"},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Lollett, C., Kamezaki, M., and Sugano, S. (2021, January 11\u201317). Towards a Driver\u2019s Gaze Zone Classifier using a Single Camera Robust to Temporal and Permanent Face Occlusions. Proceedings of the 2021 IEEE Intelligent Vehicles Symposium (IV), Nagoya, Japan.","DOI":"10.1109\/IV48863.2021.9575367"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"7155","DOI":"10.1007\/s11042-018-6490-7","article-title":"Driver\u2019s eye-based gaze tracking system by one-point calibration","volume":"78","author":"Yoon","year":"2019","journal-title":"Multimed. 
Tools Appl."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"980","DOI":"10.1109\/TITS.2015.2493451","article-title":"Detecting drivers\u2019 mirror-checking actions and its application to maneuver and secondary task recognition","volume":"17","author":"Li","year":"2015","journal-title":"IEEE Trans. Intell. Transp. Syst."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Xing, Y., Tang, J., Liu, H., Lv, C., Cao, D., Velenis, E., and Wang, F.Y. (2018, January 26\u201330). End-to-end driving activities and secondary tasks recognition using deep convolutional neural network and transfer learning. Proceedings of the 2018 IEEE Intelligent Vehicles Symposium (IV), Changshu, China.","DOI":"10.1109\/IVS.2018.8500548"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"5379","DOI":"10.1109\/TVT.2019.2908425","article-title":"Driver activity recognition for intelligent vehicles: A deep learning approach","volume":"68","author":"Xing","year":"2019","journal-title":"IEEE Trans. Veh. Technol."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"113240","DOI":"10.1016\/j.eswa.2020.113240","article-title":"Driver behavior detection and classification using deep convolutional neural networks","volume":"149","author":"Shahverdy","year":"2020","journal-title":"Expert Syst. Appl."},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Baltrusaitis, T., Zadeh, A., Lim, Y.C., and Morency, L.P. (2018, January 15\u201319). Openface 2.0: Facial behavior analysis toolkit. Proceedings of the 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG), Xi\u2019an, China.","DOI":"10.1109\/FG.2018.00019"},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Jha, S., and Busso, C. (2017, January 16\u201319). Challenges in head pose estimation of drivers in naturalistic recordings using existing tools. 
Proceedings of the 2017 IEEE 20th International Conference on Intelligent Transportation Systems (ITSC), Yokohama, Japan.","DOI":"10.1109\/ITSC.2017.8317870"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Araluce, J., Bergasa, L.M., Oca\u00f1a, M., L\u00f3pez-Guill\u00e9n, E., Revenga, P.A., Arango, J.F., and P\u00e9rez, O. (2021). Gaze Focalization System for Driving Applications Using OpenFace 2.0 Toolkit with NARMAX Algorithm in Accidental Scenarios. Sensors, 21.","DOI":"10.3390\/s21186262"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Shirpour, M., Beauchemin, S.S., and Bauer, M.A. (December, January 18). A probabilistic model for visual driver gaze approximation from head pose estimation. Proceedings of the 2020 IEEE 3rd Connected and Automated Vehicles Symposium (CAVS), Victoria, BC, Canada.","DOI":"10.1109\/CAVS51000.2020.9334636"}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/22\/6\/2326\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T22:38:10Z","timestamp":1760135890000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/22\/6\/2326"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,3,17]]},"references-count":35,"journal-issue":{"issue":"6","published-online":{"date-parts":[[2022,3]]}},"alternative-id":["s22062326"],"URL":"https:\/\/doi.org\/10.3390\/s22062326","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,3,17]]}}}