{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,12]],"date-time":"2025-10-12T04:32:03Z","timestamp":1760243523893,"version":"build-2065373602"},"reference-count":32,"publisher":"MDPI AG","issue":"8","license":[{"start":{"date-parts":[[2013,8,16]],"date-time":"2013-08-16T00:00:00Z","timestamp":1376611200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/3.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>Most conventional gaze-tracking systems require that users look at many points during the initial calibration stage, which is inconvenient for them. To avoid this requirement, we propose a new gaze-tracking method with four important characteristics. First, our gaze-tracking system uses a large screen located at a distance from the user, who wears a lightweight device. Second, our system requires that users look at only four calibration points during the initial calibration stage, during which four pupil centers are noted. Third, five additional points (virtual pupil centers) are generated with a multilayer perceptron using the four actual points (detected pupil centers) as inputs. Fourth, when a user gazes at a large screen, the shape defined by the positions of the four pupil centers is a distorted quadrangle because of the nonlinear movement of the human eyeball. The  gaze-detection accuracy is reduced if we map the pupil movement area onto the screen area using a single transform function. We overcame this problem by calculating the gaze position based on multi-geometric transforms using the five virtual points and the four actual points. 
Experimental results show that the accuracy of the proposed method is better than that of other methods.<\/jats:p>","DOI":"10.3390\/s130810802","type":"journal-article","created":{"date-parts":[[2013,8,16]],"date-time":"2013-08-16T12:14:49Z","timestamp":1376655289000},"page":"10802-10822","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":14,"title":["A Novel Gaze Tracking Method Based on the Generation of Virtual Calibration Points"],"prefix":"10.3390","volume":"13","author":[{"given":"Ji","family":"Lee","sequence":"first","affiliation":[{"name":"Division of Electronics and Electrical Engineering, Dongguk University, 26 Pil-dong 3-ga, Jung-gu, Seoul 100-715, Korea"}]},{"given":"Hwan","family":"Heo","sequence":"additional","affiliation":[{"name":"Division of Electronics and Electrical Engineering, Dongguk University, 26 Pil-dong 3-ga, Jung-gu, Seoul 100-715, Korea"}]},{"given":"Kang","family":"Park","sequence":"additional","affiliation":[{"name":"Division of Electronics and Electrical Engineering, Dongguk University, 26 Pil-dong 3-ga, Jung-gu, Seoul 100-715, Korea"}]}],"member":"1968","published-online":{"date-parts":[[2013,8,16]]},"reference":[{"key":"ref_1","unstructured":"Ishii, H., Okada, Y., Shimoda, H., and Yoshikawa, H. (2002, January 5\u20137). Construction of the Measurement System and its Experimental Study for Diagnosing Cerebral Functional Disorders Using Eye-Sensing HMD. Osaka, Japan."},{"key":"ref_2","unstructured":"Aotake, Y., Sasai, H., Ozawa, T., Fukushima, S., Shimoda, H., and Yoshikawa, H. (1999, January 12\u201315). A New Adaptive CAI System Based on Bio-Informatic Sensing: Study on Real-Time Method of Analyzing Ocular Information by Using Eye-Sensing HMD and Method of Adaptive CAI System Configuration. 
Tokyo, Japan."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"333","DOI":"10.1016\/S0143-8166(02)00034-9","article-title":"An eye behavior measuring device for VR system","volume":"38","author":"Lin","year":"2002","journal-title":"Opt. Laser Eng."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"127202-1","DOI":"10.1117\/1.3275453","article-title":"Robust gaze-tracking method using frontal-viewing and eye-tracking cameras","volume":"48","author":"Cho","year":"2009","journal-title":"Opt. Eng."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"1474","DOI":"10.1016\/j.patrec.2008.02.026","article-title":"A robust gaze detection method by compensating for facial movements based on corneal specularities","volume":"29","author":"Ko","year":"2008","journal-title":"Pattern Recognit. Lett."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"1646","DOI":"10.1109\/TCE.2011.6131137","article-title":"New computer interface combining gaze tracking and brainwave measurements","volume":"57","author":"Bang","year":"2011","journal-title":"IEEE Trans. Consum. Electron."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"25","DOI":"10.1016\/j.cviu.2004.07.011","article-title":"A novel non-intrusive eye gaze estimation using cross-ratio under large head motion","volume":"98","author":"Yoo","year":"2005","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"332","DOI":"10.1109\/TSMCB.2002.999809","article-title":"Study on eye gaze estimation","volume":"32","author":"Wang","year":"2002","journal-title":"IEEE Trans. Syst., Man, Cybern. B"},{"key":"ref_9","unstructured":"Murphy-Chutorian, E., Doshi, A., and Trivedi, M.M. (October, January 30). Head Pose Estimation for Driver Assistance Systems: A Robust Algorithm and Experimental Evaluation. 
Washington, USA."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"542","DOI":"10.4218\/etrij.12.0111.0193","article-title":"Gaze Detection by Wearable Eye-Tracking and NIR LED-Based Head-Tracking Device Based on SVR","volume":"34","author":"Cho","year":"2012","journal-title":"ETRI J."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"736","DOI":"10.1016\/j.optlaseng.2011.12.001","article-title":"3D gaze tracking method using purkinje images on eye optical model and pupil","volume":"50","author":"Lee","year":"2012","journal-title":"Opt. Laser Eng."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"1031","DOI":"10.1109\/TBME.2009.2039351","article-title":"An automatic personal calibration procedure for advanced gaze estimation systems","volume":"57","author":"Model","year":"2010","journal-title":"IEEE Trans. Biomed. Eng."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"234","DOI":"10.1109\/TSMCB.2003.811128","article-title":"A novel approach to 3-D gaze tracking using stereo cameras","volume":"34","author":"Shih","year":"2004","journal-title":"IEEE Trans. Syst., Man, Cybern. B"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Ohno, T., Mukawa, N., and Yoshikawa, A. (2002, January 25\u201327). FreeGaze: A Gaze Tracking System for Everyday Gaze Interaction. New Orleans, LA, USA.","DOI":"10.1145\/507072.507098"},{"key":"ref_15","unstructured":"Kondou, Y., and Ebisawa, Y. (2008, January 14\u201316). Easy Eye-Gaze Calibration Using a Moving Visual Target in the Head-Free Remote Eye-Gaze Detection System. Istanbul, Turkey."},{"key":"ref_16","unstructured":"Zhu, J., and Yang, J. (2002, January 20\u201321). Subpixel Eye Gaze Tracking. Washington, DC, USA."},{"key":"ref_17","unstructured":"Yang, X., Sun, J., Liu, J., Chu, J., Liu, W., and Gao, Y. (, January 7\u2013July). A Gaze Tracking Scheme for Eye-Based Intelligent Control. Jinan, China."},{"key":"ref_18","unstructured":"Colombo, C., Andronico, S., and Dario, P. 
(1995, January 5\u20139). Prototype of a Vision-Based Gaze-Driven Man-Machine Interface. Pittsburgh, PA, USA."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"359","DOI":"10.1016\/S0921-8890(96)00062-0","article-title":"Interacting through eyes","volume":"19","author":"Colombo","year":"1997","journal-title":"Robot. Auton. Syst."},{"key":"ref_20","unstructured":"Morimoto, C.H., Koons, D., Amir, A., Flickner, M., and Zhai, S. (1999, January 17\u201320). Keeping an Eye for HCI. In Proceedings of the XII Brazilian Symposium on Computer Graphics and Image Processing. Campinas, SP, Brazil."},{"key":"ref_21","unstructured":"Mimica, M.R.M., and Morimoto, C.H. (2003, January 12\u201315). A Computer Vision Framework for Eye Gaze Tracking. Brazil."},{"key":"ref_22","unstructured":"Agustin, J.S., Skovsgaard, H., Mollenbach, E., Barret, M., Tall, M., Hansen, D.W., and Hansen, J.P. (2010, January 21\u201323). Evaluation of a Low-Cost Open-Source Gaze Tracker. Austin, TX, USA."},{"key":"ref_23","unstructured":"Sugioka, A., Ebisawa, Y., and Ohtani, M. (November, January 31). Noncontact Video-Based Eye-Gaze Detection Method Allowing Large Head Displacements. Amsterdam, Netherlands."},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"25","DOI":"10.1007\/s10055-010-0171-9","article-title":"An integrated head pose and eye gaze tracking approach to non-intrusive visual attention measurement for wide FOV simulators","volume":"16","author":"Cai","year":"2012","journal-title":"Virtual Real."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"476","DOI":"10.1016\/j.cviu.2010.11.013","article-title":"A wearable gaze tracking system for children in unconstrained environments","volume":"115","author":"Noris","year":"2011","journal-title":"Comput. Vis. 
Image Underst."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"620","DOI":"10.1016\/j.concog.2006.01.001","article-title":"The effects of eye movements, age, and expertise on inattentional blindness","volume":"15","author":"Memmert","year":"2006","journal-title":"Conscious. Cogn."},{"key":"ref_27","unstructured":"Webcam C600. Available online: http:\/\/www.logitech.com\/en-us\/support\/5869?crid=405&osid=14&bit=32."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"6478","DOI":"10.1167\/iovs.10-6423","article-title":"Experimental investigations of pupil accommodation factors","volume":"52","author":"Lee","year":"2011","journal-title":"Investig. Ophthalmol. Vis. Sci."},{"key":"ref_29","unstructured":"Jain, R., Kasturi, R., and Schunck, B.G. (1995). Machine Vision, McGraw-Hill. [International ed.]."},{"key":"ref_30","unstructured":"Gonzalez, R.C., and Woods, R.E. (2002). Digital Image Processing, Prentice-Hall. [2nd ed.]."},{"key":"ref_31","unstructured":"Freeman, J.A., and Skapura, D.M. (1991). Neural Networks: Algorithms, Applications, and Programming Techniques, Addison-Wesley Publishing Company. [International ed.]."},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Hartley, R., and Zisserman, A. (2004). Multiple View Geometry in Computer Vision, Cambridge University Press. 
[2nd ed.].","DOI":"10.1017\/CBO9780511811685"}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/13\/8\/10802\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T21:48:41Z","timestamp":1760219321000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/13\/8\/10802"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2013,8,16]]},"references-count":32,"journal-issue":{"issue":"8","published-online":{"date-parts":[[2013,8]]}},"alternative-id":["s130810802"],"URL":"https:\/\/doi.org\/10.3390\/s130810802","relation":{},"ISSN":["1424-8220"],"issn-type":[{"type":"electronic","value":"1424-8220"}],"subject":[],"published":{"date-parts":[[2013,8,16]]}}}