{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,8]],"date-time":"2026-03-08T00:54:48Z","timestamp":1772931288393,"version":"3.50.1"},"reference-count":31,"publisher":"MDPI AG","issue":"2","license":[{"start":{"date-parts":[[2014,1,27]],"date-time":"2014-01-27T00:00:00Z","timestamp":1390780800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/3.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>Conventional gaze tracking systems are limited in cases where the user is wearing glasses because the glasses usually produce noise due to reflections caused by the gaze tracker\u2019s lights. This makes it difficult to locate the pupil and the specular reflections (SRs) from the cornea of the user\u2019s eye. These difficulties increase the likelihood of gaze detection errors because the gaze position is estimated based on the location of the pupil center and the positions of the corneal SRs. In order to overcome these problems, we propose a new gaze tracking method that can be used by subjects who are wearing glasses. Our research is novel in the following four ways: first, we construct a new control device for the illuminator, which includes four illuminators that are positioned at the four corners of a monitor. Second, our system automatically determines whether a user is wearing glasses or not in the initial stage by counting the number of white pixels in an image that is captured using the low exposure setting on the camera. Third, if it is determined that the user is wearing glasses, the four illuminators are turned on and off sequentially in order to obtain an image that has a minimal amount of noise due to reflections from the glasses. As a result, it is possible to avoid the reflections and accurately locate the pupil center and the positions of the four corneal SRs. 
Fourth, by turning off one of the four illuminators, only three corneal SRs exist in the captured image. Since the proposed gaze detection method requires four corneal SRs for calculating the gaze position, the unseen SR position is estimated from the parallelogram shape defined by the three SR positions, and the gaze position is then calculated. Experimental results showed that the average gaze detection error with 20 persons was about 0.70\u00b0 and the processing time was 63.72 ms per frame.<\/jats:p>","DOI":"10.3390\/s140202110","type":"journal-article","created":{"date-parts":[[2014,1,28]],"date-time":"2014-01-28T03:27:01Z","timestamp":1390879621000},"page":"2110-2134","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":16,"title":["Gaze Tracking System for User Wearing Glasses"],"prefix":"10.3390","volume":"14","author":[{"given":"Su","family":"Gwon","sequence":"first","affiliation":[{"name":"Division of Electronics and Electrical Engineering, Dongguk University, 26 Pil-dong 3-ga, Jung-gu, Seoul 100-715, Korea"}]},{"given":"Chul","family":"Cho","sequence":"additional","affiliation":[{"name":"Division of Electronics and Electrical Engineering, Dongguk University, 26 Pil-dong 3-ga, Jung-gu, Seoul 100-715, Korea"}]},{"given":"Hyeon","family":"Lee","sequence":"additional","affiliation":[{"name":"Division of Electronics and Electrical Engineering, Dongguk University, 26 Pil-dong 3-ga, Jung-gu, Seoul 100-715, Korea"}]},{"given":"Won","family":"Lee","sequence":"additional","affiliation":[{"name":"Division of Electronics and Electrical Engineering, Dongguk University, 26 Pil-dong 3-ga, Jung-gu, Seoul 100-715, Korea"}]},{"given":"Kang","family":"Park","sequence":"additional","affiliation":[{"name":"Division of Electronics and Electrical Engineering, Dongguk University, 26 Pil-dong 3-ga, Jung-gu, Seoul 100-715, 
Korea"}]}],"member":"1968","published-online":{"date-parts":[[2014,1,27]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"60","DOI":"10.1145\/1897816.1897838","article-title":"Vision-based hand-gesture applications","volume":"54","author":"Wachs","year":"2011","journal-title":"Commun. ACM"},{"key":"ref_2","unstructured":"Ren, Z., Yuan, J., and Zhang, Z. (December, January 28). Robust Hand Gesture Recognition Based on Finger-Earth Mover's Distance with a Commodity Depth Camera. Scottsdale, AZ, USA."},{"key":"ref_3","first-page":"26","article-title":"High security human recognition system using iris images","volume":"1","author":"Prashanth","year":"2010","journal-title":"Aceee Int. J. Signal Image Process."},{"key":"ref_4","unstructured":"Bhowmik, M.K., Saha, K., Majumder, S., Majumder, G., Saha, A., Sarma, A.N., Bhattacharjee, D., Basu, D.K., and Nasipuri, M. (2011). Reviews, Refinements and New Ideas in Face Recognition, Intech."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"476","DOI":"10.1016\/j.cviu.2010.11.013","article-title":"A wearable gaze tracking system for children in unconstrained environments","volume":"115","author":"Noris","year":"2011","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"1830","DOI":"10.1111\/j.1467-8659.2010.01651.x","article-title":"Using a visual attention model to improve gaze tracking systems in interactive 3D applications","volume":"29","author":"Hillaire","year":"2010","journal-title":"Comput. Graph. Forum"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"2577","DOI":"10.1109\/TCE.2010.5681143","article-title":"Gaze tracking system at a distance for controlling IPTV","volume":"56","author":"Lee","year":"2010","journal-title":"IEEE Trans. Consum. 
Electron."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"52","DOI":"10.1016\/j.cviu.2004.07.005","article-title":"A non-contact device for tracking gaze in a human computer interface","volume":"98","author":"Noureddin","year":"2005","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"790","DOI":"10.1109\/TBME.2008.2005943","article-title":"Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions","volume":"56","author":"Hennessey","year":"2009","journal-title":"IEEE Trans. Biomed. Eng."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"1123","DOI":"10.1109\/TSMCB.2008.926606","article-title":"A novel gaze estimation system with one calibration point","volume":"38","author":"Villanueva","year":"2008","journal-title":"IEEE Trans. Syst. Man Cybern. Part B"},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"1124","DOI":"10.1109\/TBME.2005.863952","article-title":"General theory of remote gaze estimation using the pupil center and corneal reflections","volume":"53","author":"Guestrin","year":"2006","journal-title":"IEEE Trans. Biomed. Eng."},{"key":"ref_12","first-page":"231","article-title":"Robust feature extraction for non-contact gaze tracking with eyeglasses","volume":"22","author":"Ying","year":"2013","journal-title":"Chin. J. Electron."},{"key":"ref_13","unstructured":"Yang, C., Sun, J., Liu, J., Yang, X., Wang, D., and Liu, W. A Gray Difference-Based Pre-Processing for Gaze Tracking. Beijing, China."},{"key":"ref_14","unstructured":"Ohtani, M., and Ebisawa, Y. (1995, January 20\u201323). Eye-Gaze Detection Based on the Pupil Detection Technique Using Two Light Sources and the Image Difference Method. 
Montreal, Canada."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"357","DOI":"10.1006\/rtim.2002.0279","article-title":"Real-time eye, gaze, and face pose tracking for monitoring driver vigilance","volume":"8","author":"Ji","year":"2002","journal-title":"Real-Time Imaging"},{"key":"ref_16","unstructured":"B\u00f6hme, M., Meyer, A., Martinetz, T., and Barth, E. (2006, January 4\u20135). Remote Eye Tracking: State of the Art and Directions for Future Development. Turin, Italy."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"3454","DOI":"10.3390\/s130303454","article-title":"Enhanced perception of user intention by combining EEG and gaze-tracking for brain-computer interfaces (BCIs)","volume":"13","author":"Choi","year":"2013","journal-title":"Sensors"},{"key":"ref_18","unstructured":"Products for USB Sensing and Control. Available online: http:\/\/www.phidgets.com\/."},{"key":"ref_19","unstructured":"Wu, B., Ai, H., and Liu, R. (2004, January 23\u201326). Glasses Detection by Boosting Simple Wavelet Features. Cambridge, UK."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"137","DOI":"10.1023\/B:VISI.0000013087.49260.fb","article-title":"Robust real-time face detection","volume":"57","author":"Viola","year":"2004","journal-title":"Int. J. Comput. Vis."},{"key":"ref_21","unstructured":"OpenCV API Reference. Available online: http:\/\/docs.opencv.org\/2.4.2\/modules\/refman.html."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"127202:1","DOI":"10.1117\/1.3275453","article-title":"Robust gaze-tracking method by using frontal-viewing and eye-tracking cameras","volume":"48","author":"Cho","year":"2009","journal-title":"Opt. 
Eng."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"542","DOI":"10.4218\/etrij.12.0111.0193","article-title":"Gaze detection by wearable eye-tracking and NIR LED-based head-tracking device based on SVR","volume":"34","author":"Cho","year":"2012","journal-title":"ETRI J."},{"key":"ref_24","unstructured":"Gonzalez, R.C., and Woods, R.E. (2002). Digital Image Processing, Prentice-Hall. [2nd ed.]."},{"key":"ref_25","unstructured":"Fitzgibbon, A.W., and Fisher, R.B. (, January September,). A Buyer's Guide to Conic Fitting. Birmingham, UK."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"736","DOI":"10.1016\/j.optlaseng.2011.12.001","article-title":"3D gaze tracking method using Purkinje images on eye optical model and pupil","volume":"50","author":"Lee","year":"2012","journal-title":"Opt. Lasers Eng."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1167\/6.1.1","article-title":"The human eye is an example of robust optical design","volume":"6","author":"Artal","year":"2006","journal-title":"J. Vis."},{"key":"ref_28","unstructured":"Independent samples t-test. Available online: http:\/\/www.medcalc.org\/manual\/ttest.php."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"3963","DOI":"10.1080\/03610928908830135","article-title":"The two-sample t-test versus satterthwaite's approximate f-test","volume":"18","author":"Moser","year":"1989","journal-title":"Commun. Stat. Theory Methods"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"25","DOI":"10.1016\/j.cviu.2004.07.011","article-title":"A novel non-intrusive eye gaze estimation using cross-ratio under large head motion","volume":"98","author":"Yoo","year":"2005","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_31","unstructured":"Tobii TX300 Eye Tracker. 
Available online: http:\/\/www.tobii.com\/Global\/Analysis\/Marketing\/Brochures\/ProductBrochures\/Tobii_TX300_Brochure.pdf."}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/14\/2\/2110\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T21:07:42Z","timestamp":1760216862000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/14\/2\/2110"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2014,1,27]]},"references-count":31,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2014,2]]}},"alternative-id":["s140202110"],"URL":"https:\/\/doi.org\/10.3390\/s140202110","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2014,1,27]]}}}
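The abstract above describes recovering the unseen corneal specular reflection (SR) from the parallelogram shape defined by the three detected SRs. The paper's exact formulation is not given in this record; the sketch below assumes the three detected SRs are three corners of a parallelogram and the missing SR is the fourth corner, with `estimate_missing_sr` and its point ordering being hypothetical names chosen for illustration.

```python
# Illustrative sketch (not the authors' implementation): recover the
# fourth corner of a parallelogram from three known corners.

def estimate_missing_sr(p_a, p_b, p_c):
    """Return the fourth parallelogram corner opposite p_b.

    p_a, p_b, p_c: (x, y) image coordinates of the three detected SR
    centers, ordered so that p_b is adjacent to both p_a and p_c.
    In a parallelogram the diagonals bisect each other, so the corner
    opposite p_b is p_a + p_c - p_b.
    """
    return (p_a[0] + p_c[0] - p_b[0], p_a[1] + p_c[1] - p_b[1])

# Example: three SRs at corners of an axis-aligned rectangle (a special
# case of a parallelogram); the top-right SR was lost when its
# illuminator was turned off.
missing = estimate_missing_sr((100, 50), (100, 90), (180, 90))
print(missing)  # (180, 50)
```

With all four SR positions available again, the gaze position can be computed exactly as in the four-illuminator case.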