{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,8,12]],"date-time":"2024-08-12T14:30:00Z","timestamp":1723473000021},"reference-count":18,"publisher":"IGI Global","issue":"1","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2014,1,1]]},"abstract":"<p>Eye tracking reveals a person's state of mind. Thus, representing personal cognitive states using eye tracking leads to objective evaluations of these states, and this representation can be applied to various application fields. In this paper, the authors focus on the cognitive distraction state as a cognitive state, and the authors propose a model that evaluates personal cognitive distraction. The model takes as input eye tracking data and outputs the degree of personal cognitive distraction. The authors use a simple recurrent neural network, which is a type of neural network, to build the proposed model. In addition, the authors apply the proposed model to eye tracking for a person driving a car.<\/p>","DOI":"10.4018\/ijssci.2014010101","type":"journal-article","created":{"date-parts":[[2014,9,8]],"date-time":"2014-09-08T13:15:36Z","timestamp":1410182136000},"page":"1-16","source":"Crossref","is-referenced-by-count":25,"title":["Evaluation Model of Cognitive Distraction State Based on Eye Tracking Data Using Neural Networks"],"prefix":"10.4018","volume":"6","author":[{"given":"Taku","family":"Harada","sequence":"first","affiliation":[{"name":"Tokyo University of Science, Chiba, Japan"}]},{"given":"Hirotoshi","family":"Iwasaki","sequence":"additional","affiliation":[{"name":"Denso IT Laboratory, Inc., Tokyo, Japan"}]},{"given":"Kazuaki","family":"Mori","sequence":"additional","affiliation":[{"name":"Tokyo University of Science, Chiba, Japan"}]},{"given":"Akira","family":"Yoshizawa","sequence":"additional","affiliation":[{"name":"Denso IT Laboratory, Inc., Tokyo, Japan"}]},{"given":"Fumio","family":"Mizoguchi","sequence":"additional","affiliation":[{"name":"Tokyo University of Science, Chiba, Japan"}]}],"member":"2432","reference":[{"key":"ijssci.2014010101-0","unstructured":"Akiyama, T., Inagaki, T., Furukawa, H., & Itoh, M. (2005). Eye movement analysis for detecting driver\u2019s inattentiveness. Human Interface, 345-350."},{"key":"ijssci.2014010101-1","doi-asserted-by":"publisher","DOI":"10.1016\/j.learninstruc.2009.02.015"},{"key":"ijssci.2014010101-2","doi-asserted-by":"publisher","DOI":"10.1016\/j.knosys.2007.04.010"},{"key":"ijssci.2014010101-3","doi-asserted-by":"publisher","DOI":"10.1207\/s15516709cog1402_1"},{"key":"ijssci.2014010101-4","article-title":"Modeling the cognitive states based on eye tracking using neural networks","author":"T.Harada","year":"2012","journal-title":"Proc. of the technical meetings of Electronics, Information and Systems (Division C)"},{"key":"ijssci.2014010101-5","unstructured":"Kurogawa, K., Okuda, H., Inagaki, S., Suzuki, T., & Hayakawa, S. (2009). The estimation of the potential risk recognition by driver\u2019s eye movement. In Proc. of SSI2009 (The Society of Instrument and Control Engineers) (pp. 15-18)."},{"key":"ijssci.2014010101-6","doi-asserted-by":"publisher","DOI":"10.1109\/TITS.2007.895298"},{"key":"ijssci.2014010101-7","doi-asserted-by":"publisher","DOI":"10.1007\/s11257-010-9077-1"},{"key":"ijssci.2014010101-8","doi-asserted-by":"publisher","DOI":"10.1016\/j.trf.2011.04.004"},{"key":"ijssci.2014010101-9","article-title":"Identifying driver\u2019s cognitive load using inductive logic programming.","author":"F.Mizoguchi","year":"2012","journal-title":"Proceedings of the 22nd International Conference on Inductive Logic Programming"},{"issue":"6","key":"ijssci.2014010101-10","first-page":"967","article-title":"A feasibility study of driver\u2019s cognitive process estimation from driving behavior.","volume":"125","author":"K.Mizutani","year":"2005","journal-title":"The Transactions of the Institute of Electrical Engineers of Japan"},{"key":"ijssci.2014010101-11","doi-asserted-by":"publisher","DOI":"10.1007\/s00221-011-2866-x"},{"key":"ijssci.2014010101-12","doi-asserted-by":"publisher","DOI":"10.4018\/jssci.2011100102"},{"key":"ijssci.2014010101-13","unstructured":"Toda, K., Nakamichi, N., Shima, K., Ohira, M., Sakai, M., & Matsumoto, K. (2005). An information exploration model based on eye movements during browsing web pages. IPSJ SIG Technical Report, 35-42."},{"key":"ijssci.2014010101-14","first-page":"216","article-title":"Quantifying driver\u2019s mental workload using saccadic eye movements.","author":"S.Tokuda","year":"2011","journal-title":"Proc. of The 54th Japan Automatic Control Conference"},{"key":"ijssci.2014010101-15","unstructured":"Tomanek, K., Hahn, U., Lohmann, S., & Ziegler, J. (2010). A cognitive cost model of annotations based on eye-tracking data. In ACL \u201910 Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics (pp. 1158-1167)."},{"key":"ijssci.2014010101-16","doi-asserted-by":"publisher","DOI":"10.1109\/IVS.2010.5547957"},{"key":"ijssci.2014010101-17","doi-asserted-by":"publisher","DOI":"10.1007\/s11768-010-8043-0"}],"container-title":["International Journal of Software Science and Computational Intelligence"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.igi-global.com\/viewtitle.aspx?TitleId=114093","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2022,6,2]],"date-time":"2022-06-02T02:33:05Z","timestamp":1654137185000},"score":1,"resource":{"primary":{"URL":"https:\/\/services.igi-global.com\/resolvedoi\/resolve.aspx?doi=10.4018\/ijssci.2014010101"}},"subtitle":[""],"short-title":[],"issued":{"date-parts":[[2014,1,1]]},"references-count":18,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2014,1]]}},"URL":"https:\/\/doi.org\/10.4018\/ijssci.2014010101","relation":{},"ISSN":["1942-9045","1942-9037"],"issn-type":[{"value":"1942-9045","type":"print"},{"value":"1942-9037","type":"electronic"}],"subject":[],"published":{"date-parts":[[2014,1,1]]}}}