{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,6]],"date-time":"2025-10-06T09:30:51Z","timestamp":1759743051443},"reference-count":47,"publisher":"Walter de Gruyter GmbH","issue":"1","license":[{"start":{"date-parts":[[2019,1,1]],"date-time":"2019-01-01T00:00:00Z","timestamp":1546300800000},"content-version":"unspecified","delay-in-days":0,"URL":"http:\/\/creativecommons.org\/licenses\/by\/4.0"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2019,1,1]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>Measuring emotions non-intrusively via affective computing provides a promising source of information for adaptive learning and intelligent tutoring systems. Using non-intrusive, simultaneous measures of emotions, such systems could steadily adapt to students emotional states. One drawback, however, is the lack of evidence on how such modern measures of emotions relate to traditional self-reports. The aim of this study was to compare a prominent area of affective computing, facial emotion recognition, to students\u2019 self-reports of interest, boredom, and valence. We analyzed different types of aggregation of the simultaneous facial emotion recognition estimates and compared them to self-reports after reading a text. Analyses of 103 students revealed no relationship between the aggregated facial emotion recognition estimates of the software FaceReader and self-reports. Irrespective of different types of aggregation of the facial emotion recognition estimates, neither the epistemic emotions (<jats:italic>i.e.<\/jats:italic>, boredom and interest), nor the estimates of valence predicted the respective self-report measure. We conclude that assumptions on the subjective experience of emotions cannot necessarily be transferred to other emotional components, such as estimated by affective computing. 
We advise waiting for more comprehensive evidence on the <jats:italic>predictive validity<\/jats:italic> of facial emotion recognition for learning before relying on it in educational practice.<\/jats:p>","DOI":"10.1515\/comp-2019-0020","type":"journal-article","created":{"date-parts":[[2019,12,15]],"date-time":"2019-12-15T09:05:42Z","timestamp":1576400742000},"page":"308-317","source":"Crossref","is-referenced-by-count":12,"title":["Measuring emotions during learning: lack of coherence between automated facial emotion recognition and emotional experience"],"prefix":"10.1515","volume":"9","author":[{"given":"Franziska","family":"Hirt","sequence":"first","affiliation":[{"name":"Swiss Distance University of Applied Sciences (FFHS), Brig, Switzerland"}]},{"given":"Egon","family":"Werlen","sequence":"additional","affiliation":[{"name":"Swiss Distance University of Applied Sciences (FFHS), Brig, Switzerland"}]},{"given":"Ivan","family":"Moser","sequence":"additional","affiliation":[{"name":"Swiss Distance University of Applied Sciences (FFHS), Brig, Switzerland"}]},{"given":"Per","family":"Bergamin","sequence":"additional","affiliation":[{"name":"Swiss Distance University of Applied Sciences (FFHS), Brig, Switzerland"}]}],"member":"374","published-online":{"date-parts":[[2019,12,13]]},"reference":[{"key":"2022042707443477635_j_comp-2019-0020_ref_001_w2aab3b7c19b1b6b1ab1ab1Aa","doi-asserted-by":"crossref","unstructured":"[1] Wu C.H., Huang Y.M., Hwang J.P., Review of affective computing in education\/learning: Trends and challenges, British Journal of Educational Technology, 47(6), 2016, 1304\u20131323, 10.1111\/bjet.12324","DOI":"10.1111\/bjet.12324"},{"key":"2022042707443477635_j_comp-2019-0020_ref_002_w2aab3b7c19b1b6b1ab1ab2Aa","doi-asserted-by":"crossref","unstructured":"[2] Bosch N., D\u2019Mello S.K., Ocumpaugh J., Baker R.S., Shute V., Using video to automatically detect learner affect in computer-enabled classrooms, 2016, 
10.1145\/2946837","DOI":"10.1145\/2946837"},{"key":"2022042707443477635_j_comp-2019-0020_ref_003_w2aab3b7c19b1b6b1ab1ab3Aa","doi-asserted-by":"crossref","unstructured":"[3] Wang C.H., Lin H.C.K., Constructing an Affective Tutoring System for Designing Course Learning and Evaluation, Journal of Educational Computing Research, 55(8), 2018, 1111\u20131128, 10.1177\/0735633117699955","DOI":"10.1177\/0735633117699955"},{"key":"2022042707443477635_j_comp-2019-0020_ref_004_w2aab3b7c19b1b6b1ab1ab4Aa","doi-asserted-by":"crossref","unstructured":"[4] Calvo R.A., D\u2019Mello S., Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications, IEEE Transactions on Affective Computing, 1(1), 2010, 18\u201337, 10.1109\/T-AFFC.2010.1","DOI":"10.1109\/T-AFFC.2010.1"},{"key":"2022042707443477635_j_comp-2019-0020_ref_005_w2aab3b7c19b1b6b1ab1ab5Aa","doi-asserted-by":"crossref","unstructured":"[5] Scherer K.R., What are emotions? 
And how can they be measured?, Social Science Information, 44(4), 2005, 695\u2013729, 10.1177\/0539018405058216","DOI":"10.1177\/0539018405058216"},{"key":"2022042707443477635_j_comp-2019-0020_ref_006_w2aab3b7c19b1b6b1ab1ab6Aa","doi-asserted-by":"crossref","unstructured":"[6] D\u2019Mello S.K., Kappas A., Gratch J., The affective computing approach to affect measurement, Emotion Review, 10(2), 2018, 174\u2013183, 10.1177\/1754073917696583","DOI":"10.1177\/1754073917696583"},{"key":"2022042707443477635_j_comp-2019-0020_ref_007_w2aab3b7c19b1b6b1ab1ab7Aa","doi-asserted-by":"crossref","unstructured":"[7] D\u2019Mello S.K., Kory J., A Review and Meta-Analysis of Multimodal Affect Detection Systems, ACM Computing Surveys, 47(3), 2015, 1\u201336, 10.1145\/2682899","DOI":"10.1145\/2682899"},{"key":"2022042707443477635_j_comp-2019-0020_ref_008_w2aab3b7c19b1b6b1ab1ab8Aa","doi-asserted-by":"crossref","unstructured":"[8] Soleymani M., Mortillaro M., Behavioral and Physiological Responses to Visual Interest and Appraisals: Multimodal Analysis and Automatic Recognition, Frontiers in ICT, 5(17), 2018, 10.3389\/fict.2018.00017","DOI":"10.3389\/fict.2018.00017"},{"key":"2022042707443477635_j_comp-2019-0020_ref_009_w2aab3b7c19b1b6b1ab1ab9Aa","doi-asserted-by":"crossref","unstructured":"[9] Bosch N., D\u2019Mello S., Mills C., What emotions do novices experience during their first computer programming learning session?, Technical report, 2013, 10.1007\/978-3-642-39112-5_2","DOI":"10.1007\/978-3-642-39112-5_2"},{"key":"2022042707443477635_j_comp-2019-0020_ref_010_w2aab3b7c19b1b6b1ab1ac10Aa","doi-asserted-by":"crossref","unstructured":"[10] Trigwell K., Ellis R.A., Han F., Relations between students\u2019 approaches to learning, experienced emotions and outcomes of learning, Studies in Higher Education, 37(7), 2012, 811\u2013824, 
10.1080\/03075079.2010.549220","DOI":"10.1080\/03075079.2010.549220"},{"key":"2022042707443477635_j_comp-2019-0020_ref_011_w2aab3b7c19b1b6b1ab1ac11Aa","doi-asserted-by":"crossref","unstructured":"[11] Tze V.M.C., Daniels L.M., Klassen R.M., Evaluating the Relationship Between Boredom and Academic Outcomes: A Meta-Analysis, Educational Psychology Review, 28(1), 2016, 119\u2013144, 10.1007\/s10648-015-9301-y","DOI":"10.1007\/s10648-015-9301-y"},{"key":"2022042707443477635_j_comp-2019-0020_ref_012_w2aab3b7c19b1b6b1ab1ac12Aa","doi-asserted-by":"crossref","unstructured":"[12] Ekman P., Cordaro D., What is meant by calling emotions basic, Emotion Review, 3(4), 2011, 364\u2013370, 10.1177\/1754073911410740","DOI":"10.1177\/1754073911410740"},{"key":"2022042707443477635_j_comp-2019-0020_ref_013_w2aab3b7c19b1b6b1ab1ac13Aa","doi-asserted-by":"crossref","unstructured":"[13] Moors A., Ellsworth P.C., Scherer K., Frijda N., Appraisal theories of emotion: State of the art and future development, Emotion Review, 5(2), 2013, 119\u2013124, 10.1177\/1754073912468165","DOI":"10.1177\/1754073912468165"},{"key":"2022042707443477635_j_comp-2019-0020_ref_014_w2aab3b7c19b1b6b1ab1ac14Aa","doi-asserted-by":"crossref","unstructured":"[14] Soutschek A., Weinreich A., Schubert T., Facial Electromyography reveals dissociable affective responses in social and non-social cooperation, Motivation and Emotion, 42(1), 2018, 118\u2013125, 10.1007\/s11031-017-9662-2","DOI":"10.1007\/s11031-017-9662-2"},{"key":"2022042707443477635_j_comp-2019-0020_ref_015_w2aab3b7c19b1b6b1ab1ac15Aa","unstructured":"[15] Amos B., Ludwiczuk B., Satyanarayanan M., OpenFace: A general-purpose face recognition library with mobile applications, 2016, 10.5281\/zenodo.32148"},{"key":"2022042707443477635_j_comp-2019-0020_ref_016_w2aab3b7c19b1b6b1ab1ac16Aa","unstructured":"[16] Affectiva 
Homepage"},{"key":"2022042707443477635_j_comp-2019-0020_ref_017_w2aab3b7c19b1b6b1ab1ac17Aa","unstructured":"[17] Noldus, Noldus Homepage"},{"key":"2022042707443477635_j_comp-2019-0020_ref_018_w2aab3b7c19b1b6b1ab1ac18Aa","doi-asserted-by":"crossref","unstructured":"[18] Ekman P., Friesen W.V., Measuring facial movement, Environmental Psychology and Nonverbal Behavior, 1, 1976, 56\u20137510.1007\/BF01115465","DOI":"10.1007\/BF01115465"},{"key":"2022042707443477635_j_comp-2019-0020_ref_019_w2aab3b7c19b1b6b1ab1ac19Aa","unstructured":"[19] Loijens L., Krips O., FaceReader Methodology Note. A white paper by Noldus Information Technology, Technical report, Amsterdam: Noldus, 2018"},{"key":"2022042707443477635_j_comp-2019-0020_ref_020_w2aab3b7c19b1b6b1ab1ac20Aa","unstructured":"[20] Soleymani M., Detecting cognitive appraisals from facial expressions for interest recognition, preprint arXiv, 2016, arXiv:1609.09761v2"},{"key":"2022042707443477635_j_comp-2019-0020_ref_021_w2aab3b7c19b1b6b1ab1ac21Aa","doi-asserted-by":"crossref","unstructured":"[21] Bonanno G., Keltner D., Brief Report The coherence of emotion systems: Comparing \u201con-line\u201d measures of appraisal and facial expressions, and self-report, Cognition & Emotion, 18(3), 2004, 431\u2013444, 10.1080\/0269993034100014910.1080\/02699930341000149","DOI":"10.1080\/02699930341000149"},{"key":"2022042707443477635_j_comp-2019-0020_ref_022_w2aab3b7c19b1b6b1ab1ac22Aa","doi-asserted-by":"crossref","unstructured":"[22] Lewinski P., den Uyl T.M., Butler C., Automated facial coding: Validation of basic emotions and FACS AUs in FaceReader., Journal of Neuroscience, Psychology, and Economics, 7(4), 2014, 227\u2013236, 10.1037\/npe000002810.1037\/npe0000028","DOI":"10.1037\/npe0000028"},{"key":"2022042707443477635_j_comp-2019-0020_ref_023_w2aab3b7c19b1b6b1ab1ac23Aa","doi-asserted-by":"crossref","unstructured":"[23] Harley J.M., Bouchet F., Azevedo R., Aligning and comparing data on emotions experienced during learning with 
MetaTutor, in H. Lane, K. Yacef, J. Mostow, P. Pavlik, eds., Artificial Intelligence in Education. AIED 2013. Lecture Notes in Computer Science, vol 7926, Springer, Berlin, Heidelberg, 2013, 61\u201370, 10.1007\/978-3-642-39112-5_7","DOI":"10.1007\/978-3-642-39112-5_7"},{"key":"2022042707443477635_j_comp-2019-0020_ref_024_w2aab3b7c19b1b6b1ab1ac24Aa","doi-asserted-by":"crossref","unstructured":"[24] Brodny G., Kolakowska A., Landowska A., Szwoch M., Szwoch W., Wrobel M.R., Comparison of selected off-the-shelf solutions for emotion recognition based on facial expressions, in 29th International Conference on Human System Interactions (HSI), IEEE, 2016, 397\u2013404, 10.1109\/HSI.2016.7529664","DOI":"10.1109\/HSI.2016.7529664"},{"key":"2022042707443477635_j_comp-2019-0020_ref_025_w2aab3b7c19b1b6b1ab1ac25Aa","unstructured":"[25] Suhr Y.T., FaceReader, a promising instrument for measuring facial emotion expression? A comparison to facial electromyography and self-reports, Master thesis, Utrecht University, 2017"},{"key":"2022042707443477635_j_comp-2019-0020_ref_026_w2aab3b7c19b1b6b1ab1ac26Aa","doi-asserted-by":"crossref","unstructured":"[26] Sneddon I., McRorie M., McKeown G., Hanratty J., The Belfast induced natural emotion database, IEEE Transactions on Affective Computing, 3(1), 2012, 32\u201341, 10.1109\/T-AFFC.2011.26","DOI":"10.1109\/T-AFFC.2011.26"},{"key":"2022042707443477635_j_comp-2019-0020_ref_027_w2aab3b7c19b1b6b1ab1ac27Aa","doi-asserted-by":"crossref","unstructured":"[27] Pekrun R., Vogl E., Muis K.R., Sinatra G.M., Measuring emotions during epistemic activities: the Epistemically-Related Emotion Scales, Cognition and Emotion, 31(6), 2017, 1268\u20131276, 10.1080\/02699931.2016.1204989","DOI":"10.1080\/02699931.2016.1204989"},{"key":"2022042707443477635_j_comp-2019-0020_ref_028_w2aab3b7c19b1b6b1ab1ac28Aa","unstructured":"[28] Krapp A., Hidi S., Renninger A.K., Interest, learning, and development, in The role of interest in learning and development, Erlbaum, Hillsdale, NJ, 1991, 3\u201325"},{"key":"2022042707443477635_j_comp-2019-0020_ref_029_w2aab3b7c19b1b6b1ab1ac29Aa","doi-asserted-by":"crossref","unstructured":"[29] Russell J.A., A circumplex model of affect, Journal of Personality and Social Psychology, 39(6), 1980, 1161\u20131178, 10.1037\/h0077714","DOI":"10.1037\/h0077714"},{"key":"2022042707443477635_j_comp-2019-0020_ref_030_w2aab3b7c19b1b6b1ab1ac30Aa","doi-asserted-by":"crossref","unstructured":"[30] Flesch R., A new readability yardstick, Journal of Applied Psychology, 32(3), 1948, 221\u2013233, 10.1037\/h0057532","DOI":"10.1037\/h0057532"},{"key":"2022042707443477635_j_comp-2019-0020_ref_031_w2aab3b7c19b1b6b1ab1ac31Aa","unstructured":"[31] Amstad T., Wie verst\u00e4ndlich sind unsere Zeitungen?, Studenten-Schreib-Service, Z\u00fcrich, 1978"},{"key":"2022042707443477635_j_comp-2019-0020_ref_032_w2aab3b7c19b1b6b1ab1ac32Aa","unstructured":"[32] Suk H.J., 
Color and emotion - a study on the affective judgment across media and in relation to visual stimuli, Doctoral dissertation, University of Mannheim, 2006"},{"key":"2022042707443477635_j_comp-2019-0020_ref_033_w2aab3b7c19b1b6b1ab1ac33Aa","doi-asserted-by":"crossref","unstructured":"[33] Math\u00f4t S., Schreij D., Theeuwes J., OpenSesame: An open-source, graphical experiment builder for the social sciences, Behavior Research Methods, 44(2), 2012, 314\u2013324, 10.3758\/s13428-011-0168-7","DOI":"10.3758\/s13428-011-0168-7"},{"key":"2022042707443477635_j_comp-2019-0020_ref_034_w2aab3b7c19b1b6b1ab1ac34Aa","unstructured":"[34] Grafsgaard J., Wiggins J.B., Boyer K.E., Wiebe E.N., Lester J., Automatically recognizing facial expression: predicting engagement and frustration, Educational Data Mining, 2013"},{"key":"2022042707443477635_j_comp-2019-0020_ref_035_w2aab3b7c19b1b6b1ab1ac35Aa","unstructured":"[35] Kapoor A., Mota S., Picard R.W., Towards a learning companion that recognizes affect, Technical Report 543, 2001"},{"key":"2022042707443477635_j_comp-2019-0020_ref_036_w2aab3b7c19b1b6b1ab1ac36Aa","unstructured":"[36] McDaniel B., D\u2019Mello S., King B., Chipman P., Tapp K., Graesser A.C., Facial Features for Affective State Detection in Learning Environments, in Proceedings of the 29th Annual Cognitive Science Society, 2007, 467\u2013472"},{"key":"2022042707443477635_j_comp-2019-0020_ref_037_w2aab3b7c19b1b6b1ab1ac37Aa","doi-asserted-by":"crossref","unstructured":"[37] Lewinski P., Don\u2019t look blank, happy, or sad: Patterns of facial expressions of speakers in banks\u2019 YouTube Videos predict video\u2019s popularity over time, Journal of Neuroscience, Psychology, and Economics, 8(4), 2015, 1\u20139, 10.1037\/npe0000046","DOI":"10.1037\/npe0000046"},{"key":"2022042707443477635_j_comp-2019-0020_ref_038_w2aab3b7c19b1b6b1ab1ac38Aa","doi-asserted-by":"crossref","unstructured":"[38] B\u00fcrkner 
P.C., Vuorre M., Ordinal regression models in psychology: A tutorial, Advances in Methods and Practices in Psychological Science, 2(1), 2019, 10.1177\/2515245918823199","DOI":"10.1177\/2515245918823199"},{"key":"2022042707443477635_j_comp-2019-0020_ref_039_w2aab3b7c19b1b6b1ab1ac39Aa","doi-asserted-by":"crossref","unstructured":"[39] B\u00fcrkner P.C., brms: An R package for Bayesian multilevel models using Stan, Journal of Statistical Software, 80(1), 2017, 1\u201328, 10.18637\/jss.v080.i01","DOI":"10.18637\/jss.v080.i01"},{"key":"2022042707443477635_j_comp-2019-0020_ref_040_w2aab3b7c19b1b6b1ab1ac40Aa","unstructured":"[40] R Core Team, R: A language and environment for statistical computing, 2018"},{"key":"2022042707443477635_j_comp-2019-0020_ref_041_w2aab3b7c19b1b6b1ab1ac41Aa","doi-asserted-by":"crossref","unstructured":"[41] Heino M.T.J., Vuorre M., Hankonen N., Bayesian evaluation of behavior change interventions: a brief introduction and a practical example, Health Psychology and Behavioral Medicine, 6(1), 2018, 49\u201378, 10.1080\/21642850.2018.1428102","DOI":"10.1080\/21642850.2018.1428102"},{"key":"2022042707443477635_j_comp-2019-0020_ref_042_w2aab3b7c19b1b6b1ab1ac42Aa","doi-asserted-by":"crossref","unstructured":"[42] Scherer K.R., What are emotions? 
And how can they be measured?, Social Science Information, 44(4), 2005, 695\u2013729, 10.1177\/0539018405058216","DOI":"10.1177\/0539018405058216"},{"key":"2022042707443477635_j_comp-2019-0020_ref_043_w2aab3b7c19b1b6b1ab1ac43Aa","doi-asserted-by":"crossref","unstructured":"[43] Zimmermann P., Guttormsen S., Danuser B., Gomez P., Affective computing - A rationale for measuring mood with mouse and keyboard, International Journal of Occupational Safety and Ergonomics, 9(4), 2003, 539\u2013551, 10.1080\/10803548.2003.11076589","DOI":"10.1080\/10803548.2003.11076589"},{"key":"2022042707443477635_j_comp-2019-0020_ref_044_w2aab3b7c19b1b6b1ab1ac44Aa","doi-asserted-by":"crossref","unstructured":"[44] Feldman Barrett L., Adolphs R., Marsella S., Martinez A.M., Pollak S.D., Emotional expressions reconsidered: challenges to inferring emotion from human facial movements, Psychological Science in the Public Interest, 20(1), 2019, 1\u201368, 10.1177\/1529100619832930","DOI":"10.1177\/1529100619832930"},{"key":"2022042707443477635_j_comp-2019-0020_ref_045_w2aab3b7c19b1b6b1ab1ac45Aa","doi-asserted-by":"crossref","unstructured":"[45] Feldman Barrett L., Quigley K.S., Bliss-Moreau E., Aronson K.R., Interoceptive sensitivity and self-reports of emotional experience, Journal of Personality and Social Psychology, 87(5), 2005, 684\u2013697, 10.1037\/0022-3514.87.5.684","DOI":"10.1037\/0022-3514.87.5.684"},{"key":"2022042707443477635_j_comp-2019-0020_ref_046_w2aab3b7c19b1b6b1ab1ac46Aa","doi-asserted-by":"crossref","unstructured":"[46] Rogosa D., Saner H., Longitudinal Data Analysis Examples with Random Coefficient Models, Journal of Educational and Behavioral Statistics, 20(2), 1995, 149\u2013170, 
https:\/\/doi.org\/10.3102\/10769986020002149","DOI":"10.3102\/10769986020002149"},{"key":"2022042707443477635_j_comp-2019-0020_ref_047_w2aab3b7c19b1b6b1ab1ac47Aa","doi-asserted-by":"crossref","unstructured":"[47] Lewinski P., Automated facial coding software outperforms people in recognizing neutral faces as neutral from standardized datasets, Frontiers in Psychology, 6, 2015, 1386, 10.3389\/fpsyg.2015.01386","DOI":"10.3389\/fpsyg.2015.01386"}],"container-title":["Open Computer Science"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.degruyter.com\/view\/journals\/comp\/9\/1\/article-p308.xml","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/www.degruyter.com\/document\/doi\/10.1515\/comp-2019-0020\/xml","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/www.degruyter.com\/document\/doi\/10.1515\/comp-2019-0020\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2022,10,8]],"date-time":"2022-10-08T14:04:31Z","timestamp":1665237871000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.degruyter.com\/document\/doi\/10.1515\/comp-2019-0020\/html"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2019,1,1]]},"references-count":47,"journal-issue":{"issue":"1","published-online":{"date-parts":[[2019,9,26]]},"published-print":{"date-parts":[[2019,1,1]]}},"alternative-id":["10.1515\/comp-2019-0020"],"URL":"https:\/\/doi.org\/10.1515\/comp-2019-0020","relation":{},"ISSN":["2299-1093"],"issn-type":[{"value":"2299-1093","type":"electronic"}],"subject":[],"published":{"date-parts":[[2019,1,1]]}}}