{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,6,19]],"date-time":"2025-06-19T04:08:19Z","timestamp":1750306099981,"version":"3.41.0"},"reference-count":46,"publisher":"Association for Computing Machinery (ACM)","issue":"1","license":[{"start":{"date-parts":[[2017,10,6]],"date-time":"2017-10-06T00:00:00Z","timestamp":1507248000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["ACM Trans. Appl. Percept."],"published-print":{"date-parts":[[2018,1,31]]},"abstract":"<jats:p>Comprehension of computer programs is daunting, due in part to clutter in the software developer's visual environment and the need for frequent visual context changes. Previous research has shown that nonspeech sound can be useful in understanding the runtime behavior of a program. We explore the viability and advantages of using nonspeech sound in an ecological framework to help understand the static structure of software. We describe a novel concept for auditory display of program elements in which sounds indicate characteristics and relationships among a Java program's classes, interfaces, and methods. An empirical study employing this concept was used to evaluate 24 sighted software professionals and students performing maintenance-oriented tasks using a 2\u00d72 crossover. Viability is strong for differentiation and characterization of software entities, less so for identification. The results suggest that sonification can be advantageous under certain conditions, though they do not indicate the overall advantage of using sound in terms of task duration at a 5% level of significance. The results uncover other findings such as differences in comprehension strategy based on the available tool environment. 
The participants reported enthusiasm for the idea of software sonification, mitigated by lack of familiarity with the concept and the brittleness of the tool. Limitations of the present research include restriction to particular types of comprehension tasks, a single sound mapping, a single programming language, and limited training time, but the use of sound in program comprehension shows sufficient promise for continued research.<\/jats:p>","DOI":"10.1145\/3129456","type":"journal-article","created":{"date-parts":[[2017,10,6]],"date-time":"2017-10-06T12:48:39Z","timestamp":1507294119000},"page":"1-20","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":2,"title":["Evaluating the Use of Sound in Static Program Comprehension"],"prefix":"10.1145","volume":"15","author":[{"given":"Lewis","family":"Berman","sequence":"first","affiliation":[{"name":"Digital Innovation Inc."}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-1469-9866","authenticated-orcid":false,"given":"Keith","family":"Gallagher","sequence":"additional","affiliation":[{"name":"Florida Institute of Technology, West University Blvd., Melbourne, FL"}]},{"given":"Suzanne","family":"Kozaitis","sequence":"additional","affiliation":[{"name":"Florida Institute of Technology, West University Blvd., Melbourne, FL"}]}],"member":"320","published-online":{"date-parts":[[2017,10,6]]},"reference":[{"key":"e_1_2_1_1_1","doi-asserted-by":"publisher","DOI":"10.4236\/jsea.2014.75038"},{"key":"e_1_2_1_2_1","doi-asserted-by":"publisher","DOI":"10.1145\/2355598.2355600"},{"volume-title":"Proceedings of the 12th International Conference on Auditory Display (ICAD\u201906)","author":"Berman L.","key":"e_1_2_1_4_1","unstructured":"L. Berman and K. Gallagher. 2006. Listening to program slices.
In Proceedings of the 12th International Conference on Auditory Display (ICAD\u201906)."},{"key":"e_1_2_1_5_1","doi-asserted-by":"publisher","DOI":"10.1207\/s15327051hci0401_1"},{"key":"e_1_2_1_6_1","doi-asserted-by":"publisher","DOI":"10.1109\/CMPSAC.1995.524778"},{"key":"e_1_2_1_7_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICSM.2008.4658085"},{"key":"e_1_2_1_8_1","unstructured":"G. Booch, J. Rumbaugh, and I. Jacobson. 1998. The Unified Modeling Language User Guide. Addison Wesley Longman, Redwood City, CA."},{"volume-title":"Proceedings of the 9th International Conference on Auditory Display (ICAD\u201903)","author":"Brown L.","key":"e_1_2_1_9_1","unstructured":"L. Brown and S. Brewster. 2003. Drawing by ear: Interpreting sonified line graphs. In Proceedings of the 9th International Conference on Auditory Display (ICAD\u201903)."},{"key":"e_1_2_1_10_1","doi-asserted-by":"publisher","DOI":"10.1109\/ESEM.2013.12"},{"key":"e_1_2_1_11_1","doi-asserted-by":"publisher","DOI":"10.1145\/2543581.2543586"},{"volume-title":"Proceedings of the 14th International Conference on Auditory Display (ICAD\u201908)","author":"Dingler T.","key":"e_1_2_1_12_1","unstructured":"T. Dingler, J. Lindsay, and B. Walker. 2008. Learnability of sound cues for environmental features: Auditory icons, earcons, spearcons, and speech.
In Proceedings of the 14th International Conference on Auditory Display (ICAD\u201908)."},{"key":"e_1_2_1_13_1","doi-asserted-by":"publisher","DOI":"10.1002\/smr.567"},{"key":"e_1_2_1_14_1","doi-asserted-by":"publisher","DOI":"10.1371\/journal.pone.0082491"},{"key":"e_1_2_1_15_1","unstructured":"Eclipse Foundation. 2013. Retrieved October 12, 2015, from http:\/\/www.eclipse.org\/."},{"volume-title":"Proceedings of the 11th International Conference on Auditory Display (ICAD\u201905)","author":"Finlayson J.","key":"e_1_2_1_16_1","unstructured":"J. Finlayson and C. Mellish. 2005. The audioview - providing a glance at Java source code. In Proceedings of the 11th International Conference on Auditory Display (ICAD\u201905)."},{"key":"e_1_2_1_18_1","doi-asserted-by":"publisher","DOI":"10.1207\/s15327051hci0202_3"},{"volume-title":"Proceedings of the USENIX System Administration Conference (LISA\u201900)","author":"Gilfix M.","key":"e_1_2_1_19_1","unstructured":"M. Gilfix and A. Couch. 2000. Peep (the network auralizer): Monitoring your network with sound. In Proceedings of the USENIX System Administration Conference (LISA\u201900). 109--117."},{"key":"e_1_2_1_20_1","unstructured":"T. Hermann, A. Hunt, and J. Neuhoff. 2011. The Sonification Handbook.
Logos Verlag, Berlin, Germany."},{"key":"e_1_2_1_21_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICPC.2009.5090035"},{"volume-title":"Proceedings of the Interactive Sonification Workshops. Retrieved","year":"2016","key":"e_1_2_1_22_1","unstructured":"Interactive-sonification.org. 2016. Proceedings of the Interactive Sonification Workshops. Retrieved March 6, 2016, from http:\/\/interactive-sonification.org\/proceedings\/."},{"key":"e_1_2_1_23_1","unstructured":"International Community for Auditory Display. 2015. Retrieved April 16, 2015, from http:\/\/www.icad.org\/."},{"key":"e_1_2_1_24_1","doi-asserted-by":"publisher","DOI":"10.1109\/RTTAS.1996.509518"},{"volume-title":"Proceedings of the 11th International Conference on Auditory Display (ICAD\u201905)","author":"Kildal J.","key":"e_1_2_1_25_1","unstructured":"J. Kildal and S. Brewster. 2005. Explore the matrix: Browsing numerical data tables using sound. In Proceedings of the 11th International Conference on Auditory Display (ICAD\u201905)."},{"volume-title":"Proceedings of the 6th International Conference on Auditory Display (ICAD\u201900)","author":"Lepl\u00e2tre G.","key":"e_1_2_1_26_1","unstructured":"G. Lepl\u00e2tre and S. Brewster. 2000. Designing non-speech sounds to support navigation in mobile phone menus.
In Proceedings of the 6th International Conference on Auditory Display (ICAD\u201900). 190--199."},{"key":"e_1_2_1_27_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.ijhcs.2015.08.008"},{"volume-title":"Proceedings of the 18th International Conference on Auditory Display (ICAD\u201912)","author":"McLachlan R.","key":"e_1_2_1_28_1","unstructured":"R. McLachlan, M. McGee-Lennon, and S. Brewster. 2012. The sound of musicons: Investigating the design of musically derived audio cues. In Proceedings of the 18th International Conference on Auditory Display (ICAD\u201912)."},{"key":"e_1_2_1_29_1","doi-asserted-by":"publisher","DOI":"10.1007\/s00426-015-0647-z"},{"key":"e_1_2_1_30_1","volume-title":"Proceedings of the 16th International Conference on Auditory Display (ICAD\u201908)","author":"Mustonen M.","year":"2008","unstructured":"M. Mustonen. 2008. A review-based conceptual analysis of auditory signs and their design. In Proceedings of the 16th International Conference on Auditory Display (ICAD\u201908)."},{"volume-title":"Proceedings of the 4th Interactive Sonification Workshop (ISon\u201913)","author":"Neate T.","key":"e_1_2_1_31_1","unstructured":"T. Neate and A. Hunt. 2013. Interactive spatial auditory display of graphical data. In Proceedings of the 4th Interactive Sonification Workshop (ISon\u201913).
29--36."},{"volume-title":"Proceedings of the 13th International Conference on Auditory Display (ICAD\u201907)","author":"Nees M.","key":"e_1_2_1_32_1","unstructured":"M. Nees and B. Walker. 2007. Model of auditory graph comprehension. In Proceedings of the 13th International Conference on Auditory Display (ICAD\u201907). 266--273."},{"key":"e_1_2_1_33_1","doi-asserted-by":"publisher","DOI":"10.1145\/2993283.2993285"},{"key":"e_1_2_1_34_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICPC.2006.14"},{"key":"e_1_2_1_35_1","doi-asserted-by":"publisher","DOI":"10.1109\/TMM.2016.2531978"},{"key":"e_1_2_1_36_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.ijhcs.2015.08.005"},{"key":"e_1_2_1_37_1","doi-asserted-by":"publisher","DOI":"10.1017\/S1355771813000423"},{"volume-title":"Readings in Information Visualization: Using Vision to Think","author":"Shneiderman B.","key":"e_1_2_1_38_1","unstructured":"B. Shneiderman. 1999. Dynamic queries for visual information seeking. In Readings in Information Visualization: Using Vision to Think, S. Card, J. Mackinlay, and B. Shneiderman (Eds.).
Morgan Kaufman, San Francisco, 236--243."},{"key":"e_1_2_1_39_1","doi-asserted-by":"publisher","DOI":"10.1109\/HICSS.1990.205229"},{"key":"e_1_2_1_40_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICPC.2009.5090034"},{"key":"e_1_2_1_41_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.ijhcs.2011.07.002"},{"key":"e_1_2_1_42_1","doi-asserted-by":"publisher","DOI":"10.3366\/sound.2014.0057"},{"volume-title":"Proceedings of the 9th International Conference on Auditory Display (ICAD\u201903)","author":"Vargas M.","key":"e_1_2_1_43_1","unstructured":"M. Vargas and S. Anderson. 2003. Combining speech and earcons to assist menu navigation. In Proceedings of the 9th International Conference on Auditory Display (ICAD\u201903). 38--41."},{"key":"e_1_2_1_44_1","doi-asserted-by":"publisher","DOI":"10.1145\/1101530.1101547"},{"key":"e_1_2_1_45_1","doi-asserted-by":"publisher","DOI":"10.1016\/S0953-5438(02)00026-7"},{"key":"e_1_2_1_46_1","doi-asserted-by":"publisher","DOI":"10.1145\/792704.792734"},{"key":"e_1_2_1_47_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.displa.2016.05.002"},{"key":"e_1_2_1_48_1","doi-asserted-by":"publisher","DOI":"10.1109\/2.402076"}],"container-title":["ACM Transactions on Applied 
Perception"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3129456","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3129456","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,18]],"date-time":"2025-06-18T03:30:32Z","timestamp":1750217432000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3129456"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2017,10,6]]},"references-count":46,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2018,1,31]]}},"alternative-id":["10.1145\/3129456"],"URL":"https:\/\/doi.org\/10.1145\/3129456","relation":{},"ISSN":["1544-3558","1544-3965"],"issn-type":[{"type":"print","value":"1544-3558"},{"type":"electronic","value":"1544-3965"}],"subject":[],"published":{"date-parts":[[2017,10,6]]},"assertion":[{"value":"2016-07-01","order":0,"name":"received","label":"Received","group":{"name":"publication_history","label":"Publication History"}},{"value":"2017-05-01","order":1,"name":"accepted","label":"Accepted","group":{"name":"publication_history","label":"Publication History"}},{"value":"2017-10-06","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}