Proceedings of the 18th International Conference on Auditory Display (ICAD 2012), Atlanta, 2012. Edited by Nees M. A., Walker B. N., and Freeman J. ICAD, 2012.
Full proceedings pdf
Alexander R. L., O'Modhrain S., Gilbert J. A., Zurbuchen T. H. (2012). Recognition of Audified Data in Untrained Listeners. pdf
Andre C. R., Embrechts J., Verly J. G., Rebillat M., Katz B. F. G. (2012). Sound for 3D Cinema and the Sense of Presence. pdf
Ash K. (2012). Affective States: Analysis and Sonification of Twitter Music Trends. pdf
Banf M., Blanz V. (2012). A Modular Computer Vision Sonification Model for the Visually Impaired. pdf
Bethancourt M. (2012). The Sounds of the Discussion of Music. pdf
Boyd J. E., Godbout A. (2012). Multi-dimensional Synchronization for Rhythmic Sonification. pdf
Brock D., McClimens B., Peres S. C. (2012). Evaluating Listeners' Attention to and Comprehension of Serially Interleaved, Rate-Accelerated Speech. pdf
Bruckner H., Wielage M., Blume H. (2012). Intuitive and Interactive Movement Sonification on a RISC/DSP Platform. pdf
Choi H. (2012). An Alternate Implementation of VBAP with Graphical Interface for Sound Motion Design. pdf
Droumeva M., McGregor I. (2012). Everyday Listening to Auditory Displays: Lessons from Acoustic Technology. pdf
Gonzalez C., Lewis B. A., Baldwin C. L. (2012). Revisiting Pulse Rate, Frequency and Perceived Urgency: Have Relationships Changed and Why? pdf
Gossmann J. (2012). A Perspective on the Limited Potential for Simultaneity in Auditory Display. pdf
Hermann T., Ungerechts B., Toussaint H., Grote M. (2012). Sonification of Pressure Changes in Swimming for Analysis and Optimization. pdf
Hermann T., Nehls A. V., Eitel F., Barri T., Gammel M. (2012). Tweetscapes - Real-Time Sonification of Twitter Data Streams for Radio Broadcasting. pdf
Hoferlin B., Hoferlin M., Goloubets B., Heidemann G., Weiskopf D. (2012). Auditory Support for Situation Awareness in Video Surveillance. pdf
McGee R., Dickinson J., Legrady G. (2012). Voice of Sisyphus: An Image Sonification Multimedia Installation. pdf
McLachlan R., McGee-Lennon M., Brewster S. (2012). The Sound of Musicons: Investigating the Design of Musically Derived Audio Cues. pdf
Metatla O., Bryan-Kinns N., Stockman T., Martin F. (2012). Cross-Modal Collaborative Interaction between Visually-Impaired and Sighted Users in the Workplace. pdf
Nguyen V. X. (2012). CircoSonic: A Sonification of Circos, A Circular Graph of Table Data. pdf
Oswald D. (2012). Non-Speech Audio Semiotics: A Review and Revision of Auditory Icon and Earcon Theory. pdf
Parseihian G., Katz B. F. G., Conan S. (2012). Sound Effect Metaphors for Near Field Distance Sonification. pdf
Perkins R. (2012). Sonification of a Real-time Physics Simulation within a Virtual Environment. pdf
Schaffert N., Mattes K. (2012). Acoustic Feedback Training in Adaptive Rowing. pdf
Schedel M., Yager K. G. (2012). Hearing Nano-Structures: A Case Study in Timbral Sonification. pdf
Schmele T., Gomez I. (2012). Exploring 3D Audio for Brain Sonification. pdf
Schmitz G., Effenberg A. O. (2012). Perceptual Effects of Auditory Information About Own and Other Movements. pdf
Supper A. (2012). "Trained Ears" and "Correlation Coefficients": A Social Science Perspective on Sonification. pdf
Terasawa H., Parvizi J., Chafe C. (2012). Sonifying ECoG Seizure Data with Overtone Mapping: A Strategy for Creating Auditory Gestalt from Correlated Multichannel Data. pdf
Vigani A. (2012). Sonic Window #1 [2011] - A Real Time Sonification. pdf
Vogt K., Goudarzi V., Holdrich R. (2012). Chirping Stars. pdf
Ericson M. A., Vella M. N. (2012). Demonstration of an Outdoor Audio Shooting Gallery. pdf
Bonebright T. L. (2012). Were those Coconuts or Horse Hoofs? Visual Context Effects on Identification and Perceived Veracity of Everyday Sounds. pdf
Nees M. A. (2012). Correlation and Scatterplots: A Comparison of Auditory and Visual Modes of Learning and Testing. pdf
Sanz P. R., Pena J. M. S., Mezcua B. R. (2012). Sonification as a Social Right Implementation. pdf
Brent W. (2012). Physical Navigation of Virtual Timbre Spaces with timbreID and DiLib. pdf
Nambiar A., Jacobson J. (2012). Spatialized Audio for Mixed Reality Theater: The Egyptian Oracle. pdf
Pirro D., Wankhammer A., Schwingenschuh P., Sontacchi A., Holdrich R. (2012). Acoustic Interface for Tremor Analysis. pdf
Fan Y., Weber R. (2012). Capturing Audience Experience via Mobile Biometrics. pdf
Winton R., Gable T. M., Schuett J., Walker B. N. (2012). A Sonification of Kepler Space Telescope Star Data. pdf
Wersenyi G. (2012). Evaluation of a MATLAB-Based Virtual Audio Simulator with HRTF Synthesis and Headphone Equalization. pdf
Giller C. A., Murro A. M., Park Y., Strickland S., Smith J. R. (2012). EEG Sonification for Epilepsy Surgery: A Clinical Work-in-Progress. pdf
Winters R. M., Wanderley M. M. (2012). New Directions for Sonification of Expressive Movement in Music. pdf
Vogt K., Goudarzi V., Holdrich R. (2012). SYSSON - A Systematic Procedure to Develop Sonifications. pdf
Bearman N., Brown E. (2012). Who's Sonifying Data and How Are They Doing It? A Comparison of ICAD and Other Venues since 2009. pdf
Sanz P. R., Pena J. M. S., Mezcua B. R. (2012). A Sonification Proposal for Safe Travels of Blind People. pdf
Giot R., Courbe Y. (2012). InteNtion - Interactive Network Sonification. pdf
Beyls P. (2012). Interfacing the Earth. pdf
Gillard J., Schutz M. (2012). Improving the Efficacy of Auditory Alarms in Medical Devices by Exploring the Effect of Amplitude Envelope on Learning and Retention. pdf
Grohn M., Ahonen L., Huotilainen M. (2012). Effects of Pleasant and Unpleasant Auditory Mood Induction on Performance and Brain Activity in Cognitive Tasks. pdf
Bretan M., Weinberg G., Freeman J. (2012). Sonification for the Art Installation Drawn Together. pdf
Jeon M., Winton R. J., Yim J., Bruce C. M., Walker B. N. (2012). Aquarium Fugue: Interactive Sonification for Children and Visually Impaired Audience in Informal Learning Environments. pdf
Hildebrandt T., Kriglstein S., Rinderle-Ma S. (2012). Beyond Visualization: On Using Sonification Methods to make Business Processes more Accessible to Users. pdf
Aiken C., Peng Z., Simpson D., Michael A., Kilb D., Enescu B., Shelly D. (2012). Shaking up Earth Science: Visual and Auditory Representations of Earthquake Interactions. pdf
Elizabeth Mynatt is the Executive Director of the Institute for People and Technology and a Professor in the College of Computing at Georgia Tech. She is an internationally recognized expert in ubiquitous computing and assistive technologies. Her research contributes to ongoing work in personal health informatics, computer-supported collaborative work, and human-computer interface design. She has published more than 100 scientific papers and chaired the CHI 2010 conference, the premier international conference in human-computer interaction. Her research is supported by multiple NSF grants, including a five-year NSF CAREER award. Other honors include being named Top Woman Innovator in Technology by Atlanta Woman magazine in 2005 and receiving the College of Computing's Dean's Award in 2003.
Jonathan Berger is a composer and researcher who explores effective ways of using sound to convey information. Berger is the Billie Bennett Achilles Professor in Performance, the William R. and Gretchen B. Kimball University Fellow in Undergraduate Education, Co-Director of the Stanford Institute for Creativity and the Arts (SiCa), and Co-Director of Stanford's Art Initiative. He is also affiliated with the Center for Computer Research in Music and Acoustics (CCRMA), where he teaches composition as well as music theory and cognition. He has over 60 publications in a wide range of fields relating to music, science, and technology. His research includes studies in music cognition; signal processing and statistical methods for automatic music recognition, classification, and transcription; sonification; and audio restoration.