Community Spotlight: Tony Stockman

The 6th ICAD President

So, who are you?

Tony Stockman

I am a Senior Lecturer in Computer Science at Queen Mary University of London. My research interests are in Human-Computer Interaction, Auditory Displays/Sonification and Assistive Technology.

How did you end up working on Sonification and Auditory Displays?

My PhD examined the influence of breathing on heart rate variability (HRV). As a blind person, I needed a way of checking the quality of the recorded signals and of presenting the results of the signal analysis in an accessible form. I used a basic parameter-mapping approach to sonify them to check for noise and other artefacts, and used a similar approach for rendering various time- and frequency-domain measures of the signals. It was a simple sonification approach, but it made all the difference in enabling me to do the research. When I found something interesting, I could always check the underlying values in the data. Twenty years later, I discovered ICAD and was delighted to attend my first ICAD conference in Boston in 2003. Sonification/auditory display has played a major role in my research ever since.
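The parameter-mapping idea described above can be sketched very simply: each data value is mapped onto an audible dimension such as pitch, so artefacts show up as sudden jumps in the tone sequence. The function below is a minimal, hypothetical illustration (the function name, frequency range and the RR-interval example are my own assumptions, not taken from the original work):

```python
def parameter_map_pitch(values, f_min=220.0, f_max=880.0):
    """Linearly map each data value onto a frequency range (Hz).

    A minimal parameter-mapping sonification: higher data values
    become higher pitches, so noise and artefacts stand out as
    sudden jumps when the frequencies are rendered as tones.
    """
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for flat data
    return [f_min + (v - lo) / span * (f_max - f_min) for v in values]

# Example: a beat-interval series (seconds) with one artefact at index 3,
# which maps to the top of the pitch range and is easy to hear.
rr = [0.80, 0.82, 0.79, 1.60, 0.81]
freqs = parameter_map_pitch(rr)
```

Rendering the resulting frequencies as short sine tones (with any audio library) would give an auditory overview of the signal's quality.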

What is the role of Sonification/Auditory Display in your research?

In relation to Assistive Technology for visually impaired users, my co-workers and I have used sonification to improve and speed up access to spreadsheet and database content, line graphs and node-link diagrams, and graphically displayed peak-meter information in Digital Audio Workstations. Similarly, we have employed sonification as a component of auditory displays to present map and route information. More generally, I am interested in the use of audio to present overviews of information, for both sighted and visually impaired users. We have also employed sonification in studies of cross-modal correspondences, that is, examining whether the perception of and/or reactions to stimuli in one sensory mode, e.g. audio or visual, are enhanced or reduced in the presence of complementary or contradictory stimuli in another sensory mode.

What is the most challenging part in your work?

Given the range of educational and employment opportunities currently available in the area of Data Science, I would love to develop a platform-independent, accessible, interactive library that uses sonification to render the output of libraries such as Matplotlib and Seaborn accessible to visually impaired users.

Anything special ICADders should know about you?

I’ve always loved sport, particularly ball games such as football and cricket. I was lucky enough to take part as a member of the Great Britain squad in the first football world cup for blind players, which took place in São Paulo, Brazil, in 1998. A lot of work has been done on audio games generally, but relatively little on the audio representation of ball games and player dynamics. Developing such systems raises some really interesting underlying research questions concerning our ability to perceive and process multiple concurrent sound sources and make decisions about them. I am particularly keen to hear from anyone with interests in this area.

What would you like to say to ICADders? 

Firstly, a sincere and very warm thank you. On attending that first ICAD conference, I was immediately struck by the warmth and positive attitude of the community, the encouragement given, and the willingness to help and support newcomers. The other main thing I would like to say is to echo what Jordan said in the last spotlight: I am always happy to hear from ICADders interested in working with me, so please do get in touch if you are interested in collaborating.

What is your favorite sonification/auditory display ever?

The sonifications from the 2006 ICAD concert are a remarkable collection, presenting many aspects of life expectancy data with amazing creativity. Particularly if you’ve not heard them before, I invite you to listen at

What is the study/tool/work in the field of auditory display you are most proud of? 

The Collaborative Cross-Modal Interfaces (CCMI) system is certainly one of the projects I have been involved with where I was most pleased with the final product. It involved developing an interactive tool to enable blind and sighted people to collaborate on creating or editing node-link diagrams such as road or rail networks, organisation charts, database design diagrams, etc. The project employed visual, auditory and haptic technologies to support synchronous work on diagrams. Details of the project, including a video and related publications, can be found at

Any way to learn more about your work or reach out to you?

You are very welcome to contact me at this email address: To read more about my and my colleagues’ research, I suggest looking up my entry on Google Scholar or ResearchGate.

Published in Community Spotlight