ICAD 2004 titled its sonification competition 'Listening to the Mind Listening' and invited entrants to sonify EEG data captured while a subject was listening to music. The ICAD 2012 competition is inspired by this same idea: music (or sound) about listening to music. It is also inspired by the radical changes over the past decade in how we listen to music and how we share our listening activities with others. As portable media players and always-connected smartphones have become our primary listening platforms, social media services have become our primary sharing platforms.
This competition adopts the theme 'Listening to the World Listening' as it challenges us to explore what we can learn about listening through the analysis and sonification of social media data about listening.
Sonifications may be fixed-media audio files, interactive software programs or web sites, smartphone apps, musical performances, or sound installations. They may sonify the data in real time or out of real time. They may work with as little as a single set of fifty artists from Twitter Music Trends or as much as several months' worth of data collected across all of the APIs.
Submissions must include an audio or video recording, no more than five minutes in length, of the complete sonification, excerpts from the sonification, or documentation of the sonification, as appropriate. Submissions must also include a 2-4 page statement, following the ICAD 2012 paper template, that describes the techniques used to create the sonification and the motivations behind them.
All submissions must be made electronically through the ICAD 2012 web site:
All questions should be addressed to Jason Freeman, ICAD 2012 Music Chair:
Vogt, K., Goudarzi, V., & Holdrich, R. (2012). Chirping Stars. pdf
Bethancourt, M. (2012). The Sounds of the Discussion of Music. pdf
Ash, K. (2012). Affective States: Analysis and Sonification of Twitter Music Trends. pdf
Vigani, A. (2012). Sonic Window #1 - A Real Time Sonification. pdf
Adam Lindsay and our friends at SocialGenius have created a web service, Twitter Music Trends, which greatly simplifies the process of extracting listening data from Twitter. According to Adam: "It listens to a vast selection of music-related tweets, and automatically tries to detect if each is actually discussing a musician or group." It then amasses this information in three forms: a daily trend of popular artists; the trending artists at the current moment; and the latest artists to be identified from the Twitter stream. The data for each can be accessed both as a simple visualization and as JSON data:
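Once you have fetched one of the JSON feeds, working with it in Python is straightforward. The snippet below is a minimal sketch only: the field names in the sample (artist name, MusicBrainz ID, Echo Nest ID, tweet IDs) are a hypothetical schema for illustration, since the exact structure of the Twitter Music Trends response is not reproduced in this call.

```python
import json

# Hypothetical sample of a Twitter Music Trends daily-trend response.
# The real field names may differ; this only illustrates pulling artist
# entries out of the JSON once it has been fetched.
sample = """
{
  "artists": [
    {"name": "Radiohead",
     "musicbrainz_id": "a74b1b7f-71a5-4011-9441-d0b5e4122711",
     "echonest_id": "ARH6W4X1187B99274F",
     "tweet_ids": ["210462857140252672"]},
    {"name": "Björk",
     "musicbrainz_id": "87c5dedd-371d-4a53-9f7f-80522fb7f3cb",
     "echonest_id": "ARSPUSL1187B9ACC4C",
     "tweet_ids": []}
  ]
}
"""

data = json.loads(sample)
names = [artist["name"] for artist in data["artists"]]
print(names)
```

The same per-artist IDs are what the Twitter, MusicBrainz, and Echo Nest APIs described below expect as lookup keys.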
The Twitter REST API provides access to information about each tweet whose ID is listed in the Twitter Music Trends JSON data.
To access JSON data about a tweet, simply use an HTTP GET request like this:
Replace the ID number with that of the tweet you wish to obtain.
Further documentation on the entire Twitter REST API is available here:
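The request URL is easy to build programmatically. The sketch below assumes the Twitter REST API v1 `statuses/show` endpoint form that was current around 2012; that version has since been retired in favor of authenticated v1.1/v2 calls, so treat the URL as an assumption, not a guaranteed live endpoint.

```python
# Build the GET URL for a single tweet. The endpoint shown is the
# 2012-era Twitter REST API v1 form (since retired); a modern client
# would need authenticated v1.1 or v2 requests instead.
TWEET_ENDPOINT = "https://api.twitter.com/1/statuses/show/{id}.json"

def tweet_url(tweet_id):
    """Return the request URL for the tweet with the given ID."""
    return TWEET_ENDPOINT.format(id=tweet_id)

print(tweet_url("210462857140252672"))

# A real fetch would then be, e.g.:
#   import urllib.request, json
#   tweet = json.loads(urllib.request.urlopen(tweet_url("...")).read())
```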
The MusicBrainz REST API provides additional information about each artist whose MusicBrainz ID is listed in the Twitter Music Trends JSON data.
To access XML data about an artist, simply use an HTTP GET request like this:
Replace the ID number with that of the artist you wish to obtain. You can request additional information about the artist by appending parameters to the request, e.g.
The complete API documentation is available here:
No registration with MusicBrainz is necessary to obtain this data.
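A small helper makes it easy to assemble MusicBrainz lookup URLs with optional extras. This assumes the MusicBrainz XML web service v2 base URL and its `inc` query parameter (joined with `+`); the specific `inc` values in the example are illustrative.

```python
# MusicBrainz web service v2 artist lookup. Base URL and "inc" parameter
# follow the MusicBrainz XML web service documentation; the inc values
# used below ("aliases", "tags") are just examples.
MB_ENDPOINT = "http://musicbrainz.org/ws/2/artist/"

def artist_url(mbid, inc=None):
    """Return the lookup URL for an artist, optionally with ?inc=... extras."""
    url = MB_ENDPOINT + mbid
    if inc:
        url += "?inc=" + "+".join(inc)
    return url

print(artist_url("a74b1b7f-71a5-4011-9441-d0b5e4122711",
                 inc=["aliases", "tags"]))
```

Because no registration is required, the returned XML can be fetched directly with any HTTP client.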
The Echo Nest REST API provides additional information about each artist whose Echo Nest ID is listed in the Twitter Music Trends JSON data.
To access JSON data about an artist, simply use an HTTP GET request like this:
Replace the ID number with that of the artist you wish to obtain, and the API key with your developer API key. You can add multiple "bucket" terms to the same query to retrieve different information about the artist, including: audio, biographies, blogs, doc_counts, familiarity, hotttnesss, images, location, news, reviews, songs, terms, urls, video, and years_active. The basic documentation for the REST call is available here:
The full documentation of the REST API is available here:
Use of the Echo Nest API requires a (free) API key, which can be obtained here:
You may also wish to use the Echo Nest Python API and/or the Echo Nest Remix API (also in Python):
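Because the bucket parameter is repeated once per term, it is convenient to build the query from a list of key-value pairs. The sketch below assumes the 2012-era Echo Nest v4 `artist/profile` endpoint (the Echo Nest service has since been shut down), so the URL is historical; the key and buckets shown are placeholders.

```python
from urllib.parse import urlencode

# Echo Nest v4 artist profile call. The endpoint form matches the
# 2012-era Echo Nest REST API (the service has since shut down), so
# treat the URL as historical. "YOUR_API_KEY" is a placeholder.
EN_ENDPOINT = "http://developer.echonest.com/api/v4/artist/profile"

def profile_url(api_key, artist_id, buckets=()):
    """Return the profile URL, repeating the bucket parameter per term."""
    params = [("api_key", api_key), ("id", artist_id)]
    params += [("bucket", b) for b in buckets]
    return EN_ENDPOINT + "?" + urlencode(params)

print(profile_url("YOUR_API_KEY", "ARH6W4X1187B99274F",
                  buckets=["hotttnesss", "terms"]))
```

Passing a list of tuples to `urlencode` (rather than a dict) is what allows the `bucket` key to appear more than once in the query string.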
Professor of Computational Art, Universität der Künste Berlin
Assistant Professor of Integrated Digital Media, NYU-Poly
Chief Scientist, SocialGenius
Co-Founder and CTO, The Echo Nest