There is growing interest in touch-based and gestural interfaces as alternatives to the dominant mouse, keyboard, and monitor interaction. Content- and context-aware visualizations of audio collections have been proposed as a more effective way to interact with the increasing amounts of audio data available digitally. Audioscapes is a framework for prototyping and exploring how touch-based and gestural controllers can be used with state-of-the-art content- and context-aware visualizations. By providing well-defined interfaces and conventions, it allows a variety of audio collections, controllers, and visualization methods to be combined to create innovative ways of interacting with large audio collections. We describe the overall system architecture, the currently available components, and specific case studies.
The songs' positions on the grid are not assigned by hand; rather, they are an emergent property of the music itself.
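One common way such an emergent layout can be produced is to extract a feature vector per song and project the vectors into two dimensions, then snap the result onto a grid. The sketch below illustrates this idea with PCA over synthetic feature vectors; it is a minimal illustration under assumed inputs, not the specific mapping technique used by Audioscapes.

```python
import numpy as np

# Illustrative sketch: derive 2-D grid positions from per-song feature
# vectors (e.g. timbre or rhythm descriptors) so the layout emerges from
# the audio content rather than manual placement.
# The feature values here are synthetic placeholders, not real descriptors.

rng = np.random.default_rng(0)
features = rng.normal(size=(20, 8))  # 20 songs, 8 audio features each

# Center the features and project onto the top-2 principal components.
centered = features - features.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
coords = centered @ vt[:2].T  # (20, 2) continuous 2-D positions

# Quantize the continuous positions onto a discrete grid, e.g. 5x5.
grid_size = 5
mins, maxs = coords.min(axis=0), coords.max(axis=0)
cells = np.floor((coords - mins) / (maxs - mins + 1e-9) * grid_size).astype(int)
print(cells.shape)  # one (row, col) grid cell per song
```

Similar songs end up with similar feature vectors and therefore nearby grid cells, which is what gives the layout its content-driven character.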