
Overview

The Floe Sonification Framework is intended to provide a set of tools to help web developers more easily create sound-based representations of interactive data. The goal is to provide a robust and performant means for:

  • synthesizing complex timbres using digital signal processing techniques such as additive synthesis, frequency modulation, and granular synthesis
  • triggering the playback and manipulation of recorded audio files and sound samples
  • synchronizing changes in an application model with changes to a sonification
  • connecting with diverse sources of data via protocols such as Open Sound Control (OSC) and MIDI
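To make the first goal concrete, here is a minimal sketch of additive synthesis: a tone built by summing sine-wave partials, each with its own frequency and amplitude. The names (`additiveSample`, `renderBlock`) and the partial list are illustrative only; they are not part of the Flocking or Web Audio APIs.

```javascript
const SAMPLE_RATE = 44100;

// One harmonic series: a fundamental plus two overtones at decreasing gain.
const partials = [
  { freq: 220, amp: 0.5 },
  { freq: 440, amp: 0.25 },
  { freq: 660, amp: 0.125 }
];

// Compute a single output sample at time t (in seconds) by summing partials.
function additiveSample(t, partials) {
  return partials.reduce(
    (sum, p) => sum + p.amp * Math.sin(2 * Math.PI * p.freq * t),
    0
  );
}

// Render a block of samples, the way an audio callback would.
function renderBlock(numSamples, partials) {
  const block = new Float32Array(numSamples);
  for (let i = 0; i < numSamples; i++) {
    block[i] = additiveSample(i / SAMPLE_RATE, partials);
  }
  return block;
}

const block = renderBlock(64, partials);
```

In a real Web Audio or Flocking graph this per-sample loop would run inside the audio pipeline rather than in application code, but the underlying arithmetic is the same.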

The Floe Sonification Framework is inspired by the architectural approach of Fluid Infusion, and is intended to be used in web applications running in modern web browsers.

Technologies

The Sonification Framework is based on three major technologies:

  1. The W3C's Web Audio API, which is implemented in Firefox, Chrome, and Safari (IE support is pending)
  2. Fluid Infusion
  3. Flocking, a framework for audio synthesis and music on the web

Flocking has been used by designers and artists to create interactive sonifications such as Synthstagram, which represents the usage data of Instagram users in New York City as sound.

Roadmap

The Sonification Framework is still in early development. Planned features include:

  • A new declarative, JSON-based format, the Web Audio Core Sonification Library, for blending native Web Audio nodes with custom JavaScript synthesis graphs
  • A binding between Infusion's Model Relay system and the inputs to a Flocking synth
  • Improved scheduling support for both Web Audio AudioParams and Flocking synth inputs
  • Documentation and tutorials
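The planned binding between Infusion's Model Relay system and a Flocking synth's inputs can be sketched in plain JavaScript. The objects below (`makeModel`, the `synth` stand-in, and `bindModelToSynth`) are simplified placeholders assumed for illustration; they are not the real Infusion ChangeApplier or Flocking synth APIs.

```javascript
// A tiny observable model with change listeners, standing in for an
// Infusion model and its change applier.
function makeModel(initial) {
  const listeners = [];
  const model = { ...initial };
  return {
    get: (path) => model[path],
    set(path, value) {
      model[path] = value;
      listeners.forEach((listener) => listener(path, value));
    },
    onChange: (listener) => listeners.push(listener)
  };
}

// A stand-in synth that simply records its current input values.
const synth = {
  inputs: { freq: 440, amp: 0.25 },
  set(input, value) {
    this.inputs[input] = value;
  }
};

// Relay changes at a model path to a named synth input.
function bindModelToSynth(model, path, synth, input) {
  model.onChange((changedPath, value) => {
    if (changedPath === path) {
      synth.set(input, value);
    }
  });
}

const model = makeModel({ frequency: 440 });
bindModelToSynth(model, "frequency", synth, "freq");
model.set("frequency", 880);
// The synth's "freq" input now tracks the model's "frequency" value.
```

The point of the planned feature is that this relay would be declared in configuration rather than wired up by hand, so model changes flow into the sonification automatically.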
