The projects are listed in no particular order.
-
Gestural Audification (unfunded research project)
Description: Audification is the direct transformation of data values into sound (the principle is sketched below). In this project we created a new interface, using a webcam and two colored gloves, so that navigation (scrubbing) and manipulation (filtering) of the data-driven sound stream can be performed interactively with bimanual controls. The project emerged from a ConGAS STSM of Stella Pashalidou at Bielefeld University. Contact:
Website: project website
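To illustrate the general audification principle only (not the project's actual code): a series of data values is normalized and written out directly as audio samples. A minimal Python sketch follows; the function and file names are hypothetical.

    import wave
    import numpy as np

    def audify(data, out_path="audification.wav", rate=44100):
        # Treat each data value as one audio sample: remove the DC
        # offset, normalize to [-1, 1], and write 16-bit PCM mono.
        x = np.array(data, dtype=np.float64)
        x -= x.mean()
        peak = np.max(np.abs(x))
        if peak > 0:
            x /= peak
        pcm = (x * 32767).astype(np.int16)
        with wave.open(out_path, "wb") as f:
            f.setnchannels(1)
            f.setsampwidth(2)
            f.setframerate(rate)
            f.writeframes(pcm.tobytes())

    # Example: listen to 5 seconds of a (random) measurement series.
    audify(np.random.randn(5 * 44100))

Scrubbing and filtering, as in the project, would then operate on the playback position and spectrum of this sample stream.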
-
ANVIL
Description: Development and extension of a video annotation tool that allows manual annotation of video on multiple tiers with a user-defined scheme. Extensions include spatiotemporal annotations (storing graphical mark-up of trajectories, distances, etc. on the video screen) and import/export interfaces to other tools (e.g. the FEELTRACE emotion annotation tool). Contact:
Website: project website
-
Research on violin-based interfaces (Dissertation at the University of Birmingham)
Description: Aim: development of violin-based interfaces and synthesis algorithms that allow traditionally trained string players to use their expertise to play electronic sounds. Contact:
Website: project website
-
hot_strings SIG (special interest group)
Description: hot_strings SIG is a group of people interested in developing new instruments of the violin family. These developments may use gesture tracking systems (virtual instruments), new materials that make the acoustic instrument sound different, or new speaker systems, pickup systems, signal processing methods, or synthesis methods. The group meets twice a year to present and discuss developments, and a mailing list is online. The members are instrumentalists, violin makers, composers, and researchers from various locations in Europe; they collaborate on projects such as instrument development, compositions, and concerts. Contact:
Website: not available
-
Personal Orchestra (Industry-funded exhibit development projects)
Description: A series of interactive conducting systems, developed since 2000, that use real audio and video data; it includes the first system worldwide to provide interactive conducting of real audio and video recordings. Contact:
Website: project website
-
libSTF (PhD thesis work, deployed in industry-funded exhibits)
Description: libSTF is a "Semantic Time Framework": a multimedia framework that supports media whose time base changes at run time, for applications such as audio time-stretching in the Personal Orchestra projects (the time-base idea is sketched below). Contact:
Website: project website
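To illustrate what a "changing time base" means (a hedged sketch, not the actual libSTF API; all names here are hypothetical): a time base maps wall-clock time to media time at a rate that can change while playback is running, e.g. when the conductor slows the music down.

    import time

    class TimeBase:
        # Maps real (wall-clock) time to media time. The rate is the
        # number of media seconds per real second (0.5 = half tempo).
        def __init__(self, rate=1.0):
            self.rate = rate
            self.media_anchor = 0.0              # media time at last rate change
            self.real_anchor = time.monotonic()  # real time at last rate change

        def media_time(self):
            return self.media_anchor + (time.monotonic() - self.real_anchor) * self.rate

        def set_rate(self, rate):
            # Re-anchor at the current moment so media time stays
            # continuous (no jump) across the rate change.
            self.media_anchor = self.media_time()
            self.real_anchor = time.monotonic()
            self.rate = rate

    tb = TimeBase()
    tb.set_rate(0.5)   # from now on, media time advances at half speed

Re-anchoring at each rate change keeps the mapping piecewise linear and continuous, which is what lets dependent media (audio, video) stay synchronized while their shared tempo changes.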
-
PhaVoRIT: A Phase Vocoder for Real-Time Interactive Time-Stretching (Research work, deployed in industry-funded exhibits)
Description: PhaVoRIT is currently the best open-source phase vocoder implementation available that can time-stretch audio data in real time at high quality, suitable for use with orchestral music. It can be used standalone or as a plugin to our libSTF framework, and it has been used in our Personal Orchestra projects; the underlying technique is sketched below. Contact:
Website: project website
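For readers unfamiliar with the technique: a phase vocoder time-stretches audio by spacing short overlapping spectral frames further apart (or closer together) and re-synthesizing them with accumulated per-bin phases. The textbook Python/NumPy sketch below shows the core idea only, with parameter values chosen for illustration; PhaVoRIT's real-time and quality refinements are not reproduced here.

    import numpy as np

    def stretch(x, r, n=2048, ha=512):
        # Basic phase-vocoder time-stretch of mono signal x by factor r:
        # analyze with hop ha, re-synthesize with hop hs = r * ha.
        # (Window normalization is omitted for brevity.)
        hs = int(round(r * ha))
        win = np.hanning(n)
        omega = 2 * np.pi * np.arange(n) / n      # bin frequencies (rad/sample)
        starts = range(0, len(x) - n, ha)
        y = np.zeros(len(starts) * hs + n)
        prev_phase = np.zeros(n)
        out_phase = np.zeros(n)
        for i, t in enumerate(starts):
            X = np.fft.fft(win * x[t:t + n])
            mag, phase = np.abs(X), np.angle(X)
            # Wrapped deviation from the expected phase advance gives
            # each bin's true frequency; accumulate it over hs samples.
            dev = phase - prev_phase - omega * ha
            dev = np.mod(dev + np.pi, 2 * np.pi) - np.pi
            out_phase = phase if i == 0 else out_phase + (omega + dev / ha) * hs
            prev_phase = phase
            frame = np.real(np.fft.ifft(mag * np.exp(1j * out_phase)))
            y[i * hs:i * hs + n] += frame * win   # overlap-add
        return y

    # Example: stretch one second of a 440 Hz tone to twice its length.
    sr = 44100
    tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
    longer = stretch(tone, 2.0)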
-
CLOSED (Closing the Loop of Sound Evaluation and Design) (IST STREP)
Description: When designing the sonic aspect of an artefact, the designer wants to explore a variety of what-if possibilities at the phenomenological, experiential, and emotional level. The CLOSED project provides a measurement tool capable of analyzing sounds in context at the same high level of interpretation used by designers, in a way that is closely linked to the experience of users, thereby aiming to advance the emerging discipline of sound design. The objective measurement of the functional-aesthetic sound qualities of artefacts is believed to be the key component that will allow the iterative loop of sound evaluation and design to be closed effectively.
Coordinator:
IRCAM, FR (Institut de Recherche et Coordination Acoustique/Musique)
Partners:
1. Università degli Studi di Verona, IT (VIPS Lab)
2. HGKZ, CH (Hochschule für Gestaltung und Kunst Zürich)
3. TU Berlin, DE (Technische Universität Berlin)
Contact:
Website: