The Interactive Music Environments (iMe)

iMe is an interactive, intelligent music system based on software agents that learn to generate music autonomously and in real time.

Musical ontogenesis

iMe attempts to simulate 'musical ontogenesis' artificially, defined here as the sequence of events involved in the development of an individual agent from its birth to its death.

[Figure: musical ontogenesis of an agent]
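The lifecycle behind this definition can be pictured as a simple loop. The sketch below is illustrative only (none of these names come from the iMe codebase): an agent is created, passes through interaction cycles in which it listens and composes, and finally terminates.

```python
# Minimal sketch (not the iMe codebase): "ontogenesis" as a lifecycle loop.
class Agent:
    def __init__(self, name):
        self.name = name          # identity assigned at "birth"
        self.memory = []          # material accumulated during its lifetime

    def listen(self, music):
        """Learn from incoming material (placeholder)."""
        self.memory.append(music)

    def compose(self):
        """Produce output from what has been learned (placeholder)."""
        return list(self.memory)

def live(agent, stimuli):
    """Run the agent from birth to death over a sequence of time cycles."""
    for music in stimuli:
        agent.listen(music)       # development: each cycle reshapes memory
        agent.compose()
    # the end of the loop marks the agent's "death"

live(Agent("a1"), [["C4", "D4"], ["E4", "G4"]])
```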

Learning

In iMe, learning involves feature extraction and segmentation:

[Figure: feature extraction and segmentation]
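As an illustration of this step, the sketch below segments a stream of note events wherever a long gap occurs and extracts simple melodic features from each segment. It is a minimal sketch under assumed conventions (MIDI pitch numbers, onsets and durations in beats), not iMe's actual algorithm.

```python
# Hedged illustration: segment a note sequence at long inter-onset gaps,
# then extract simple melodic features from each segment.
from dataclasses import dataclass

@dataclass
class Note:
    pitch: int       # MIDI pitch number
    onset: float     # onset time in beats
    duration: float  # duration in beats

def segment(notes, gap=1.0):
    """Split a note list into segments wherever the gap between notes >= `gap`."""
    segments, current = [], []
    for note in notes:
        if current and note.onset - (current[-1].onset + current[-1].duration) >= gap:
            segments.append(current)
            current = []
        current.append(note)
    if current:
        segments.append(current)
    return segments

def extract_features(segment_notes):
    """Per-segment features: melodic intervals and their directions."""
    pitches = [n.pitch for n in segment_notes]
    intervals = [b - a for a, b in zip(pitches, pitches[1:])]
    directions = [(i > 0) - (i < 0) for i in intervals]  # +1 up, -1 down, 0 repeat
    return {"intervals": intervals, "directions": directions}

notes = [Note(60, 0, 1), Note(62, 1, 1), Note(64, 2, 1), Note(67, 4, 1), Note(65, 5, 1)]
for seg in segment(notes):
    print(extract_features(seg))
```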

Memes

Segments (or musical memes) are stored in the agent's memory with a multi-parametric representation (melodic direction, leap size, etc.):

[Figure: multi-parametric representation of a musical meme]
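A meme with this kind of multi-parametric representation could be modelled roughly as follows; the parameter set shown (direction, leap size, duration) is assumed from the description above and may differ from the one used in iMe.

```python
# Sketch of a multi-parametric meme; the exact parameters are assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Meme:
    directions: List[int] = field(default_factory=list)   # +1 up, -1 down, 0 repeat
    leaps: List[int] = field(default_factory=list)         # interval sizes in semitones
    durations: List[float] = field(default_factory=list)   # note durations in beats

    @classmethod
    def from_notes(cls, pitches, durations):
        intervals = [b - a for a, b in zip(pitches, pitches[1:])]
        return cls(
            directions=[(i > 0) - (i < 0) for i in intervals],
            leaps=[abs(i) for i in intervals],
            durations=list(durations),
        )

# An agent's memory as a simple list of memes (hypothetical bookkeeping).
memory = []
meme = Meme.from_notes([60, 64, 62, 67], [1.0, 0.5, 0.5, 2.0])
memory.append(meme)
print(meme)
```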

Composition

Composition follows the ‘Compositional and Performance Map’ model:

[Figure: Compositional and Performance Map]
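The Compositional and Performance Map model itself is not detailed on this page. Purely as an illustration of agent-based generation, the sketch below composes by chaining interval-pattern memes drawn from a weighted memory; it should not be read as the actual model.

```python
# Illustrative only: compose by chaining memes sampled from a weighted memory.
import random

def compose(memory, weights, length=4, start_pitch=60, seed=0):
    """Concatenate `length` memes into a pitch sequence.

    `memory`  -- list of memes, each a list of signed intervals in semitones
    `weights` -- relative importance of each meme (e.g. reinforcement counts)
    """
    rng = random.Random(seed)
    pitch, output = start_pitch, [start_pitch]
    for _ in range(length):
        meme = rng.choices(memory, weights=weights, k=1)[0]
        for interval in meme:
            pitch += interval
            output.append(pitch)
    return output

memory = [[2, 2, 1], [-1, -2], [4, -2, -2]]   # memes as interval patterns
print(compose(memory, weights=[3, 1, 2]))
```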

Applications

Musicology

This graph shows the evolution of an agent's memory during a simulation in which it listened to the 12 Inventions by J.S. Bach. The 'x' axis shows the time cycles and the 'y' axis the distance between consecutive memory states after each listening task.

[Figure: distance between consecutive memory states across time cycles]
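The analysis can be reproduced in outline: after each listening task, snapshot the memory, compare it with the previous snapshot, and record the distance against the time cycle. The distance measure below (Euclidean distance over normalised interval histograms) is an assumption for illustration, not necessarily the measure used in iMe.

```python
# Hedged sketch of the memory-evolution analysis; the metric is assumed.
import math
from collections import Counter

def interval_histogram(memory, max_interval=12):
    """Normalised histogram of absolute melodic intervals over all memes."""
    counts = Counter(abs(i) for meme in memory for i in meme)
    total = sum(counts.values()) or 1
    return [counts.get(i, 0) / total for i in range(max_interval + 1)]

def distance(h1, h2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(h1, h2)))

# memory grows as the agent listens to one piece per time cycle
memory, previous, trajectory = [], None, []
pieces = [[[2, 2, 1]], [[-1, -2], [4, -2]], [[2, 2, 1], [5, -5]]]
for cycle, piece in enumerate(pieces, start=1):
    memory.extend(piece)                      # learn the new memes
    current = interval_histogram(memory)
    if previous is not None:
        trajectory.append((cycle, distance(previous, current)))
    previous = current
print(trajectory)   # (time cycle, memory change) pairs, as on the graph's axes
```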

Performance

iMe has also been used in public performances. Here, Marcelo Gimenes plays with an artificial agent that learns from what he is playing in real time:

iMe at the Peninsula Arts Contemporary Music Festival 2008 at Plymouth University

This is a more compact version of the same video (‘best moments’):

Publications