iMe is an interactive intelligent music system, based on software agents, that learns to generate music autonomously and in real time.
iMe attempts to simulate ‘musical ontogenesis’ artificially, defined here as the sequence of events involved in the development of an individual agent from its birth to its death.
In iMe, learning involves feature extraction and segmentation:
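The segmentation step can be pictured with a minimal sketch. The boundary heuristic below (splitting an incoming note stream at large leaps or long temporal gaps) is an assumption for illustration, not iMe's actual segmentation rules, and the `segment` helper is hypothetical:

```python
def segment(notes, max_leap=7, max_gap=1.0):
    """Split a note stream into candidate segments (memes).

    notes: list of (onset_time_seconds, midi_pitch) tuples, in onset order.
    A new segment starts whenever the pitch leap exceeds max_leap semitones
    or the gap between onsets exceeds max_gap seconds (assumed heuristic).
    """
    segments, current = [], [notes[0]]
    for prev, note in zip(notes, notes[1:]):
        leap = abs(note[1] - prev[1])
        gap = note[0] - prev[0]
        if leap > max_leap or gap > max_gap:
            segments.append(current)
            current = []
        current.append(note)
    segments.append(current)
    return segments

# a short phrase, a pause, then a higher phrase -> two segments
stream = [(0.0, 60), (0.5, 62), (1.0, 64), (3.0, 72), (3.5, 74)]
print(segment(stream))  # [[(0.0, 60), (0.5, 62), (1.0, 64)], [(3.0, 72), (3.5, 74)]]
```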
Segments (or musical memes) are stored in the agent’s memories with a multi-parametric (melody direction, leap, etc.) representation:
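As a rough sketch of such a multi-parametric representation, a meme could hold the raw pitches of a segment alongside derived features such as melody direction and leap size. The `Meme` class and `extract_meme` helper are hypothetical names, and the feature set shown is only the two parameters named above:

```python
from dataclasses import dataclass

@dataclass
class Meme:
    """A musical meme: a segment stored as a set of parallel feature lists."""
    pitches: list      # MIDI pitch numbers of the segment
    directions: list   # melody direction per interval: +1 up, -1 down, 0 repeat
    leaps: list        # absolute interval size in semitones

def extract_meme(pitches):
    """Derive direction and leap features from a pitch sequence."""
    intervals = [b - a for a, b in zip(pitches, pitches[1:])]
    directions = [(i > 0) - (i < 0) for i in intervals]
    leaps = [abs(i) for i in intervals]
    return Meme(pitches=pitches, directions=directions, leaps=leaps)

meme = extract_meme([60, 62, 64, 60])  # C, D, E, C
print(meme.directions)  # [1, 1, -1]
print(meme.leaps)       # [2, 2, 4]
```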
Composition follows the ‘Compositional and Performance Map’ model:
This graph shows the evolution of an agent’s memory during a simulation in which it listened to the 12 Inventions by J. S. Bach. The x axis shows time cycles; the y axis shows the distance between consecutive memories after the execution of each listening task.
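One way to compute such a distance curve is to compare successive memory snapshots. The sketch below assumes, purely for illustration, that a memory snapshot can be summarised as a dictionary mapping meme identifiers to weights; the `memory_distance` helper and the Euclidean metric are assumptions, not iMe's documented method:

```python
import math

def memory_distance(prev, curr):
    """Euclidean distance between two memory snapshots, each a dict
    mapping a meme identifier to its weight (assumed representation)."""
    keys = set(prev) | set(curr)
    return math.sqrt(sum((curr.get(k, 0.0) - prev.get(k, 0.0)) ** 2 for k in keys))

# after each listening task, compare the updated memory with the previous one
snapshots = [
    {"m1": 1.0},
    {"m1": 1.0, "m2": 0.5},
    {"m1": 0.8, "m2": 0.9},
]
curve = [memory_distance(a, b) for a, b in zip(snapshots, snapshots[1:])]
print(curve)  # one distance value per listening task
```

Plotting `curve` against the cycle index reproduces the shape of graph described above: large values where listening reshaped the memory strongly, small values where little changed.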
iMe has also been used in public performances. Here, Marcelo Gimenes plays with an artificial agent that learns, in real time, from what he is playing:
iMe at the Peninsula Arts Contemporary Music Festival 2008 at Plymouth University
This is a more compact version of the same video (‘best moments’):
- Gimenes, M. (2013). Improved Believability in Agent-Based Computer Musical Systems Designed to Study Music Evolution. International Symposium on Computer Music Multidisciplinary Research, Marseille.
- Gimenes, M. and Miranda, E. R. (2011). Emergent Worldviews: An Ontomemetic Approach to Musical Intelligence. In E. R. Miranda (ed.), A-Life for Music: On Music and Computer Models of Living Systems. Middleton, Wisconsin: A-R Editions.
- Gimenes, M. and Miranda, E. R. (2008). An A-Life Approach to Machine Learning of Musical Worldviews for Improvisation Systems. Proceedings of the 5th Sound and Music Computing Conference, Berlin, Germany.