Interactive Machine Learning for Music
Prof. Rebecca Fiebrink
4.4 Gabriel Vigliensoni's experiments with IML for realtime control of generative models
Much of the excitement around ML in the last decade has centred on generative ML algorithms: those capable of outputting not just a label (as in classification) or a number (as in regression), but media content itself, such as images, video, or music. In a live musical performance context, a key question is how to exercise effective musical control over a generative model. My collaborator Gabriel Vigliensoni has been exploring how IML (using both Wekinator and FluCoMa) can be used to build instruments that allow gestural control over generative models in real time (Vigliensoni and Fiebrink 2023). The following short video demonstrates how this is achieved:
Using IML to build an instrument controlling the latent space of a generative model, specifically RAVE by Caillon and Esling (2021).
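The core idea can be illustrated with a small sketch. Below, a supervised regression model (standing in for the regression models Wekinator or FluCoMa would train) is fit on a few demonstration pairs mapping gesture features to coordinates in a generative model's latent space; at performance time, new gesture readings are mapped through it to drive the model. The data, dimensions, and function names here are illustrative assumptions, not the actual system, and a real setup would pass the resulting latent vector to a decoder such as RAVE's.

```python
import numpy as np

# Illustrative demonstration pairs (assumed, not from the real system):
# 3-D gesture features paired with 2-D latent-space coordinates.
gestures = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])
latents = np.array([[0.0, 0.0],
                    [1.0, 0.0],
                    [0.0, 1.0],
                    [1.0, 1.0]])

# Fit a linear map (with a bias term) by least squares -- a minimal
# stand-in for the regression models an IML tool would train from
# the performer's demonstrations.
X = np.hstack([gestures, np.ones((len(gestures), 1))])
W, *_ = np.linalg.lstsq(X, latents, rcond=None)

def gesture_to_latent(g):
    """Map a new gesture reading to a point in latent space."""
    return np.append(g, 1.0) @ W

# In performance, this latent vector would be sent to the generative
# model's decoder to synthesise audio in real time.
z = gesture_to_latent(np.array([0.5, 0.5, 0.5]))
```

The appeal of the IML workflow is that the performer defines this mapping by example, demonstrating gestures and the latent positions they should reach, rather than by programming it explicitly.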
The following videos show Vigliensoni using this technique in performance:
Vigliensoni performing with dedosmuertos, in Paris in 2022.
Vigliensoni performing at CMMAS in 2022.