Interactive Machine Learning for Music
Prof. Rebecca Fiebrink
5. Musical and creative benefits of IML
5.2 Data and ML are better than math and code for communicating embodied practices to a computer
A programming language is one “interface” through which people can communicate to a computer how they would like it to behave. A training dataset containing examples of what a computer should do in response to particular human actions is an alternative “interface” for accomplishing this, and one that is arguably more natural in many contexts. When a violin teacher shows a student how to move their bow arm, or a conductor conveys to an orchestra the sound they are after, they do not communicate through mathematical functions or programming code; often, they don’t even use language. Instead, they typically use demonstrations of movement and sound, which are often the most natural ways for us to communicate about music to other people. ML allows us to use these same modalities to communicate with a computer.
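As a rough illustration of a dataset acting as an interface, the sketch below trains a small regression model on a handful of demonstrated examples that pair gesture features with a desired synthesis parameter, then asks it to respond to a new gesture. This is only a minimal sketch in Python with scikit-learn, not how the Wekinator is implemented; the feature names and numbers are invented for the example.

# Minimal sketch of "dataset as interface" (hypothetical features and values):
# a few demonstrations map gesture features to a sound-control parameter,
# and a regression model generalises from them.
from sklearn.neural_network import MLPRegressor

# Each row: made-up gesture features (e.g. hand x, hand y, movement speed)
gesture_examples = [
    [0.10, 0.80, 0.05],   # slow movement, hand held high
    [0.90, 0.20, 0.40],   # fast movement, hand held low
    [0.50, 0.50, 0.20],   # mid-range movement
]
# Desired synthesis parameter for each demonstration (normalised 0..1)
desired_outputs = [0.2, 0.9, 0.5]

# Train a small neural-network regressor on the demonstrations
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(gesture_examples, desired_outputs)

# A new gesture is mapped to a control value the performer never wrote out as code
print(model.predict([[0.30, 0.60, 0.10]]))

The point of the sketch is that the performer specifies the mapping by demonstrating input–output pairs rather than by writing the mapping function; the model interpolates between the demonstrations.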
This capability can be key for instrument builders, as illustrated by Michelle Nagai’s statement: “I have never before been able to work with a musical interface … that allowed me to really ‘feel’ the music as I was playing it and developing it. The Wekinator allowed me to approach composing with electronics and the computer more in the way I might if I was writing a piece for cello, where I would actually sit down with a cello and try things out” (quoted in Fiebrink 2011, p. 258).