Interactive Machine Learning for Music

Prof. Rebecca Fiebrink

References

Bencina, Ross. “The metasurface: applying natural neighbour interpolation to two-to-many mapping.” In Proceedings of the International Conference on New Interfaces for Musical Expression, pp. 101-104. 2005.

Bevilacqua, Frédéric, Rémy Müller, and Norbert Schnell. “MnM: a Max/MSP mapping toolbox.” In Proceedings of the International Conference on New Interfaces for Musical Expression, pp. 85-88. 2005.

Bongers, Bert. “Physical interfaces in the electronic arts.” Trends in Gestural Control of Music (2000): 41-70.

Breiman, Leo. “Random forests.” Machine Learning 45, no. 1 (2001): 5-32.

Caillon, Antoine, and Philippe Esling. “RAVE: A variational autoencoder for fast and high-quality neural audio synthesis.” arXiv preprint arXiv:2111.05011 (2021).

Dahl, Luke. “Wicked problems and design considerations in composing for laptop orchestra.” In Proceedings of the International Conference on New Interfaces for Musical Expression. 2012.

Fails, Jerry Alan, and Dan R. Olsen Jr. “Interactive machine learning.” In Proceedings of the 8th International Conference on Intelligent User Interfaces, pp. 39-45. 2003.

Fiebrink, Rebecca. Real-time human interaction with supervised learning algorithms for music composition and performance. PhD thesis, Princeton University, 2011.

Fiebrink, Rebecca, and Laetitia Sonami. “Reflections on eight years of instrument creation with machine learning.” In Proceedings of the International Conference on New Interfaces for Musical Expression. 2020.

Fiebrink, Rebecca, Daniel Trueman, and Perry R. Cook. “A meta-instrument for interactive, on-the-fly machine learning.” In Proceedings of the International Conference on New Interfaces for Musical Expression. 2009.

Hunt, Andy, Marcelo M. Wanderley, and Matthew Paradis. “The importance of parameter mapping in electronic instrument design.” Journal of New Music Research 32, no. 4 (2003): 429-440.

Lee, Michael, Adrian Freed, and David Wessel. “Real-time neural network processing of gestural and acoustic signals.” In Proceedings of the International Computer Music Conference, p. 277. 1991.

Mathews, Max V. “The radio baton and conductor program, or: Pitch, the most important and least expressive part of music.” Computer Music Journal 15, no. 4 (1991): 37-46.

Parke-Wolfe, Samuel Thompson, Hugo Scurto, and Rebecca Fiebrink. “Sound control: Supporting custom musical interface design for children with disabilities.” In Proceedings of the International Conference on New Interfaces for Musical Expression. 2019.

Stiefel, Van, Dan Trueman, and Perry Cook. “Re-coupling: the uBlotar Synthesis Instrument and the sHowl Speaker-feedback Controller.” In Proceedings of the International Computer Music Conference. 2004.

Vigliensoni, Gabriel, and Rebecca Fiebrink. “Steering latent audio models through interactive machine learning.” In Proceedings of the International Conference on Computational Creativity. 2023.