Crowd-driven Music: Interactive and Generative Approaches using Machine Vision and Manhattan
Chris Nash
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2020
- Location: Birmingham, UK
- Pages: 259–264
- DOI: 10.5281/zenodo.4813346
- PDF: https://www.nime.org/proceedings/2020/nime2020_paper49.pdf
- Presentation video: https://youtu.be/DHIowP2lOsA
Abstract:
This paper details technologies and artistic approaches to crowd-driven music, discussed in the context of a live public installation in which activity in a public space (a busy railway platform) is used to drive the automated composition and performance of music. The approach presented uses realtime machine vision applied to a live video feed of a scene, from which detected objects and people are fed into Manhattan (Nash, 2014), a digital music notation that integrates sequencing and programming to support the live creation of complex musical works that combine static, algorithmic, and interactive elements. The paper discusses the technical details of the system and artistic development of specific musical works, introducing novel techniques for mapping chaotic systems to musical expression and exploring issues of agency, aesthetic, accessibility and adaptability relating to composing interactive music for crowds and public spaces. In particular, performances as part of an installation for BBC Music Day 2018 are described. The paper subsequently details a practical workshop, delivered digitally, exploring the development of interactive performances in which the audience or general public actively or passively control live generation of a musical piece. Exercises support discussions on technical, aesthetic, and ontological issues arising from the identification and mapping of structure, order, and meaning in non-musical domains to analogous concepts in musical expression. Materials for the workshop are available freely with the Manhattan software.
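Illustrative sketch (not the paper's Manhattan integration): the abstract describes a pipeline in which realtime machine vision on a live video feed detects people and objects, and those detections drive musical parameters. The snippet below shows one minimal way such a mapping could work, assuming OpenCV's built-in HOG pedestrian detector and the mido MIDI library; the musical mapping (C major scale, velocity from crowd size) is an arbitrary example, not the mapping used in the installation.

import cv2
import mido

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]   # C major, one octave (MIDI note numbers)

# OpenCV's default HOG-based people detector (a stand-in for the paper's vision stage)
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)        # live camera feed
out = mido.open_output()         # default MIDI output port
last_note = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (640, 360))
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))

    if len(boxes) > 0:
        # Mean horizontal position of detected people selects a scale degree;
        # the number of people sets the note velocity (louder for busier scenes).
        mean_x = sum(x + w / 2 for (x, y, w, h) in boxes) / len(boxes)
        degree = int(mean_x / 640 * len(SCALE)) % len(SCALE)
        velocity = min(127, 40 + 10 * len(boxes))
        if last_note is not None:
            out.send(mido.Message('note_off', note=last_note))
        last_note = SCALE[degree]
        out.send(mido.Message('note_on', note=last_note, velocity=velocity))

    cv2.imshow('crowd', frame)
    if cv2.waitKey(100) & 0xFF == ord('q'):   # roughly 10 updates per second
        break

cap.release()
cv2.destroyAllWindows()

In the system described in the paper, the detections would instead be fed into Manhattan, whose notation combines static, algorithmic, and interactive elements, rather than being mapped directly to MIDI as in this sketch.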
Citation:
Chris Nash. 2020. Crowd-driven Music: Interactive and Generative Approaches using Machine Vision and Manhattan. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.4813346
BibTeX Entry:
@inproceedings{NIME20_49,
  abstract = {This paper details technologies and artistic approaches to crowd-driven music, discussed in the context of a live public installation in which activity in a public space (a busy railway platform) is used to drive the automated composition and performance of music. The approach presented uses realtime machine vision applied to a live video feed of a scene, from which detected objects and people are fed into Manhattan (Nash, 2014), a digital music notation that integrates sequencing and programming to support the live creation of complex musical works that combine static, algorithmic, and interactive elements. The paper discusses the technical details of the system and artistic development of specific musical works, introducing novel techniques for mapping chaotic systems to musical expression and exploring issues of agency, aesthetic, accessibility and adaptability relating to composing interactive music for crowds and public spaces. In particular, performances as part of an installation for BBC Music Day 2018 are described. The paper subsequently details a practical workshop, delivered digitally, exploring the development of interactive performances in which the audience or general public actively or passively control live generation of a musical piece. Exercises support discussions on technical, aesthetic, and ontological issues arising from the identification and mapping of structure, order, and meaning in non-musical domains to analogous concepts in musical expression. Materials for the workshop are available freely with the Manhattan software.},
  address = {Birmingham, UK},
  author = {Nash, Chris},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.4813346},
  editor = {Romain Michon and Franziska Schroeder},
  issn = {2220-4806},
  month = {July},
  pages = {259--264},
  presentation-video = {https://youtu.be/DHIowP2lOsA},
  publisher = {Birmingham City University},
  title = {Crowd-driven Music: Interactive and Generative Approaches using Machine Vision and Manhattan},
  url = {https://www.nime.org/proceedings/2020/nime2020_paper49.pdf},
  year = {2020}
}