Duet Interaction: Learning Musicianship for Automatic Accompaniment
Guangyu Xia and Roger Dannenberg
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2015
- Location: Baton Rouge, Louisiana, USA
- Pages: 259–264
- DOI: 10.5281/zenodo.1179198
- PDF: http://www.nime.org/proceedings/2015/nime2015_202.pdf
Abstract:
Computer music systems can interact with humans at different levels, including scores, phrases, notes, beats, and gestures. However, most current systems lack basic musicianship skills. As a consequence, the results of human-computer interaction are often far less musical than the interaction between human musicians. In this paper, we explore the possibility of learning some basic music performance skills from rehearsal data. In particular, we consider the piano duet scenario where two musicians expressively interact with each other. Our work extends previous automatic accompaniment systems. We have built an artificial pianist that can automatically improve its ability to sense and coordinate with a human pianist, learning from rehearsal experience. We describe different machine learning algorithms to learn musicianship for duet interaction, explore the properties of the learned models, such as dominant features, limits of validity, and minimal training size, and claim that a more human-like interaction is achieved.
Citation:
Guangyu Xia and Roger Dannenberg. 2015. Duet Interaction: Learning Musicianship for Automatic Accompaniment. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1179198
BibTeX Entry:
@inproceedings{rdannenbergc2015,
  abstract = {Computer music systems can interact with humans at different levels, including scores, phrases, notes, beats, and gestures. However, most current systems lack basic musicianship skills. As a consequence, the results of human-computer interaction are often far less musical than the interaction between human musicians. In this paper, we explore the possibility of learning some basic music performance skills from rehearsal data. In particular, we consider the piano duet scenario where two musicians expressively interact with each other. Our work extends previous automatic accompaniment systems. We have built an artificial pianist that can automatically improve its ability to sense and coordinate with a human pianist, learning from rehearsal experience. We describe different machine learning algorithms to learn musicianship for duet interaction, explore the properties of the learned models, such as dominant features, limits of validity, and minimal training size, and claim that a more human-like interaction is achieved.},
  address = {Baton Rouge, Louisiana, USA},
  author = {Guangyu Xia and Roger Dannenberg},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1179198},
  editor = {Edgar Berdahl and Jesse Allison},
  issn = {2220-4806},
  month = {May},
  pages = {259--264},
  publisher = {Louisiana State University},
  title = {Duet Interaction: Learning Musicianship for Automatic Accompaniment},
  url = {http://www.nime.org/proceedings/2015/nime2015_202.pdf},
  year = {2015}
}