x2Gesture: how machines could learn expressive gesture variations of expert musicians
Christina Volioti, Sotiris Manitsaris, Eleni Katsouli, and Athanasios Manitsaris
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2016
- Location: Brisbane, Australia
- Track: Papers
- Pages: 310–315
- DOI: 10.5281/zenodo.1176137
- PDF: http://www.nime.org/proceedings/2016/nime2016_paper0061.pdf
Abstract:
There is growing interest in 'unlocking' the motor skills of expert musicians. Motivated by this need, the main objective of this paper is to present a new way of modeling expressive gesture variations in musical performance. For this purpose, the 3D gesture recognition engine 'x2Gesture' (eXpert eXpressive Gesture) has been developed, inspired by the Gesture Variation Follower, which was initially designed and developed at IRCAM in Paris and later extended at Goldsmiths College in London. x2Gesture supports both the learning of musical gestures and live performing, through gesture sonification, as a unified user experience. A deeper understanding of the expressive gestural variations makes it possible to define the confidence bounds of the expert's gestures, which are used during the decoding phase of the recognition. The first experiments show promising results in terms of recognition accuracy and temporal alignment between the template and the performed gesture, which leads to better fluidity and immediacy of the gesture sonification.
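The abstract only sketches the decoding idea. As a rough illustration (not the authors' implementation, which the paper itself describes), the Python sketch below shows how a Gesture Variation Follower-style particle filter can jointly estimate which gesture template is being performed and its temporal alignment; the per-template `sigma` tolerance stands in for the expert confidence bounds mentioned in the abstract. All names, parameter values, and the toy usage at the end are illustrative assumptions.

```python
# Minimal GVF-style particle filter sketch (illustrative, not x2Gesture's code).
# Each particle carries a template hypothesis, a normalised phase, and a speed;
# the filter reweights particles by how well the incoming frame matches the
# template at that phase, within an assumed per-template tolerance.
import numpy as np

rng = np.random.default_rng(0)

def follow(templates, stream, n_particles=500, sigma=None):
    """templates: list of (T_k, D) arrays of feature frames;
    stream: iterable of (D,) observation frames;
    sigma: assumed per-template tolerances (stand-in for confidence bounds)."""
    K = len(templates)
    if sigma is None:
        sigma = [0.1] * K
    g = rng.integers(0, K, n_particles)          # template hypothesis per particle
    u = np.zeros(n_particles)                    # normalised phase in [0, 1]
    v = np.full(n_particles, 0.01)               # phase speed (fraction per frame)
    w = np.full(n_particles, 1.0 / n_particles)  # particle weights

    for z in stream:
        # propagate phase and speed with small random variation
        u = np.clip(u + v + rng.normal(0.0, 0.002, n_particles), 0.0, 1.0)
        v = np.abs(v + rng.normal(0.0, 0.001, n_particles))
        # reweight by likelihood of z under each particle's template and phase
        for k in range(K):
            m = g == k
            if not m.any():
                continue
            idx = (u[m] * (len(templates[k]) - 1)).astype(int)
            d2 = ((templates[k][idx] - z) ** 2).sum(axis=1)
            w[m] *= np.exp(-0.5 * d2 / sigma[k] ** 2)
        w = (w + 1e-300) / (w + 1e-300).sum()
        # resample when the effective sample size collapses
        if 1.0 / (w ** 2).sum() < n_particles / 2:
            keep = rng.choice(n_particles, n_particles, p=w)
            g, u, v = g[keep], u[keep], v[keep]
            w = np.full(n_particles, 1.0 / n_particles)
        probs = np.bincount(g, weights=w, minlength=K)
        best = int(probs.argmax())
        yield best, float(u[g == best].mean())   # recognised template + alignment

# Toy usage: two 1-D templates; the stream performs a noisy copy of the second.
t0 = np.linspace(0.0, 1.0, 50)[:, None]
t1 = np.sin(np.linspace(0.0, np.pi, 80))[:, None]
for label, phase in follow([t0, t1], t1 + rng.normal(0.0, 0.05, t1.shape)):
    pass
print(label, round(phase, 2))  # typically settles on template 1, phase near the end
```

In a live setting, the (template, phase) pair yielded per frame is what would drive the sonification: the recognised template selects the sound material while the continuously updated phase gives the temporal alignment the abstract credits for fluidity and immediacy.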
Citation:
Christina Volioti, Sotiris Manitsaris, Eleni Katsouli, and Athanasios Manitsaris. 2016. x2Gesture: how machines could learn expressive gesture variations of expert musicians. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1176137

BibTeX Entry:
@inproceedings{Volioti2016,
  abstract = {There is growing interest in `unlocking' the motor skills of expert musicians. Motivated by this need, the main objective of this paper is to present a new way of modeling expressive gesture variations in musical performance. For this purpose, the 3D gesture recognition engine `x2Gesture' (eXpert eXpressive Gesture) has been developed, inspired by the Gesture Variation Follower, which was initially designed and developed at IRCAM in Paris and later extended at Goldsmiths College in London. x2Gesture supports both the learning of musical gestures and live performing, through gesture sonification, as a unified user experience. A deeper understanding of the expressive gestural variations makes it possible to define the confidence bounds of the expert's gestures, which are used during the decoding phase of the recognition. The first experiments show promising results in terms of recognition accuracy and temporal alignment between the template and the performed gesture, which leads to better fluidity and immediacy of the gesture sonification.},
  address = {Brisbane, Australia},
  author = {Christina Volioti and Sotiris Manitsaris and Eleni Katsouli and Athanasios Manitsaris},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1176137},
  isbn = {978-1-925455-13-7},
  issn = {2220-4806},
  pages = {310--315},
  publisher = {Queensland Conservatorium Griffith University},
  title = {x2Gesture: how machines could learn expressive gesture variations of expert musicians},
  track = {Papers},
  url = {http://www.nime.org/proceedings/2016/nime2016_paper0061.pdf},
  year = {2016}
}