x2Gesture: how machines could learn expressive gesture variations of expert musicians

Volioti, Christina and Manitsaris, Sotiris and Katsouli, Eleni and Manitsaris, Athanasios

Proceedings of the International Conference on New Interfaces for Musical Expression

There is growing interest in ‘unlocking’ the motor skills of expert musicians. Motivated by this need, the main objective of this paper is to present a new way of modeling expressive gesture variations in musical performance. For this purpose, the 3D gesture recognition engine ‘x2Gesture’ (eXpert eXpressive Gesture) has been developed, inspired by the Gesture Variation Follower, which was initially designed and developed at IRCAM in Paris and later extended at Goldsmiths College in London. x2Gesture supports both the learning of musical gestures and live performance, through gesture sonification, as a unified user experience. A deeper understanding of expressive gestural variations makes it possible to define confidence bounds on the expert’s gestures, which are then used during the decoding phase of recognition. First experiments show promising results in terms of recognition accuracy and temporal alignment between the template and the performed gesture, leading to better fluidity and immediacy, and thus better gesture sonification.
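The abstract does not include implementation details, but the core idea it describes, a template-based follower whose decoding is constrained by confidence bounds learned from an expert's repetitions, can be illustrated with a minimal sketch. The following Python code is an assumption-laden toy in the spirit of a particle-filter gesture variation follower, not the authors' x2Gesture implementation: the helper names, the bound width `k`, and the noise scales are all illustrative choices.

```python
import numpy as np

def build_template(repetitions):
    """Average several expert repetitions (pre-resampled to equal length)
    into a mean trajectory plus a per-sample standard deviation, which
    defines the confidence bounds around the expert's gesture.
    (Hypothetical helper; shapes: each repetition is (T, dims).)"""
    reps = np.stack(repetitions)                 # (n_reps, T, dims)
    return reps.mean(axis=0), reps.std(axis=0) + 1e-6

def follow(template_mean, template_std, stream, n_particles=200, k=2.0):
    """Toy particle filter tracking the performer's progress (phase) and
    relative tempo (speed) inside the template. Observations far outside
    mean +/- k*std are down-weighted by the widened likelihood."""
    T = len(template_mean)
    rng = np.random.default_rng(0)
    phase = rng.uniform(0.0, 0.1, n_particles)   # normalized progress in [0, 1]
    speed = rng.normal(1.0, 0.2, n_particles)    # relative tempo
    w = np.full(n_particles, 1.0 / n_particles)
    for obs in stream:                           # obs: one observed sample, shape (dims,)
        # Propagate: advance each particle along the template with jitter.
        phase = np.clip(phase + speed / T + rng.normal(0, 0.01, n_particles), 0, 1)
        speed += rng.normal(0, 0.02, n_particles)
        idx = (phase * (T - 1)).astype(int)
        # Weight by likelihood under the per-sample confidence bounds.
        err = np.linalg.norm(obs - template_mean[idx], axis=1)
        sigma = k * template_std[idx].mean(axis=1)
        w *= np.exp(-0.5 * (err / sigma) ** 2) + 1e-12
        w /= w.sum()
        # Resample when the effective sample size collapses.
        if 1.0 / (w ** 2).sum() < n_particles / 2:
            keep = rng.choice(n_particles, n_particles, p=w)
            phase, speed = phase[keep], speed[keep]
            w = np.full(n_particles, 1.0 / n_particles)
        # Yield the estimated temporal alignment and tempo at this frame.
        yield float((phase * w).sum()), float((speed * w).sum())
```

The per-frame phase estimate is what provides the temporal alignment between the template and the performed gesture, and is the quantity that would drive sonification in real time; the learned per-sample spread is one plausible way to realize the "confidence bounds" the abstract mentions.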