An Artificial Intelligence Architecture for Musical Expressiveness that Learns by Imitation
Axel Tidemann
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2011
- Location: Oslo, Norway
- Pages: 268–271
- Keywords: artificial intelligence, drumming, modeling human behaviour
- DOI: 10.5281/zenodo.1178175
- PDF: http://www.nime.org/proceedings/2011/nime2011_268.pdf
Abstract:
Interacting with musical avatars has become increasingly popular over the years, with the introduction of games like Guitar Hero and Rock Band. These games provide MIDI-equipped controllers that look like their real-world counterparts (e.g. MIDI guitar, MIDI drumkit) that the users play to control their designated avatar in the game. The performance of the user is measured against a score that needs to be followed. However, the avatar does not move in response to how the user plays; it follows some predefined movement pattern. If the user plays badly, the game ends with the avatar ending the performance (i.e. throwing the guitar on the floor). The gaming experience would improve if the avatar moved in accordance with user input. This paper presents an architecture that couples musical input with body movement. Using imitation learning, a simulated humanoid robot learns to play the drums like human drummers do, both visually and aurally. Learning data is recorded using MIDI and motion tracking. The system uses an artificial intelligence approach to implement imitation learning, employing artificial neural networks.
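To make the coupling of musical input and body movement concrete, below is a minimal sketch of the kind of mapping the abstract describes: a small feed-forward neural network trained to map a window of MIDI drum velocities to joint-angle targets for a simulated drummer. This is not the paper's architecture; the network shape, the drum/joint counts, and the synthetic stand-in data are all illustrative assumptions (the paper records real MIDI and motion-tracking pairs).

# Hypothetical sketch (not from the paper): map MIDI drum-velocity
# windows to joint angles with a one-hidden-layer network. All sizes,
# names, and the synthetic training data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_DRUMS, WINDOW, N_JOINTS, HIDDEN = 4, 8, 6, 32
IN_DIM = N_DRUMS * WINDOW  # flattened velocity window

# Synthetic stand-in for recorded MIDI + motion-capture training pairs.
X = rng.random((256, IN_DIM))            # MIDI velocities in [0, 1]
Y = rng.random((256, N_JOINTS)) * 2 - 1  # joint angles scaled to [-1, 1]

# One hidden tanh layer, linear output.
W1 = rng.normal(0, 0.1, (IN_DIM, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0, 0.1, (HIDDEN, N_JOINTS))
b2 = np.zeros(N_JOINTS)

lr = 0.05
for epoch in range(200):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                # predicted joint angles
    err = pred - Y                    # gradient of mean-squared error
    # Backpropagation by hand.
    dW2 = h.T @ err / len(X)
    db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)  # tanh derivative
    dW1 = X.T @ dh / len(X)
    db1 = dh.mean(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

mse = float(((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2).mean())
print(f"training MSE: {mse:.4f}")

At playback time, such a mapping would be driven by live MIDI input, so the avatar's motion follows what the user actually plays rather than a predefined movement pattern.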
Citation:
Axel Tidemann. 2011. An Artificial Intelligence Architecture for Musical Expressiveness that Learns by Imitation. Proceedings of the International Conference on New Interfaces for Musical Expression, Oslo, Norway, pp. 268–271. DOI: 10.5281/zenodo.1178175
BibTeX Entry:
@inproceedings{Tidemann2011,
  abstract = {Interacting with musical avatars has become increasingly popular over the years, with the introduction of games like Guitar Hero and Rock Band. These games provide MIDI-equipped controllers that look like their real-world counterparts (e.g. MIDI guitar, MIDI drumkit) that the users play to control their designated avatar in the game. The performance of the user is measured against a score that needs to be followed. However, the avatar does not move in response to how the user plays; it follows some predefined movement pattern. If the user plays badly, the game ends with the avatar ending the performance (i.e. throwing the guitar on the floor). The gaming experience would improve if the avatar moved in accordance with user input. This paper presents an architecture that couples musical input with body movement. Using imitation learning, a simulated humanoid robot learns to play the drums like human drummers do, both visually and aurally. Learning data is recorded using MIDI and motion tracking. The system uses an artificial intelligence approach to implement imitation learning, employing artificial neural networks.},
  address = {Oslo, Norway},
  author = {Tidemann, Axel},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1178175},
  issn = {2220-4806},
  keywords = {artificial intelligence, drumming, modeling human behaviour},
  pages = {268--271},
  title = {An Artificial Intelligence Architecture for Musical Expressiveness that Learns by Imitation},
  url = {http://www.nime.org/proceedings/2011/nime2011_268.pdf},
  year = {2011}
}