Enabling Multimodal Mobile Interfaces for Musical Performance
Charles Roberts, Angus Forbes, and Tobias Höllerer
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2013
- Location: Daejeon, Republic of Korea
- Pages: 102–105
- Keywords: Music, mobile, multimodal, interaction
- DOI: 10.5281/zenodo.1178646
Abstract:
We present research that extends the scope of the mobile application Control, a prototyping environment for defining multimodal interfaces that control real-time artistic and musical performances. Control allows users to rapidly create interfaces employing a variety of modalities, including: speech recognition, computer vision, musical feature extraction, touchscreen widgets, and inertial sensor data. Information from these modalities can be transmitted wirelessly to remote applications. Interfaces are declared using JSON and can be extended with JavaScript to add complex behaviors, including the concurrent fusion of multimodal signals. By simplifying the creation of interfaces via these simple markup files, Control allows musicians and artists to make novel applications that use and combine both discrete and continuous data from the wide range of sensors available on commodity mobile devices.
Citation:
Charles Roberts, Angus Forbes, and Tobias Höllerer. 2013. Enabling Multimodal Mobile Interfaces for Musical Performance. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1178646
BibTeX Entry:
@inproceedings{Roberts2013,
  abstract = {We present research that extends the scope of the mobile application Control, a prototyping environment for defining multimodal interfaces that control real-time artistic and musical performances. Control allows users to rapidly create interfaces employing a variety of modalities, including: speech recognition, computer vision, musical feature extraction, touchscreen widgets, and inertial sensor data. Information from these modalities can be transmitted wirelessly to remote applications. Interfaces are declared using JSON and can be extended with JavaScript to add complex behaviors, including the concurrent fusion of multimodal signals. By simplifying the creation of interfaces via these simple markup files, Control allows musicians and artists to make novel applications that use and combine both discrete and continuous data from the wide range of sensors available on commodity mobile devices.},
  address = {Daejeon, Republic of Korea},
  author = {Charles Roberts and Angus Forbes and Tobias H{\"o}llerer},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1178646},
  issn = {2220-4806},
  keywords = {Music, mobile, multimodal, interaction},
  month = {May},
  pages = {102--105},
  publisher = {Graduate School of Culture Technology, KAIST},
  title = {Enabling Multimodal Mobile Interfaces for Musical Performance},
  url = {http://www.nime.org/proceedings/2013/nime2013_303.pdf},
  year = {2013}
}