Mobile Controls On-The-Fly: An Abstraction for Distributed NIMEs
Charles Roberts, Graham Wakefield, and Matt Wright
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2012
- Location: Ann Arbor, Michigan
- Keywords: NIME, OSC, Zeroconf, iOS, Android, Max/MSP/Jitter, LuaAV, SuperCollider, Mobile
- DOI: 10.5281/zenodo.1180581
- PDF: http://www.nime.org/proceedings/2012/nime2012_303.pdf
Abstract:
Designing mobile interfaces for computer-based musical performance is generally a time-consuming task that can be exasperating for performers. Instead of being able to experiment freely with physical interfaces' affordances, performers must spend time and attention on non-musical tasks including network configuration, development environments for the mobile devices, defining OSC address spaces, and handling the receipt of OSC in the environment that will control and produce sound. Our research seeks to overcome such obstacles by minimizing the code needed to both generate and read the output of interfaces on mobile devices. For iOS and Android devices, our implementation extends the application Control to use a simple set of OSC messages to define interfaces and automatically route output. On the desktop, our implementations in Max/MSP/Jitter, LuaAV, and SuperCollider allow users to create mobile widgets mapped to sonic parameters with a single line of code. We believe the fluidity of our approach will encourage users to incorporate mobile devices into their everyday performance practice.
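The abstract does not reproduce the OSC address space itself, but the general idea can be sketched in SuperCollider. In the sketch below, the message name ("/addWidget"), the device address and port, and the widget arguments are illustrative assumptions, not the paper's confirmed API:

```supercollider
// A minimal sketch of the idea, assuming a hypothetical Control-style
// OSC address space; the paper defines the actual messages.
(
var phone = NetAddr("192.168.1.20", 8080); // assumed device IP and port

// Ask the mobile device to create a slider widget over OSC.
// "/addWidget" and its arguments are placeholders for illustration.
phone.sendMsg("/addWidget", "Slider", "freq", 0.0, 1.0);

// Receive the widget's output and map it to a frequency range.
OSCdef(\freqCtl, { |msg|
    var freq = msg[1].linexp(0, 1, 100, 2000);
    freq.postln; // in practice one would set a synth parameter here
}, '/freq');
)
```

The appeal of the approach described in the abstract is that both halves of this exchange (widget definition and output routing) collapse into a single line on the desktop side, with network discovery handled via Zeroconf rather than manual configuration.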
Citation:
Charles Roberts, Graham Wakefield, and Matt Wright. 2012. Mobile Controls On-The-Fly: An Abstraction for Distributed NIMEs. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1180581
BibTeX Entry:
@inproceedings{Roberts2012,
  abstract = {Designing mobile interfaces for computer-based musical performance is generally a time-consuming task that can be exasperating for performers. Instead of being able to experiment freely with physical interfaces' affordances, performers must spend time and attention on non-musical tasks including network configuration, development environments for the mobile devices, defining OSC address spaces, and handling the receipt of OSC in the environment that will control and produce sound. Our research seeks to overcome such obstacles by minimizing the code needed to both generate and read the output of interfaces on mobile devices. For iOS and Android devices, our implementation extends the application Control to use a simple set of OSC messages to define interfaces and automatically route output. On the desktop, our implementations in Max/MSP/Jitter, LuaAV, and SuperCollider allow users to create mobile widgets mapped to sonic parameters with a single line of code. We believe the fluidity of our approach will encourage users to incorporate mobile devices into their everyday performance practice.},
  address = {Ann Arbor, Michigan},
  author = {Charles Roberts and Graham Wakefield and Matt Wright},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1180581},
  issn = {2220-4806},
  keywords = {NIME, OSC, Zeroconf, iOS, Android, Max/MSP/Jitter, LuaAV, SuperCollider, Mobile},
  publisher = {University of Michigan},
  title = {Mobile Controls On-The-Fly: An Abstraction for Distributed {NIME}s},
  url = {http://www.nime.org/proceedings/2012/nime2012_303.pdf},
  year = {2012}
}