Introducing Locus: a NIME for Immersive Exocentric Aural Environments
Disha Sardana, Woohun Joo, Ivica Ico Bukvic, and Greg Earle
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2019
- Location: Porto Alegre, Brazil
- Pages: 250–255
- DOI: 10.5281/zenodo.3672946
- PDF: http://www.nime.org/proceedings/2019/nime2019_paper048.pdf
Abstract:
Locus is a NIME designed specifically for an interactive, immersive high density loudspeaker array environment. The system is based on a pointing mechanism to interact with a sound scene comprising 128 speakers. Users can point anywhere to interact with the system, and the spatial interaction utilizes motion capture, so it does not require a screen. Instead, it is completely controlled via hand gestures using a glove that is populated with motion-tracking markers. The main purpose of this system is to offer intuitive physical interaction with the perimeter-based spatial sound sources. Further, its goal is to minimize user-worn technology and thereby enhance freedom of motion by utilizing environmental sensing devices, such as motion capture cameras or infrared sensors. The ensuing creativity enabling technology is applicable to a broad array of possible scenarios, from researching limits of human spatial hearing perception to facilitating learning and artistic performances, including dance. In this paper, we describe our NIME design and implementation, its preliminary assessment, and offer a Unity-based toolkit to facilitate its broader deployment and adoption.
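The full paper details the Locus design and its Unity-based toolkit; purely as an illustration of the exocentric pointing idea summarized above, the sketch below selects the perimeter speaker whose direction is closest in angle to a pointing ray derived from tracked hand markers. The circular 128-speaker layout, the 5 m radius, and the angle-based selection rule are assumptions made for this example only, not the authors' actual implementation.

```python
import numpy as np

# Hypothetical perimeter layout: 128 speakers on a circle of radius 5 m at 2 m height.
# The real loudspeaker geometry used with Locus is not specified in the abstract.
N_SPEAKERS = 128
angles = np.linspace(0.0, 2.0 * np.pi, N_SPEAKERS, endpoint=False)
speakers = np.stack([5.0 * np.cos(angles),
                     5.0 * np.sin(angles),
                     np.full(N_SPEAKERS, 2.0)], axis=1)  # x, y, z in metres

def pointed_speaker(hand_pos, hand_dir, speakers):
    """Return the index of the speaker whose direction from the hand
    is closest (smallest angle) to the pointing direction."""
    hand_dir = hand_dir / np.linalg.norm(hand_dir)
    to_speakers = speakers - hand_pos                     # vectors hand -> speaker
    to_speakers /= np.linalg.norm(to_speakers, axis=1, keepdims=True)
    cosines = to_speakers @ hand_dir                      # cos(angle) to each speaker
    return int(np.argmax(cosines))

# Example: a user near the centre of the array points roughly along +x.
hand_pos = np.array([0.2, -0.1, 1.5])   # e.g. centroid of glove markers from mocap
hand_dir = np.array([1.0, 0.0, 0.1])    # e.g. wrist-to-knuckle direction
print(pointed_speaker(hand_pos, hand_dir, speakers))      # index of the targeted speaker
```

In a screen-free setup like the one described, the hand position and pointing direction would come from the motion-capture system each frame, and the selected speaker index would drive the spatial sound scene; the simple nearest-angle rule here stands in for whatever mapping the toolkit actually uses.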
Citation:
Disha Sardana, Woohun Joo, Ivica Ico Bukvic, and Greg Earle. 2019. Introducing Locus: a NIME for Immersive Exocentric Aural Environments. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.3672946
BibTeX Entry:
@inproceedings{Sardana2019,
  abstract = {Locus is a NIME designed specifically for an interactive, immersive high density loudspeaker array environment. The system is based on a pointing mechanism to interact with a sound scene comprising 128 speakers. Users can point anywhere to interact with the system, and the spatial interaction utilizes motion capture, so it does not require a screen. Instead, it is completely controlled via hand gestures using a glove that is populated with motion-tracking markers. The main purpose of this system is to offer intuitive physical interaction with the perimeter-based spatial sound sources. Further, its goal is to minimize user-worn technology and thereby enhance freedom of motion by utilizing environmental sensing devices, such as motion capture cameras or infrared sensors. The ensuing creativity enabling technology is applicable to a broad array of possible scenarios, from researching limits of human spatial hearing perception to facilitating learning and artistic performances, including dance. In this paper, we describe our NIME design and implementation, its preliminary assessment, and offer a Unity-based toolkit to facilitate its broader deployment and adoption.},
  address = {Porto Alegre, Brazil},
  author = {Disha Sardana and Woohun Joo and Ivica Ico Bukvic and Greg Earle},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.3672946},
  editor = {Marcelo Queiroz and Anna Xambó Sedó},
  issn = {2220-4806},
  month = {June},
  pages = {250--255},
  publisher = {UFRGS},
  title = {Introducing Locus: a {NIME} for Immersive Exocentric Aural Environments},
  url = {http://www.nime.org/proceedings/2019/nime2019_paper048.pdf},
  year = {2019}
}