SpectraScore VR: Networkable virtual reality software tools for real-time composition and performance
Benedict Carey
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2016
- Location: Brisbane, Australia
- Track: Demonstrations
- Pages: 3–4
- DOI: 10.5281/zenodo.1176004
- PDF: http://www.nime.org/proceedings/2016/nime2016_paper00022.pdf
Abstract:
This paper describes a package of modular tools developed for use with virtual reality peripherals to allow for music composition, performance and viewing in 'real-time' across networks within a spectralist paradigm. The central tool is SpectraScore, a Max/MSP abstraction for analysing audio signals and ranking the resultant partials according to their harmonic pitch class profiles. This data triggers the generation of objects in a virtual world based on the 'topography' of the source sound, which is experienced by network clients via Google Cardboard headsets. They use their movements to trigger audio in various microtonal tunings and incidentally generate scores. These scores are transmitted to performers who improvise music from this notation using Leap Motion Theremins, also in VR space. Finally, the performance is broadcast via a web audio stream, which can be heard by the composer-audience in the initial virtual world. The 'real-time composers' and performers are not required to have any prior knowledge of complex computer systems and interact either using head position tracking or with an Oculus Rift DK2 and a Leap Motion Camera.
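As a rough illustration of the analysis stage described in the abstract (detecting partials in an audio signal and ranking them by harmonic pitch class profile), the sketch below approximates the idea in Python. It assumes a simple FFT peak picker and a 12-bin pitch class mapping relative to A = 440 Hz; the function names and parameters are illustrative and are not the paper's Max/MSP abstraction.

```python
# Hypothetical sketch: find the strongest spectral peaks (partials) in a buffer
# and rank them by how much energy their pitch class contributes to a 12-bin
# harmonic pitch class profile. Illustrative only, not the SpectraScore code.
import numpy as np

def detect_partials(signal, sample_rate, num_partials=16):
    """Return (frequency, magnitude) pairs for the strongest spectral peaks."""
    window = np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(signal * window))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    # Simple local-maximum peak picking, strongest peaks first.
    peaks = [i for i in range(1, len(spectrum) - 1)
             if spectrum[i] > spectrum[i - 1] and spectrum[i] > spectrum[i + 1]]
    peaks.sort(key=lambda i: spectrum[i], reverse=True)
    return [(freqs[i], spectrum[i]) for i in peaks[:num_partials]]

def rank_by_pitch_class_profile(partials, ref_freq=440.0):
    """Accumulate partial magnitudes into 12 pitch classes, then rank each
    partial by the total energy of its pitch class (ties broken by magnitude)."""
    profile = np.zeros(12)
    pitch_classes = []
    for freq, mag in partials:
        pc = int(round(12 * np.log2(freq / ref_freq))) % 12
        profile[pc] += mag
        pitch_classes.append(pc)
    ranked = sorted(zip(partials, pitch_classes),
                    key=lambda item: (profile[item[1]], item[0][1]),
                    reverse=True)
    return ranked, profile

if __name__ == "__main__":
    sr = 44100
    t = np.arange(sr) / sr
    # Test tone: a fundamental at 220 Hz plus two harmonics.
    test = sum(a * np.sin(2 * np.pi * f * t)
               for a, f in [(1.0, 220.0), (0.5, 440.0), (0.3, 660.0)])
    partials = detect_partials(test, sr)
    ranked, profile = rank_by_pitch_class_profile(partials)
    for (freq, mag), pc in ranked[:5]:
        print(f"{freq:7.1f} Hz  pitch class {pc:2d}  magnitude {mag:8.1f}")
```

In the system described above, the output of this kind of ranking would drive the generation of objects in the shared virtual world rather than a printed list.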
Citation:
Benedict Carey. 2016. SpectraScore VR: Networkable virtual reality software tools for real-time composition and performance. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1176004
BibTeX Entry:
@inproceedings{Carey2016a,
  abstract = {This paper describes a package of modular tools developed for use with virtual reality peripherals to allow for music composition, performance and viewing in `real-time' across networks within a spectralist paradigm. The central tool is SpectraScore, a Max/MSP abstraction for analysing audio signals and ranking the resultant partials according to their harmonic pitch class profiles. This data triggers the generation of objects in a virtual world based on the `topography' of the source sound, which is experienced by network clients via Google Cardboard headsets. They use their movements to trigger audio in various microtonal tunings and incidentally generate scores. These scores are transmitted to performers who improvise music from this notation using Leap Motion Theremins, also in VR space. Finally, the performance is broadcast via a web audio stream, which can be heard by the composer-audience in the initial virtual world. The `real-time composers' and performers are not required to have any prior knowledge of complex computer systems and interact either using head position tracking or with an Oculus Rift DK2 and a Leap Motion Camera.},
  address = {Brisbane, Australia},
  author = {Benedict Carey},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1176004},
  isbn = {978-1-925455-13-7},
  issn = {2220-4806},
  pages = {3--4},
  publisher = {Queensland Conservatorium Griffith University},
  title = {SpectraScore VR: Networkable virtual reality software tools for real-time composition and performance},
  track = {Demonstrations},
  url = {http://www.nime.org/proceedings/2016/nime2016_paper00022.pdf},
  year = {2016}
}