A Human-Agents Music Performance System in an Extended Reality Environment
Pedro P Lucas and Stefano Fasciani
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2023
- Location: Mexico City, Mexico
- Track: Papers
- Pages: 10–20
- Article Number: 2
- DOI: 10.5281/zenodo.11189090
- PDF: http://nime.org/proceedings/2023/nime2023_2.pdf
Abstract:
This paper proposes a human-machine interactive music system for live performances based on autonomous agents, implemented through immersive extended reality. The interaction between humans and agents is grounded in concepts related to Swarm Intelligence and Multi-Agent systems, which are reflected in a technological platform that involves a 3D physical-virtual solution. This approach requires visual, auditory, haptic, and proprioceptive modalities, making it necessary to integrate technologies capable of providing such a multimodal environment. The prototype of the proposed system is implemented by combining Motion Capture, Spatial Audio, and Mixed Reality technologies. The system is evaluated in terms of objective measurements and tested with users through music improvisation sessions. The results demonstrate that the system is used as intended with respect to multimodal interaction for musical agents. Furthermore, the results validate the novel design and integration of the required technologies presented in this paper.
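As a purely illustrative sketch of the kind of interaction the abstract describes (swarm-inspired agents whose motion is rendered as spatial audio around a tracked performer), the Python fragment below shows a minimal 2D agent whose position drives a stereo pan and distance-based gain. This is not the authors' implementation; the class, constants, and the position-to-pan mapping are assumptions made only for illustration.

```python
import math
import random

class Agent:
    """Minimal swarm agent; position is later mapped to pan/gain (illustrative only)."""

    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx, self.vy = 0.0, 0.0

    def step(self, neighbours, attractor, dt=0.05):
        # Cohesion: steer toward the centroid of the other agents.
        if neighbours:
            cx = sum(a.x for a in neighbours) / len(neighbours)
            cy = sum(a.y for a in neighbours) / len(neighbours)
            self.vx += 0.1 * (cx - self.x)
            self.vy += 0.1 * (cy - self.y)
        # Attraction toward the performer's tracked (e.g. motion-captured) position.
        self.vx += 0.05 * (attractor[0] - self.x)
        self.vy += 0.05 * (attractor[1] - self.y)
        self.x += self.vx * dt
        self.y += self.vy * dt

    def pan_and_gain(self, listener=(0.0, 0.0)):
        # Map agent position to a stereo pan in [-1, 1] and a distance-based gain.
        dx, dy = self.x - listener[0], self.y - listener[1]
        dist = math.hypot(dx, dy)
        pan = max(-1.0, min(1.0, dx / 5.0))
        gain = 1.0 / (1.0 + dist)
        return pan, gain

if __name__ == "__main__":
    agents = [Agent(random.uniform(-3, 3), random.uniform(-3, 3)) for _ in range(5)]
    performer = (1.0, -0.5)  # hypothetical tracked hand position
    for _ in range(100):
        for a in agents:
            a.step([b for b in agents if b is not a], performer)
    for i, a in enumerate(agents):
        print(i, a.pan_and_gain())
```

In the paper's actual system the equivalent mapping would be realised through the Motion Capture, Spatial Audio, and Mixed Reality stack described in the abstract rather than this toy stereo model.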
Citation:
Pedro P Lucas and Stefano Fasciani. 2023. A Human-Agents Music Performance System in an Extended Reality Environment. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.11189090
BibTeX Entry:
@inproceedings{nime2023_2,
  abstract = {This paper proposes a human-machine interactive music system for live performances based on autonomous agents, implemented through immersive extended reality. The interaction between humans and agents is grounded in concepts related to Swarm Intelligence and Multi-Agent systems, which are reflected in a technological platform that involves a 3D physical-virtual solution. This approach requires visual, auditory, haptic, and proprioceptive modalities, making it necessary to integrate technologies capable of providing such a multimodal environment. The prototype of the proposed system is implemented by combining Motion Capture, Spatial Audio, and Mixed Reality technologies. The system is evaluated in terms of objective measurements and tested with users through music improvisation sessions. The results demonstrate that the system is used as intended with respect to multimodal interaction for musical agents. Furthermore, the results validate the novel design and integration of the required technologies presented in this paper.},
  address = {Mexico City, Mexico},
  articleno = {2},
  author = {Pedro P Lucas and Stefano Fasciani},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.11189090},
  editor = {Miguel Ortiz and Adnan Marquez-Borbon},
  issn = {2220-4806},
  month = {May},
  numpages = {11},
  pages = {10--20},
  title = {A Human-Agents Music Performance System in an Extended Reality Environment},
  track = {Papers},
  url = {http://nime.org/proceedings/2023/nime2023_2.pdf},
  year = {2023}
}