Sonifying Game-Space Choreographies With UDKOSC
Rob Hamilton
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2013
- Location: Daejeon, Republic of Korea
- Pages: 446–449
- Keywords: procedural music, procedural audio, interactive sonification, game music, Open Sound Control
- DOI: 10.5281/zenodo.1178544
- PDF: http://www.nime.org/proceedings/2013/nime2013_268.pdf
Abstract:
With a nod towards digital puppetry and game-based film genres such as machinima, recent additions to UDKOSC offer an Open Sound Control (OSC) control layer for external control over both third-person "pawn" entities and camera controllers in fully rendered game-space. Real-time OSC input, driven by algorithmic process or parsed from a human-readable timed scripting syntax, allows users to shape choreographies of gesture, in this case actor motion and action, as well as an audience's view into the game-space environment. As UDKOSC outputs real-time coordinate and action data generated by UDK pawns and players with OSC, individual as well as aggregate virtual-actor gesture and motion can be leveraged as a driver for both creative and procedural/adaptive game music and audio concerns.
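The abstract describes driving UDKOSC's pawn and camera controllers with real-time OSC input. As a rough illustration of what such a client might look like, the sketch below hand-encodes a minimal OSC 1.0 message and sends it over UDP using only the Python standard library. The address pattern `/pawn/1/location` and the port number are hypothetical: the paper does not specify UDKOSC's actual OSC namespace or transport settings here, so treat both as placeholders.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary;
    # a string whose length is already a multiple of 4 still gets 4 nulls.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode a minimal OSC 1.0 message whose arguments are all float32."""
    msg = osc_pad(address.encode("ascii"))
    # Type-tag string: a comma followed by one 'f' per float argument.
    msg += osc_pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">f", a)  # OSC numbers are big-endian
    return msg

# Hypothetical address pattern and coordinates -- UDKOSC's real OSC
# namespace is not documented in this record.
packet = osc_message("/pawn/1/location", 1024.0, 512.0, 128.0)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 8000))  # assumed UDKOSC listening port
```

In practice one would more likely use a ready-made OSC library, but the byte-level view makes clear how little framing OSC adds: a padded address, a padded type-tag string, then the raw big-endian arguments.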
Citation:
Rob Hamilton. 2013. Sonifying Game-Space Choreographies With UDKOSC. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.1178544
BibTeX Entry:
@inproceedings{Hamilton2013,
  abstract = {With a nod towards digital puppetry and game-based film genres such as machinima, recent additions to UDKOSC offer an Open Sound Control (OSC) control layer for external control over both third-person "pawn" entities and camera controllers in fully rendered game-space. Real-time OSC input, driven by algorithmic process or parsed from a human-readable timed scripting syntax, allows users to shape choreographies of gesture, in this case actor motion and action, as well as an audience's view into the game-space environment. As UDKOSC outputs real-time coordinate and action data generated by UDK pawns and players with OSC, individual as well as aggregate virtual-actor gesture and motion can be leveraged as a driver for both creative and procedural/adaptive game music and audio concerns.},
  address = {Daejeon, Republic of Korea},
  author = {Rob Hamilton},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.1178544},
  issn = {2220-4806},
  keywords = {procedural music, procedural audio, interactive sonification, game music, Open Sound Control},
  month = {May},
  pages = {446--449},
  publisher = {Graduate School of Culture Technology, KAIST},
  title = {Sonifying Game-Space Choreographies With UDKOSC},
  url = {http://www.nime.org/proceedings/2013/nime2013_268.pdf},
  year = {2013}
}