Composing and executing Interactive music using the HipHop.js language
Bertrand Petit and Manuel Serrano
Proceedings of the International Conference on New Interfaces for Musical Expression
- Year: 2019
- Location: Porto Alegre, Brazil
- Pages: 71–76
- DOI: 10.5281/zenodo.3672870
- PDF: http://www.nime.org/proceedings/2019/nime2019_paper014.pdf
Abstract:
Skini is a platform for composing and producing live performances with audience participation using connected devices (smartphones, tablets, PCs, etc.). The music composer creates beforehand musical elements such as melodic patterns, sound patterns, instruments, groups of instruments, and a dynamic score that governs the way the basic elements will behave according to events produced by the audience. During the concert or the performance, the audience, by interacting with the system, gives birth to an original music composition. Skini music scores are expressed in terms of constraints that establish relationships between instruments. A constraint may be instantaneous, for instance one may disable violins while trumpets are playing. A constraint may also be temporal, for instance, the piano cannot play more than 30 consecutive seconds. The Skini platform is implemented in Hop.js and HipHop.js. HipHop.js, a synchronous reactive DSL, is used for implementing the music scores, as its elementary constructs, high-level operators such as parallel execution, sequences, awaits, and synchronization points, form an ideal core language for implementing Skini constraints. This paper presents the Skini platform. It reports on live performances and an educational project. It briefly overviews the use of HipHop.js for representing scores.
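For illustration only (not taken from the paper): a minimal sketch of how an instantaneous constraint of the kind described in the abstract might be expressed with HipHop.js operators, assuming the module/await/emit/loop syntax of the current HipHop.js documentation; the signal names and the module name are hypothetical. The module waits for the trumpets to start, disables the violins, and re-enables them when the trumpets stop; loaded into a HipHop reactive machine, it would be driven by events coming from the audience.

// Hypothetical Skini-style instantaneous constraint (illustrative signal names,
// assumed HipHop.js syntax).
hiphop module muteViolinsWhileTrumpetsPlay(in TRUMPETS_ON, in TRUMPETS_OFF,
                                           out VIOLINS_DISABLED, out VIOLINS_ENABLED) {
   loop {
      await (TRUMPETS_ON.now);      // trumpets start playing
      emit VIOLINS_DISABLED();      // instantaneously disable the violins
      await (TRUMPETS_OFF.now);     // trumpets stop
      emit VIOLINS_ENABLED();       // violins may play again
   }
}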
Citation:
Bertrand Petit and Manuel Serrano. 2019. Composing and executing Interactive music using the HipHop.js language. Proceedings of the International Conference on New Interfaces for Musical Expression. DOI: 10.5281/zenodo.3672870
BibTeX Entry:
@inproceedings{Petit2019,
  abstract = {Skini is a platform for composing and producing live performances with audience participation using connected devices (smartphones, tablets, PCs, etc.). The music composer creates beforehand musical elements such as melodic patterns, sound patterns, instruments, groups of instruments, and a dynamic score that governs the way the basic elements will behave according to events produced by the audience. During the concert or the performance, the audience, by interacting with the system, gives birth to an original music composition. Skini music scores are expressed in terms of constraints that establish relationships between instruments. A constraint may be instantaneous, for instance one may disable violins while trumpets are playing. A constraint may also be temporal, for instance, the piano cannot play more than 30 consecutive seconds. The Skini platform is implemented in Hop.js and HipHop.js. HipHop.js, a synchronous reactive DSL, is used for implementing the music scores, as its elementary constructs, high-level operators such as parallel execution, sequences, awaits, and synchronization points, form an ideal core language for implementing Skini constraints. This paper presents the Skini platform. It reports on live performances and an educational project. It briefly overviews the use of HipHop.js for representing scores.},
  address = {Porto Alegre, Brazil},
  author = {Bertrand Petit and Manuel Serrano},
  booktitle = {Proceedings of the International Conference on New Interfaces for Musical Expression},
  doi = {10.5281/zenodo.3672870},
  editor = {Marcelo Queiroz and Anna Xambó Sedó},
  issn = {2220-4806},
  month = {June},
  pages = {71--76},
  publisher = {UFRGS},
  title = {Composing and executing Interactive music using the HipHop.js language},
  url = {http://www.nime.org/proceedings/2019/nime2019_paper014.pdf},
  year = {2019}
}