Music Proceedings
This page contains a list of peer-reviewed music performed at NIME conferences. (N.B.: this list is incomplete; we are still assembling music proceedings from previous NIMEs.)
2020
-
Judith Shatin and Maxwell Tfirn. 2020. Zipper Music. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 8–9. http://doi.org/10.5281/zenodo.6350604
Zipper Music is scored for 2 amplified zipper players with interactive electronics performed by a MIDI controller operator. It forms part of my Quotidian Music series, embodying the musicality afforded by everyday sounds, and performable by 'everyday' people without requiring traditional musical training. Each zipper has a distinctive timbre, depending on its material and length, as well as the fabric to which it is sewn. The zipper players are amplified and the sound of each is sent to a laptop. Their sound is then either transformed using a MIDI controller or sent through untouched to stereo speakers. Composer Max Tfirn developed the original Max patch in consultation with me, with some additional changes by Alex Christie. The piece can be thought of as a dialogue, where the actors may be in sync or not; may try to convince one another, interrupt one another, or even talk over one another. Ultimately, they agree. The premiere performance is linked below. This version is for 2 amplified zipper players and 2 MIDI controllers and was premiered by the University of Virginia New Music Ensemble with Danielle Zevitz and Tianyu Zhang as zipper players, and Alex Christie and Travis Thatcher on MIDI controllers. The duration is 8:00, a minute longer than the requested duration; the structure was built around this time frame.
@inproceedings{nime20-music-Shatin, author = {Shatin, Judith and Tfirn, Maxwell}, title = {Zipper Music}, pages = {8-9}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6350604}, url = {http://www.nime.org/proceedings/2020/nime2020_music03.pdf} }
-
Tina Tonagel, Conny Crumbach, Gesine Grundmann, and Britta Fehrmann. 2020. 4 Women, 12 Legs. 120 DEN! Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 1–3. http://doi.org/10.5281/zenodo.6350588
The Cologne-based ladies' quartet 120 DEN, founded in 2019, plays with modified mannequin legs, which become independent electronic instruments through guitar strings, contact microphones and built-in synthesizer elements. The resulting sounds range from subtle caresses to overflowing tapestries of sound, to knee-jerked death metal passages and conceptual electronic textures. The experimental leg sound is, of course, also supported orally.
@inproceedings{nime20-music-Tonagel, author = {Tonagel, Tina and Crumbach, Conny and Gesine, Grundmann and Fehrmann, Britta}, title = {4 Women, 12 Legs. 120 DEN!}, pages = {1-3}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6350588}, url = {http://www.nime.org/proceedings/2020/nime2020_music01.pdf} }
-
Laddy Patricia Cadavid. 2020. Knotting the memory//Encoding the khipu_. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 4–7. http://doi.org/10.5281/zenodo.6350594
The khipu is an information processing and transmission device used mainly by the Inca empire and earlier Andean societies. The word comes from the Kichwa language, in which khipu means knot. This mnemotechnic interface is one of the first known textile computers, consisting of a central wool or cotton cord to which other strings are attached with knots of different shapes, colors, and sizes, encoding different kinds of values and information. The system was widely used until the Spanish colonization, which banned its use and destroyed a large number of these devices [1]. In the performance, the interface is reimagined as a NIME: an electronic khipu built from new materials, paying homage to the original device by converting it into an instrument for the interaction and generation of live experimental sound. Through the weaving of knots, the artist takes the position of a contemporary "khipukamayuq" (the person dedicated to knotting the khipu) [3], seeking, from a decolonial perspective, to encode with touch, gestures and different kinds of knots the interrupted legacy of this ancestral practice in a different experience of tangible live coding and computer music, and to weave the past with the present of the indigenous peoples' resistance in the Andean territory through their sounds.
@inproceedings{nime20-music-Cadavid, author = {Cadavid, Laddy Patricia}, title = {Knotting the memory//Encoding the khipu{\_}}, pages = {4-7}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6350594}, url = {http://www.nime.org/proceedings/2020/nime2020_music02.pdf} }
-
Chi Wang. 2020. Qin. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 10–11. http://doi.org/10.5281/zenodo.6350619
Qin is a real-time interactive composition of approximately eight minutes in duration for two custom-made performance control interfaces, custom software created in Max, and Kyma. Qin is a special symbol in Chinese culture and literature, associated with delicacy, elegance, confidence, power, eloquence, and a longing for communication. The symbol appears in literature as early as the time the Book of Songs was collected. Qin is also a Chinese instrument, played since ancient times and traditionally favored by scholars, appearing in literature as an instrument associated with the ancient Chinese philosopher Confucius. In my composition Qin, I took as inspiration the shape of the original Qin instrument and mapped some of its traditional functions onto my custom-made performance interface, replacing traditional Qin performance techniques with newly developed techniques that draw the desired data from the controllers.
@inproceedings{nime20-music-Wang, author = {Wang, Chi}, title = {Qin}, pages = {10-11}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6350619}, url = {http://www.nime.org/proceedings/2020/nime2020_music04.pdf} }
-
Aurie Hsu. 2020. String Song (2019) for amplified prepared kinetic sculpture and live electronics. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 12–13. http://doi.org/10.5281/zenodo.6350624
String Song is a guided improvisation for prepared kinetic sculpture and live electronics. The kinetic sculpture is a collaboration between Aurie Hsu and sound artist Kyle Hartzell. The sculpture integrates design elements of the violin, the erhu (Chinese "spike fiddle"), and the hurdy-gurdy, using motorized gears to draw a bowing mechanism across the strings. The instrument also features two sets of sympathetic strings to augment its registral range and resonance. The various ways of playing the instrument - plucking, using a metal slide across the strings, actuating the sympathetic strings, mechanical bowing, and physically altering the bow pressure - serve as a sound source for live electronics that further expand the instrument. The duration of the piece is four and a half minutes.
@inproceedings{nime20-music-Hsu, author = {Hsu, Aurie}, title = {String Song (2019) for amplified prepared kinetic sculpture and live electronics}, pages = {12-13}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6350624}, url = {http://www.nime.org/proceedings/2020/nime2020_music05.pdf} }
-
Alex Lucas, Tim Leatham, Eoin Fitzpatrick, Mary-Louise McCord, and Daniel Morgan. 2020. Split Point: The Piano Reimagined as an Inclusive Hyper-Instrument. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 14–15. http://doi.org/10.5281/zenodo.6350634
Split Point is a quiet ambient work which explores inclusion, constraint and the redistribution of musical processes in contemporary piano music. In a collaboration between inclusive music collective The Wired Ensemble and experimental pianist Alex Lucas, the piano is played collectively, over a network, through the use of reductionist, accessible interfaces. In this configuration, the piano is considered a hyper-instrument whose parameters, such as pitch, rhythm and timbre, are split and redistributed amongst the group. We invite the audience to consider whether it is musically interesting to split the piano in such a way, and whether an individual can communicate independent creative expression when using binary on-off controllers, a common goal in inclusive music.
@inproceedings{nime20-music-Lucas, author = {Lucas, Alex and Leatham, Tim and Fitzpatrick, Eoin and McCord, Mary-Louise and Morgan, Daniel}, title = {Split Point: The Piano Reimagined as an Inclusive Hyper-Instrument}, pages = {14-15}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6350634}, url = {http://www.nime.org/proceedings/2020/nime2020_music06.pdf} }
-
Yixuan Zhao. 2020. Charon – For Guzheng, violin, samples, and live electronics. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, p. 16. http://doi.org/10.5281/zenodo.6350646
Charon is a celestial body in the solar system. Through tidal locking, Charon and Pluto gradually reached synchronous rotation from different orbits and rotation rates under the influence of tidal forces over a long period of time. This work combines the different sounds of a Chinese instrument, a Western instrument and samples. Their progression from struggle to integration is a metaphor for Charon gradually being shaped by tidal forces. Technology in this work creates interaction among acoustic instruments, samples and sound effects, and the connections among them develop the work. I believe this tension and interaction are what makes the combination of sounds fascinating.
@inproceedings{nime20-music-Zhao, author = {Zhao, Yixuan}, title = {Charon – For Guzheng, violin, samples, and live electronics}, pages = {16}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6350646}, url = {http://www.nime.org/proceedings/2020/nime2020_music07.pdf} }
-
Nicole Robson and Halldor Ulfarsson. 2020. Dual/Duel/Duet/for/with/halldorophone. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 17–20. http://doi.org/10.5281/zenodo.6350695
The halldorophone is a cello-like feedback instrument, developed over the past decade by Halldór Úlfarsson. The instrument is well established in experimental music circles and is gaining wider recognition thanks to its use by composer and cellist Hildur Guðnadóttir in film scores, including her Oscar-nominated music for Joker (2019). The halldorophone utilises a simple system whereby the vibration of each string is detected by a pickup, amplified and routed to a speaker embedded in the back of the instrument. By adding gain to individual strings in the feedback loop, the instrument's response can become rapidly complex, potentially spinning out of control. While every musical performance of a piece is unique in some way and contingent on its particular moment and situation in time, the unstable nature of the halldorophone exacerbates this condition. Players describe the halldorophone as 'unpredictable', 'very much alive' and as [having] 'its own ideas'; even tiny changes to their body position in performance may produce unexpected effects [5]. In this NIME premiere for the instrument, cellist Nicole Robson will perform a piece for a new digitally endowed halldorophone, and the title of the piece – Dual/Duel/Duet – acknowledges the active role of the instrument in shaping the composition and performance.
@inproceedings{nime20-music-Robson, author = {Robson, Nicole and Halldor, Ulfarsson}, title = {Dual/Duel/Duet/for/with/halldorophone}, pages = {17-20}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6350695}, url = {http://www.nime.org/proceedings/2020/nime2020_music08.pdf} }
-
Laurel Pardue, Jack Armitage, and Kuljit Bhamra. 2020. Petrified Wood - Untitled 59. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 21–23. http://doi.org/10.5281/zenodo.6351037
Petrified Wood brings together three exciting cutting-edge improvisers from disparate musical realms to create new musical interactions, utilizing novel instruments and exploring the potential for live coding in a musically interactive setting. The trio consists of the duo Jack Armitage (live coding) and Laurel Pardue (svampolin), joined by the legendary composer, producer, musician, and pioneer of Bhangra, Kuljit Bhamra MBE (electronic tabla and percussion). The performance features two novel alternative instruments designed to play, present, feel like, and even sound like the acoustic instruments on which they are modelled, yet which can equally sound completely unlike the originals. The svampolin, a hybrid electro-acoustic violin, is a functional decomposition and recomposition of the violin, retaining the instrument's acoustic sonic physicality while enabling audio signal modification. Meanwhile, the electronic tabla, developed as part of efforts to make tabla learning more accessible, can be used in more traditional roles either as a regular tabla or as an interface to control any array of expressive percussive instruments. Lastly, Jack Armitage brings his expertise and musicianship as a live coder not only to provide music and texture, but to resample and reframe the instrumental players' ideas live or, through remote control of the svampolin, redefine performer intimacy as the coder alters the svampolin's performative results in real time. By changing the instrument's functionality during a piece, the player and coder are able to shift the role of the instrument from structure to behaviour, or to freely transition between lutherie and performance.
@inproceedings{nime20-music-Pardue, author = {Pardue, Laurel and Armitage, Jack and Bhamra, Kuljit}, title = {Petrified Wood - Untitled 59}, pages = {21-23}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6351037}, url = {http://www.nime.org/proceedings/2020/nime2020_music09.pdf} }
-
Adam Pulz Melbye and Halldor Ulfarsson. 2020. The Feedback-Actuated Augmented Bass (FAAB). Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 24–25. http://doi.org/10.5281/zenodo.6351045
The FAAB (Feedback-Actuated Augmented Bass) is a modified double bass featuring electromagnetic pickups, an embedded speaker and onboard DSP through a Bela microprocessor. Through mechanically induced feedback and adaptive signal processing, the instrument expands the textural and spectral properties of traditional as well as extended playing techniques. The complex dynamics of the electro-acoustic couplings require the performer to investigate human-instrument relationships from the perspective of negotiation and exploration rather than instrumental mastery. The improvised performance is a snapshot of the continuous co-evolution between, on the one hand, new techniques and performance practices and, on the other, mechanical, acoustical and digital optimisation.
@inproceedings{nime20-music-Melbye, author = {Melbye, Adam Pulz and Ulfarsson, Halldor}, title = {The Feedback-Actuated Augmented Bass (FAAB)}, pages = {24-25}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6351045}, url = {http://www.nime.org/proceedings/2020/nime2020_music10.pdf} }
-
Solomiya Moroz and Dejana Sekulic. 2020. Artefacts of Presence. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 26–27. http://doi.org/10.5281/zenodo.6351119
artefacts of presence uses material from a folk music archive and forms part of a larger project addressing composition with archival material. In this piece, I adapted transcriptions of the archival material into violin writing and techniques. I also use an extended bow attachment comprising a MiniBee accelerometer, whose speed and direction of movement influence the violinist's digital sound processes. The performative presence of the violinist on stage is established through a musical dialogue with the protagonists of the archival footage.
@inproceedings{nime20-music-Moroz, author = {Moroz, Solomiya and Sekulic, Dejana}, title = {Artefacts of Presence}, pages = {26-27}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6351119}, url = {http://www.nime.org/proceedings/2020/nime2020_music11.pdf} }
-
Victor Zappi and Oren Ronen. 2020. Black Space. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 28–29. http://doi.org/10.5281/zenodo.6351195
Black Space is a solo performance that presents highly correlated sonic and visual elements. The piece revolves around the manipulation of acoustic qualities of sound propagation and reverberation in a physical model that is capable of both synthesizing and rendering sound waves. The performer gradually populates the black space that stretches across the surface of his instrument with sound sources, creating layered textures and percussive sounds that get trapped and resonate in different 2D shapes and materials. As the piece evolves, the performer explores the acoustic effects arising as he alters the physical properties of these materials and the shapes in which these sounds exist. Black Space was composed for a novel audio/visual digital musical instrument, whose name is anonymized for the submission.
@inproceedings{nime20-music-Zappi, author = {Zappi, Victor and Ronen, Oren}, title = {Black Space}, pages = {28-29}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6351195}, url = {http://www.nime.org/proceedings/2020/nime2020_music12.pdf} }
-
Alon Ilsar and Matthew Hughes. 2020. The Air Sticks - An Audio Visual Gestural Instrument. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 30–31. http://doi.org/10.5281/zenodo.6351217
The AirSticks are an audio-visual gestural instrument designed to allow the composition, performance and improvisation of live electronic music and graphics using movements captured by handheld motion controllers, utilising bespoke software to generate musical and visual content from the gestural controller's real-time position and rotation information. Through this interface, the performer is offered multidimensional control over audio-visual parameters, while a clearly transparent relationship between gesture and audio-visual product is maintained. The graphics are projected onto a transparent screen, or scrim, to allow both the performer and audience to relate to them. Percussionist and instrument designer Alon Ilsar has been performing with the AirSticks around the world since 2013, with highlights including a live performance with Alan Cumming at the Met Museum in NYC, a TEDx performance with live electronic trio The Sticks at the Sydney Opera House, and a solo performance of an hour-long audio-visual collaboration with Matt Hughes at Sydney's Recital Hall entitled Trigger Happy Visualised. The AirSticks were also presented at the 2019 Guthman Musical Instrument Competition at Georgia Tech in Atlanta, Georgia, where they took out the Audience Choice Awards for Best Instrument and Best Performance. A similar presentation was recently made at SIGGRAPH Asia's 2019 Real-Time Live Competition, in which the AirSticks took out the judges' award for Best Presentation. In this video, we present three excerpts from Trigger Happy Visualised, plus a short introduction from SIGGRAPH Asia 2019.
@inproceedings{nime20-music-Ilsar, author = {Ilsar, Alon and Hughes, Matthew}, title = {The Air Sticks - An Audio Visual Gestural Instrument}, pages = {30-31}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6351217}, url = {http://www.nime.org/proceedings/2020/nime2020_music13.pdf} }
-
Isak Han. 2020. Playing the nUFO. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 32–33. http://doi.org/10.5281/zenodo.6351319
"Will new digital instruments become part of the current musico-industrial framework with composers, publishers, producers, sound engineers, performers, concert halls, media, critics, audience? Or do they belong to a new age of musical practice? What is a new digital instrument? How do we play it? Who composes for it? Where does it fit in our culture? And is it a sustainable thing?" Thor Magnusson poses these questions in Sonic Writing (2019), and they all directly apply to the nUFO I have been developing over the past three years. They inform the conceptual background of the piece, with my personal answers to them becoming the initial sound material. Oscillating between clear intelligibility and complex processing blended with layers of abstract synthesis, the piece offers multiple auditory perspectives on these questions.
@inproceedings{nime20-music-Han, author = {Han, Isak}, title = {Playing the nUFO}, pages = {32-33}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6351319}, url = {http://www.nime.org/proceedings/2020/nime2020_music14.pdf} }
-
Jesse Allison and Anthony T Marasco. 2020. Gravity | Density. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 34–35. http://doi.org/10.5281/zenodo.6351489
Gravity | Density is a work for cyber-hacked devices and Web Audio applications with thematic material drawn from humankind's fascination with the universe. In Gravity | Density, we begin by manipulating fixed-audio sources through the performance of hacked CD players. The sonic results of this mangled audio are sampled and then distributed to the audience's mobile devices in both passive and active manners. Passive distributions allow us to create intricately spatialized rhythmic interplay between the glitching CD players and the blanket of overlapping samples dispersed throughout the networked audience. Active distributions allow the audience to join in our performance; by sampling small portions of the audio, processing and looping these sounds and sending them back to the performers, we string this audio together and feed it into a cyber-controlled distortion pedal before sending it back to the audience for more manipulation. This results in overlapping cycles of control and audio generation between performer, audience, network, and machine.
@inproceedings{nime20-music-Allison, author = {Allison, Jesse and Marasco, Anthony T}, title = {Gravity | Density}, pages = {34-35}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6351489}, url = {http://www.nime.org/proceedings/2020/nime2020_music15.pdf} }
-
Barbara Nerness. 2020. Embody. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 36–37. http://doi.org/10.5281/zenodo.6352727
embody is a piece for live performers using stethophones (stethoscope microphones), which I have created from hacked stethoscopes in order to amplify the heartbeat and voice in the chest or vocal tract. The motivation came from my exploration of the sounds trapped within my body, such as my heartbeat and the resonance of my voice in my vocal tract or chest. Is it possible to record our voice as we hear it in our head? On the surface, embody asks the questions: "What if the ocean had a heart? What if machines could speak?" In some sense, the ocean does have a pulse through tides, and machines do make noise; we just do not understand them as human. In the piece, sounds of the natural and built environment are anthropomorphized using the sounds inside the performers' bodies. On a technical level, the sounds were constructed using spectral convolution and envelope following, probing the spectral overlap of our bodies with sounds external to us. The piece includes sounds from two field recording sessions: one at the beach in Pescadero and another on a tour through the Stanford Energy Facility. The piece is spatialized in 3rd order Ambisonics to resemble a sonic body.
@inproceedings{nime20-music-Nerness, author = {Nerness, Barbara}, title = {Embody}, pages = {36-37}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6352727}, url = {http://www.nime.org/proceedings/2020/nime2020_music16.pdf} }
-
Nicola Leonard Hein and Lukas Truniger. 2020. Membranes. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 38–39. http://doi.org/10.5281/zenodo.6352748
Exploring the boundaries where music and language overlap, the performers use hybrid instruments – constructed from drumskins and electronic components – as devices to turn written texts into pulses of light and percussive sound. As each machine translation emerges, the network of instruments starts to share the texts, transforming written material into aesthetic, visual and sonic patterns for the performers and spectators to further interact with. Extrapolating from the example of the African talking drum, Membranes builds up an altogether new kind of tone language, constantly shifting and adapting itself before viewers and performers alike. The instruments form a reactive network of semantic and aesthetic actors: a play of forms, light and sound unfolds between them. Following historical archetypes of musical communication instruments and seeking to create a speculative acoustic interaction space, this audio-visual installation and performance offers a new, alternative communication environment.
@inproceedings{nime20-music-Hein, author = {Hein, Nicola Leonard and Truniger, Lukas}, title = {Membranes}, pages = {38-39}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6352748}, url = {http://www.nime.org/proceedings/2020/nime2020_music17.pdf} }
-
Weiming Song. 2020. Yunzhong Jun. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, p. 40. http://doi.org/10.5281/zenodo.6352756
Yunzhong Jun is a real-time interactive piece for human voice and Max/MSP. It is inspired by The Lord within the Clouds from Nine Songs (Chinese name Jiu Ge), an ancient Chinese poem series written by Qu Yuan. Yunzhong Jun is also the name of the figure in the poem who controls cloud and rain in ancient Chinese mythology. The lyrics spoken by the performer come from the poem, which depicts a ritual scene in which people call for rain. The gestures and movements of the performer over the ultrasonic sensors generate the data for controlling parameters in the Max/MSP patch and for manipulating the performer's voice.
@inproceedings{nime20-music-Song, author = {Song, Weiming}, title = {Yunzhong Jun}, pages = {40}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6352756}, url = {http://www.nime.org/proceedings/2020/nime2020_music18.pdf} }
-
Christof Ressi and Szilard Benes. 2020. Terrain Study. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 41–43. http://doi.org/10.5281/zenodo.6352760
terrain study is a piece for solo performer and virtual reality system which seeks to work with the possibilities and limitations of VR outside the usual context of a realistic 3D environment. The player starts in a simplistic 3D world consisting of only three basic elements: a randomly generated, slightly undulating terrain; a texture-mapped cube which creates the illusion of an endless horizon (a so-called skybox); and several metal-like spheres hovering above the ground which the player can interact with musically. By and by, the visual and acoustic representation of the game world is manipulated by the sounds produced on the instrument, leading to bizarre structures and surreal perspectives, eventually questioning the division of subject and world.
@inproceedings{nime20-music-Ressi, author = {Ressi, Christof and Benes, Szilard}, title = {Terrain Study}, pages = {41-43}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6352760}, url = {http://www.nime.org/proceedings/2020/nime2020_music19.pdf} }
-
Courtney D Brown. 2020. Machine Tango. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 44–45. http://doi.org/10.5281/zenodo.6352767
Argentine tango dancers generally react to musical recordings with improvised steps, each action arising from an unspoken conversation between leader and follower. In Machine Tango, this relation between dancers and music is turned upside down, enabling tango dancers to drive musical outcomes. Motion sensors are attached to the dancers' limbs, and their data is sent wirelessly to a computer, where algorithms turn the movement into sound. In doing so, the computer inserts itself into this ongoing nonverbal conversation. Instead of traditional tango instruments such as the violin, dancers generate and transform the sounds of aluminum capsules, typewriters, and other found sounds. The musical response of the interactive system to dancer movement transforms during the dance, becoming more complex. The two dancers must traverse the resulting volatile sound landscape as one, responding with stylized tango movements. The effort involved in performing this task, such as the heightened attention the performers must pay to one another's movements, and the contrast between the traditional and the experimental, are essential to the performance aesthetic. The work is performed by myself and my tango partner, Brent Brimhall, who has contributed greatly to the structures of the dance.
@inproceedings{nime20-music-Brown, author = {Brown, Courtney D}, title = {Machine Tango}, pages = {44-45}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6352767}, url = {http://www.nime.org/proceedings/2020/nime2020_music20.pdf} }
-
Se-Lien Chuang and Andreas Weixler. 2020. Sonic Cultures. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 46–48. http://doi.org/10.5281/zenodo.6352771
Sonic Cultures is an immersive real-time audiovisual compositional environment with an interactive generative score (iScore) for multiple computers and open ensemble. An open ensemble of acoustic instruments and electronic devices with digital interfaces (laptops, etc.) serves as the mutual medium for conducting, reading and interpreting the score. In concert, the work is performed as a combination of audiovisual real-time processing and improvisation, conducted by interactive graphic scores on individual screens/computers and driven by virtuoso random functions and the intentional choices of a digital conductor/composer; these underline the visual and graphic components that are linked to and experienced through the musical sound environment. The conducted improvisation is completed by real-time audio signal processing carried out by a computer performer in mutual inducement with the instrumental players: FFT-controlled freeze reverb, classic ring modulation, and spectral delay.
@inproceedings{nime20-music-Chuang, author = {Chuang, Se-Lien and Weixler, Andreas}, title = {Sonic Cultures}, pages = {46-48}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6352771}, url = {http://www.nime.org/proceedings/2020/nime2020_music21.pdf} }
-
Konstantinos Vasilakos, Scott Wilson, Tsun Winston Yeung, Emma Margetson, and Erik Nystrom. 2020. Dark Matter (live coding). Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 49–51. http://doi.org/10.5281/zenodo.6352914
This performance, created in collaboration with the art@CMS project at CERN in Switzerland, involves the real-time sonification of data streams from the Large Hadron Collider, the world's largest and most complex particle accelerator. Experimental data containing clues towards possible 'new physics' becomes the raw material for improvised music and visualisations, programmed with the aim of creating a result that, while beautiful, is both musically and scientifically meaningful. (A minimal data-to-sound mapping sketch follows this entry.)
@inproceedings{nime20-music-Wilson, author = {Vasilakos, Konstantinos and Wilson, Scott and Yeung, Tsun Winston and Emma, Margetson and Nystrom, Erik}, title = {Dark Matter (live coding)}, pages = {49-51}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6352914}, url = {http://www.nime.org/proceedings/2020/nime2020_music22.pdf} }
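The sonification described above is, in essence, a data-to-sound parameter mapping. The sketch below is a minimal illustration of that general idea in Python, not the ensemble's actual live-coding setup: the event fields, value ranges, OSC address and port are invented stand-ins for whatever the real detector data stream provides, and the stream itself is faked with random numbers.

```python
# Toy parameter-mapping sonification sketch (requires python-osc).
# Assumptions: event fields, scaling and the OSC destination are illustrative only.
import math
import random
import time
from pythonosc.udp_client import SimpleUDPClient

synth = SimpleUDPClient("127.0.0.1", 57120)  # hypothetical synth/patch listening for OSC


def fake_event():
    """Stand-in for one collision-event record from a detector data stream."""
    return {"energy_gev": random.uniform(1, 1000), "n_tracks": random.randint(1, 40)}


def energy_to_freq(e, lo=50.0, hi=5000.0, e_max=1000.0):
    """Logarithmic mapping: equal energy ratios give equal pitch intervals."""
    return lo * (hi / lo) ** (math.log1p(e) / math.log1p(e_max))


while True:
    ev = fake_event()
    freq = energy_to_freq(ev["energy_gev"])
    amp = min(1.0, ev["n_tracks"] / 40.0)      # busier events sound louder
    synth.send_message("/event", [freq, amp])  # assumed receiver address
    time.sleep(0.1)
```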
-
Krzysztof Cybulski. 2020. Modular Process Music. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 52–54. http://doi.org/10.5281/zenodo.6353004
Modular Process Music is an improvised performance with a set of self-devised and self-built electronic instruments. The instruments communicate with each other through sound: each instrument has a speaker and a microphone, so they can listen to each other or to any other external sounds. The instruments are designed to make their interactions clearly comprehensible to the audience through their visual appearance.
@inproceedings{nime20-music-Cybulski, author = {Cybulski, Krzysztof}, title = {Modular Process Music}, pages = {52-54}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6353004}, url = {http://www.nime.org/proceedings/2020/nime2020_music23.pdf} }
-
Alex McLean. 2020. Feedforward. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, p. 55. http://doi.org/10.5281/zenodo.6353969
This is an improvised, from-scratch live coding performance. The NIME interface which this performance showcases is the new Feedforward editor for the TidalCycles live coding environment. Feedforward is written in Haskell using the ncurses library for terminal-based user interfaces. It runs on low-powered hardware including the Raspberry Pi Zero, with formative testing of prototypes conducted with several groups of children between the ages of 8 and 14. Feedforward has a number of features designed to support improvised, multi-pattern live coding. Individual Tidal patterns are addressable with hotkeys for fast muting and unmuting. Each pattern has a stereo VU meter to aid the quick matching of sound to pattern within a mix. In addition, TidalCycles has been extended to store context with each event, so that source code positions in its polyrhythmic sequence mini-notation are tracked. This allows steps to be highlighted in the source code whenever they are active, even when Tidal combinators have been applied to manipulate the timeline (a toy illustration of this idea follows this entry). Formal evaluation has yet to take place, but this feature appears to support learning of how pattern manipulations work in Tidal. Feedforward and TidalCycles are free/open-source software under the GPL licence, version 3.0.
@inproceedings{nime20-music-McLean, author = {McLean, Alex}, title = {Feedforward}, pages = {55}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6353969}, url = {http://www.nime.org/proceedings/2020/nime2020_music24.pdf} }
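The key idea above (storing, with each scheduled event, the source-text span it came from so the editor can highlight the step that is currently playing) can be illustrated generically. The Python toy below is not Tidal's or Feedforward's implementation, which are written in Haskell; the mini-notation here is reduced to flat, equal-duration steps purely to show the span-tracking idea.

```python
# Toy illustration: each step keeps the character span it came from,
# so a "currently active" step can be highlighted in the source text.
from dataclasses import dataclass


@dataclass
class Step:
    value: str
    start: int  # character offset of the step in the pattern source
    end: int


def parse_pattern(src: str) -> list[Step]:
    """Split a flat, space-separated pattern into steps, keeping source spans."""
    steps, i = [], 0
    for token in src.split():
        start = src.index(token, i)
        steps.append(Step(token, start, start + len(token)))
        i = start + len(token)
    return steps


def active_step(steps: list[Step], cycle_pos: float) -> Step:
    """Return the step sounding at cycle_pos in [0, 1), assuming equal durations."""
    return steps[int(cycle_pos * len(steps)) % len(steps)]


if __name__ == "__main__":
    src = "bd sn hh hh"
    steps = parse_pattern(src)
    s = active_step(steps, 0.6)  # 60% through the cycle -> third step
    print(src)
    print(" " * s.start + "^" * (s.end - s.start))  # crude terminal highlight
```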
-
Owen Green. 2020. Race to the Bottom. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 56–58. http://doi.org/10.5281/zenodo.6353975
Race to the Bottom is the most recent of a string of improvising machines involving bowed cardboard boxes, developed over the last decade. In all these systems, bowed cardboard is both the source of all sonic material and the 'control' interface to software that occupies a changeable and turbulent role in the territories between algorithmic co-player, instrument and processor. Boxes, it turns out, yield a much more varied sound world, and more room for practised technique, than I had imagined when I started exploring them (somewhat facetiously). They also present a range of interesting challenges to machine listening algorithms, such are the instabilities and varied points of interest in their sound. All of these improvising machines have explored different approaches to dealing with this, and to finding creative ways of enjoying the software's frequent 'misunderstandings' of its input. Increasingly, these machines have also been a place for me to investigate ways of dealing with time in algorithmically mediated improvising, particularly when (as here) my hands are already busy, and I have to trust my software's sense of time and musicality (or at least put up with it). In Race to the Bottom, these fronts are explored by abusing segmentation algorithms and beat trackers as (loose-ish) analogues for, respectively, oscillators and filters. A clutch of these run at different rates, latching on to different parts of different sounds and interfering with each other, informing both the processing of sound and the unfolding of musical shape.
@inproceedings{nime20-music-Green, author = {Green, Owen}, title = {Race to the Bottom}, pages = {56-58}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6353975}, url = {http://www.nime.org/proceedings/2020/nime2020_music25.pdf} }
-
John Bowers and Kerry Hagan. 2020. Touch, Strike, Slide, Twist, Shudder. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 59–60. http://doi.org/10.5281/zenodo.6353981
This video catches Bowers and Hagan in the act of sticking their dirty hands into machineries best left alone as they struggle in the midst of unruly sonic behaviours and non-obvious interaction design. Using synthesis algorithms with extreme sensitivity to gesture, they steer rather than control a complex solfège of pulses, noises, crackles and drones, negotiating a link between chaotic dynamics and improvisation. All relationships are tricky, especially the love polyhedron between Bowers, Hagan, their interfaces, their algorithms and their many noises. But we hope for the best.
@inproceedings{nime20-music-Bowers, author = {Bowers, John and Hagan, Kerry}, title = {Touch, Strike, Slide, Twist, Shudder}, pages = {59-60}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6353981}, url = {http://www.nime.org/proceedings/2020/nime2020_music26.pdf} }
-
Mári Mákó. 2020. Schmitt. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 61–64. http://doi.org/10.5281/zenodo.6353997
Schmitt is a quadraphonic live-electronic music performance. The piece is meant to challenge and explore the relationships between motion, gesture and music in a multi-channel speaker setup. Alongside this, the narrative of the performance is about overcoming existential crisis, translated into a sonic journey. The main symbol is a self-made square-wave Schmitt oscillator, which is the sound source throughout the whole piece. The development of the oscillator's timbre traces the transformation of the narrative. The piece also questions a certain issue behind using self-made instruments or controllers compared to the use of traditional instruments: it is strongly connected to a reference point (an expectation) of how the player's gestures on the instrument are coordinated with the sounding outcome.
@inproceedings{nime20-music-Mako, author = {M{\'a}k{\'o}, M{\'a}ri}, title = {Schmitt}, pages = {61-64}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6353997}, url = {http://www.nime.org/proceedings/2020/nime2020_music27.pdf} }
-
Erik Nyström. 2020. Intra-action. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Royal Birmingham Conservatoire, pp. 65–66. http://doi.org/10.5281/zenodo.6354005
Intra-action is an experimental computer music system and improvised performance where human agency and perceiving generative processes create an ecology of unconventional synthetic sonorities. The work is inspired by philosopher Karen Barad [1], for whom phenomena or objects are not external to one another, and do not precede their encounters, as implied in 'interaction': instead they emerge from 'intra-action', an interior process of relationships. In this work, intra-action is both a process occurring inside the computer—where morphological processes are shaped in relation to one another through machine listening and agent-based organisation—and a posthuman relation where 'human' and 'machine' agency are co-dependent. Intra-action was commissioned by, and premiered at, NEXT Festival 2019 in Bratislava.
@inproceedings{nime20-music-Nystrom, author = {Nystr{\"o}m, Erik}, title = {Intra-action}, pages = {65-66}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Wright, Joe and Feng, Jian}, year = {2020}, month = jul, publisher = {Royal Birmingham Conservatoire}, address = {Birmingham, UK}, doi = {10.5281/zenodo.6354005}, url = {http://www.nime.org/proceedings/2020/nime2020_music28.pdf} }
2019
-
Christophe D’Alessandro, Xiao Xiao, Grégoire Locqueville, and Boris Doval. 2019. Borrowed Voices. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 11–14.
Borrowed voices is a performance featuring performative voice synthesis, with two types of instruments: C-Voks and T-Voks. The voices are played a cappella in a double choir of natural and synthetic voices. Performative singing synthesis is a new paradigm in the already long history of artificial voices. The singing voice is played like an instrument, allowing singing with the borrowed voice of another. The relationship of embodiment between the singer's gestures and the vocal sound produced is broken. A voice is singing, with realism, expressivity and musicality, but it is not the musician's own voice, and a vocal apparatus does not control it. The project focuses on control gestures: the music explores vocal sounds produced by the vocal apparatus (the basic sound material), and "played" by the natural voice, by free-hand Theremin-controlled gestures, and by writing gestures on a graphic tablet. The same (types of) sounds but different gestures give different musical "instruments" and expressive possibilities. Another interesting aspect is the distance between synthetic voices and the player, the voice being at the same time embodied (by the player's gestures playing the instrument with her/his body) and externalized (because the instrument is not her/his own voice): two different voices sung/played by the same person.
@inproceedings{nime19-music-DAlessandro, author = {D'Alessandro, Christophe and Xiao, Xiao and Locqueville, Grégoire and Doval, Boris}, title = {Borrowed Voices}, pages = {11--14}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music001.pdf} }
-
Anna Rüst. 2019. Bad Mother / Good Mother - an audiovisual performance. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 8–10.
Bad Mother / Good Mother is an audiovisual performance involving a projection, a modified electronic breast pump as a sound generator, and a sound-reactive LED pumping costume. The project has four songs that critically explore technologies directed specifically at women, such as breast pumps and fertility-extending treatments like egg freezing (social freezing). Depending on the song, the breast pump is either a solo instrument or part of an arrangement. The idea is to use workplace lactation as a departure point to uncover a web of societal politics and pre-conceived perceptions (pun intended) of ideal and non-ideal motherhood.
@inproceedings{nime19-music-Rust, author = {R{\"u}st, Anna}, title = {Bad Mother / Good Mother - an audiovisual performance}, pages = {8--10}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music001.pdf} }
-
James Dooley. 2019. colligation. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 15–16.
colligation (to bring or tie together) is a physical performance work for one performer that explores the idea of sculpting sound through gesture. Treating sound as if it were a tangible object capable of being fashioned into new sonic forms, "pieces" of sound are captured, shaped and sculpted by the performer's hand and arm gestures, appearing pliable as they are thrown around and transformed into new sonic material. colligation uses two Thalmic Labs Myo armbands, one placed on the left arm and the other on the right arm. The Myo Mapper [1] software is used to transmit scaled data via OSC from the armbands to Pure Data. Positional data (yaw, pitch and roll) and electromyographic (EMG) data from the devices are mapped to parameters controlling a hybrid synth created in Pure Data. The synth utilises a combination of Phase Aligned Formant synthesis [2] and Frequency Modulation synthesis [3] to allow a range of complex audio spectra to be explored. Pitch, yaw and roll data from the left Myo are respectively mapped to the PAF synth's carrier frequency (ranging from 8.175-12543.9 Hz), bandwidth and relative centre frequency. Pitch, yaw and roll data from the right Myo are respectively mapped to FM modulation frequency (relative to and ranging from 0.01-10 times the PAF carrier frequency), modulation depth (relative to and ranging from 0.01-10 times the PAF carrier frequency), and modulation wave shape (crossfading between sine, triangle, square, rising sawtooth and impulse). Data from the left and right Myo's EMG sensors are mapped respectively to amplitude control of the left and right audio channels, giving the performer control over the level and panning of the audio within the stereo field. By employing both positional and bio data, an embodied relationship between action and response is created; the gesture and the resulting sonic transformation become inextricably entwined. (An illustrative mapping sketch follows this entry.)
@inproceedings{nime19-music-Dooley, author = {Dooley, James}, title = {colligation}, pages = {15-16}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music003.pdf} }
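To make the mapping described above concrete, here is a minimal sketch of that kind of OSC routing written in Python with the python-osc package. The OSC addresses, port numbers and scaling details are assumptions for illustration only; they are not taken from the actual Myo Mapper configuration or the Pure Data patch used in colligation.

```python
# Illustrative sketch only: addresses, ports and scaling are assumed, not the real patch.
# Requires python-osc (pip install python-osc).
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient

pd = SimpleUDPClient("127.0.0.1", 9001)  # hypothetical Pure Data listening port


def scale(x, lo, hi):
    """Map a normalised sensor value in [0, 1] onto [lo, hi]."""
    return lo + max(0.0, min(1.0, x)) * (hi - lo)


def left_orientation(_addr, yaw, pitch, roll):
    # Left Myo: pitch -> PAF carrier frequency, yaw -> bandwidth, roll -> centre frequency.
    pd.send_message("/paf/carrier", scale(pitch, 8.175, 12543.9))  # Hz
    pd.send_message("/paf/bandwidth", scale(yaw, 0.0, 1.0))
    pd.send_message("/paf/centre", scale(roll, 0.0, 1.0))


def right_orientation(_addr, yaw, pitch, roll):
    # Right Myo: pitch -> FM ratio, yaw -> FM depth, roll -> waveshape crossfade.
    pd.send_message("/fm/ratio", scale(pitch, 0.01, 10.0))
    pd.send_message("/fm/depth", scale(yaw, 0.01, 10.0))
    pd.send_message("/fm/shape", scale(roll, 0.0, 4.0))  # sine .. impulse


def emg(addr, *samples):
    # Mean rectified EMG -> per-channel amplitude (left/right encoded in the address).
    side = "left" if "/l/" in addr else "right"
    level = sum(abs(s) for s in samples) / max(1, len(samples))
    pd.send_message(f"/amp/{side}", level)


dispatcher = Dispatcher()
dispatcher.map("/myo/l/orientation", left_orientation)   # assumed address
dispatcher.map("/myo/r/orientation", right_orientation)  # assumed address
dispatcher.map("/myo/l/emg", emg)
dispatcher.map("/myo/r/emg", emg)

BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()
```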
-
Sabina Hyoju Ahn. 2019. DIY Bionoise. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 17–20.
DIY Bionoise (2018) is an instrument with which the performer can generate sound and noise derived from their own body. It contains a circuit that measures bioelectricity from living beings, allowing the instrument to be controlled through tactile contact. The instrument has two modes: a modular synthesizer with an eight-step sequencer, and a bionoise control mode.
@inproceedings{nime19-music-Ahn, author = {Ahn, Sabina Hyoju}, title = {DIY Bionoise}, pages = {17--20}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music004.pdf} }
-
Ajin Tom. 2019. FlexSynth – Blending Multi-Dimensional Sonic Scenes. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 21–24.
FlexSynth is an interpretation of The Sponge, a DMI embedded with sensors that detect squeeze, flexion and torsion, along with buttons, forming an interface with which musical sounds are generated and sculpted. The key idea of the sponge is to harness the properties of a retractable, flexible object that gives the performer a wide range of multi-parametric controls with high resolution in a maximized gesture space, given its high manoeuvrability.
@inproceedings{nime19-music-Tom, author = {Tom, Ajin}, title = {FlexSynth – Blending Multi-Dimensional Sonic Scenes}, pages = {21--24}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music005.pdf} }
-
Filipe Calegario and João Tragtenberg. 2019. Gira. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 25–28.
Gira is a music and dance performance with Giromin, a wearable wireless digital instrument. With this Digital Dance and Music Instrument, a gesture is transformed into sound by motion sensors and an analog synthesizer. This transmutation of languages allows dance to generate music, which stimulates a new dance in an infinite feedback loop.
@inproceedings{nime19-music-Tragtenberg, author = {João Tragtenberg, Filipe Calegario}, title = {Gira}, pages = {25--28}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music006.pdf} }
-
Rodrigo F. Cádiz. 2019. iCons. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 29–31.
iCons is an interactive multi-channel music piece for live computer and a gesture sensor system called AirTouch, designed by the composer especially for this piece. This system allows a much more musical approach to controlling sound than the computer keyboard or mouse. Using only movements of the hands in the air, it is possible to control most aspects of the music, such as sound shapes in time, loops and spatial positioning, or to create very rich spectral densities.
@inproceedings{nime19-music-Cadiz, author = {Cádiz, Rodrigo F.}, title = {iCons}, pages = {29--31}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music007.pdf} }
-
Martim Galvão. 2019. MusiCursor. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 32–34.
Download PDFMusiCursor is an interactive multimedia performance/interface that reimagines consumer-facing technologies as sites for creative expression. The piece draws inspiration from established UI/UX design paradigms and the role of the user in relation to these technologies. The performer assumes the role of a user installing a musically-driven navigation interface on their computer. After an installation prompt, they are guided through a series of demos, in which a software assistant instructs the performer to accomplish several tasks. Through their playing, the performer controls the cursor’s navigation and clicking behavior. In lieu of a traditional score, the performer relies on text instructions and visual indicators from the software assistant. The software tracks the progress of the user throughout the piece and moves on to the next section only once a task has been completed. Each of the main tasks takes place on the web, where the user navigates across YouTube, Wikipedia, and Google Maps.
@inproceedings{nime19-music-Galvao, author = {Galvão, Martim}, title = {MusiCursor}, pages = {32--34}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music008.pdf} }
-
Barry Cullen, Miguel Ortiz, and Paul Stapleton. 2019. Pandemonium Trio perform Drone and Drama v2. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 35–38.
Download PDFPandemonium Trio is Barry Cullen, Miguel Ortiz and Paul Stapleton. Our performance research trio has been set up to explore multiple instantiations of custom-made electronic instruments through improvisation. We are particularly interested in exploiting irregularities in the qualities of circuit components (e.g. imprecise tolerances/values), and how this allows for the development of stylistic differences across multiple instrument-performer configurations. We are also interested in how skill, style and performance techniques are developed in different ways on similar devices over extended periods of time, and how our existing musical practices are reconfigured through such collaborative exchanges.
@inproceedings{nime19-music-Cullen, author = {Cullen, Barry and Ortiz, Miguel and Stapleton, Paul}, title = {Pandemonium Trio perform Drone and Drama v2}, pages = {35--38}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music009.pdf} }
-
Federico Visi and Rodrigo Schramm. 2019. Introduction. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, p. 4.
Download PDF
@inproceedings{nime19-music-introduction, author = {Visi, Federico and Schramm, Rodrigo}, title = {Introduction}, pages = {4}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music00I.pdf} }
-
Ana Dall’Ara-Majek and Takuto Fukuda. 2019. Pythagorean Domino. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 39–42.
Download PDFPythagorean Domino is an improvisatory composition written in 2019 for an augmented Theremin and a gyro-based gestural controller. This work aims to integrate musique concrète techniques and an algorithmic compositional approach in the context of composition for gestural controllers. While musique concrète compositional practice brings out the concept of the “composite object”—a sound object made up of several distinct and successive elements [1]—in the piece, our algorithmic compositional approach delivers an interpolation technique which entails gradual transformations of the composite objects over time. Our challenge is to perform a chain of short, fragmentary elements in tandem in such a way as to form a single musical unit, while the algorithms for transformation autonomously change synthesis and control parameter settings. This approach gives rise to closely interconnected triangular interactions between the two performers and the computer.
@inproceedings{nime19-music-DallAra-Majek, author = {Dall'Ara-Majek, Ana and Fukuda, Takuto}, title = {Pythagorean Domino}, pages = {39--42}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music010.pdf} }
-
Yiyao Nie. 2019. River. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 43–46.
Download PDF“No one can step into the same river twice.” This instrument, named River, combines rules and randomness. What exactly is music, and how does it connect to and shape our form? Traditional musical instruments have fixed physical forms that require performers to adjust to them. What about making a musical instrument that is more fluid and more expressive, deforming according to the performer’s movements? This was the question I attempted to explore when I started this project. For this project, I combine the movement of dance with music to present a fluid and dynamic form of musical instrument. The fabric of the instrument can be separated, as an extension, for washing. It is portable, wireless, rechargeable, stable and beautiful. The instrument generates sound by detecting different movements of the performer. It has four different modes, selected by toggling the switches on the instrument’s interface. Each mode uses a different movement-detection method, generating various sounds and music. Moreover, it can be played as a transmitting tambourine. The music in my performance is all played live by myself, consisting of different sounds triggered and changed by the performer’s gestures and a melody I composed myself. Like the river that gives the instrument its name, the four toggles, their detection methods and their corresponding sounds are intentionally designed to develop from a simple node, beat, loop and drum to varied nodes, melodies and music, with the detection methods and their triggered sounds becoming more and more complex and varied, developing like the journey of a river.
@inproceedings{nime19-music-Nie, author = {Nie, Yiyao}, title = {River}, pages = {43--46}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music011.pdf} }
-
Jiyun Park. 2019. Self-Built Instrument (sound performance). Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 47–49.
Download PDFThe Self-Built Instrument project is focused on sound performance with an experimental instrument composed of strings and a metallic sound box, producing overtones, harmonics and feedback. It can be played with different sound colours: resonances of the copper, bowing on the strings, overtones and feedback. All of these factors trigger each other’s sounds. The point is not to play a specific tone or to make musical harmony, because the instrument cannot be perfectly controlled. Playing the instrument challenges the performer’s capacities, such as gesture and the sonic phenomena that follow sense and space. The artist composed a piece and partly uses a small repertoire; mostly, however, it is interesting to discover what kinds of sound come to nest in the mesh. The artist tries to move beyond the typical aesthetics of classical music, such as precise pitches, melodies and reading from scores. Instead, her approach is to discover unusual sound elements that would traditionally be considered mistakes, and to play with them: strings without tuning, hitting objects, unorganized pitches, and the so-called clicks that arise from unskilled playing.
@inproceedings{nime19-music-Park, author = {Park, Jiyun}, title = {Self-Built Instrument (sound performance)}, pages = {47--49}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music012.pdf} }
-
André L. Martins and Paulo Assis Barbosa. 2019. Tanto Mar. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 50–51.
Download PDF"Tanto Mar" seeks to recreate the properties present in history between Portugal and Brazil, embracing the idea of an aqueous sound that dances and moves as much by cadence as by voluminous waves. The Atlantic Ocean, which separates and unites the two countries, serves as an inspiration for this quadraphonic performance, involving musical instruments and live electronics, where the sounds move through the four speakers. Each speaker symbolizes the paths that the sea travels uninterruptedly, in a unique dance of latitudes and longitudes. The intersection of sounds occurs through processes of reverberations, spatializations, echoes, modulations and grains that slowly form the sound material, composing, decomposing and manipulating the sound waves. Sound characters such as wind, oars, storms, calm, among others, are metaphorically evidenced through the sound material, creating a kind of rhythmic movement of a caravel at sea. The sounds of "Tanto Mar" move between entropy and chaos, between stillness and tsunami, between starboard and port, culminating in a textural dance where the objective is to take the listener away from electronic processing, and propose a dive in an intensified, attentive, deep and involving listening. New musical possibilities can happen through the experimentation of new routes, unusual routes and horizons not yet covered. The sea and its imprecise distances represent permanent challenges. "Tanto Mar" seeks to revive the feeling of the Portuguese poet Fernando Pessoa, when he wrote: "to dream even if it is impossible".
@inproceedings{nime19-music-Martins, author = {Martins, André L. and Barbosa, Paulo Assis}, title = {Tanto Mar}, pages = {50--51}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music013.pdf} }
-
Cassia Carrascoza and Felipe Merker Castellani. 2019. Tempo Transversal – Flauta Expandida. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 52–55.
Download PDF“Tempo Transversal – Flauta Expandida” aims to establish a computer-controlled catalyst, which simultaneously combines and extends the flutist’s body actions, electronic sounds and the performative physical space. Fragments of the flute performance, captured in real time by video cameras, together with pre-recorded images, build the visual projection. The flute player performs two pieces of experimental music for flute and electronics. All these heterogeneous elements are interrelated with each other in a network mediated by the computer. The result is a continuously unfolding interactive performance, which intends to manipulate settings of space-time perception. The proposal is grounded in the Brazilian contemporary repertoire for amplified bass flute and electronic sounds.
@inproceedings{nime19-music-Carrascoza, author = {Carrascoza, Cassia and Castellani, Felipe Merker}, title = {Tempo Transversal – Flauta Expandida}, pages = {52--55}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music014.pdf} }
-
Rob Hamilton. 2019. Trois Machins de la Grâce Aimante (Coretet no. 1). Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 56–59.
Download PDFTrois Machins de la Grâce Aimante is a composition intended to explore Twenty-First century technological and musical paradigms. At its heart Trois Machins is a string quartet fundamentally descended from a tradition that spans back to the 18th century. As such, the work primarily explores timbral material based around the sound of a bowed string, in this case realized using a set of physically modeled bowed strings controlled by Coretet, a virtual reality string instrument and networked performance environment. The composition - for four performers, preferably from an existing string quartet ensemble - takes the form of three distinct movements, each exploring different capabilities of the instrument itself and requiring different forms of communication and collaboration between the four performers.
@inproceedings{nime19-music-Hamilton, author = {Hamilton, Rob}, title = {Trois Machins de la Grâce Aimante (Coretet no. 1)}, pages = {56--59}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music015.pdf} }
-
Paul Stapleton. 2019. uncertain rhythms. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 60–62.
Download PDFThis work is a continuation of my research into developing new performance ecosystems for improvisation. For this project I developed a new volatile assemblage, aka VOLA. My self-designed musical instruments are shaped by my history as a performer working in acoustic, mechanical, electronic and digital musics, blending and exploring the boundaries and breaking points of these different domains. My instruments support many of my existing techniques originally developed on more conventional instruments, while also affording the development of extended and novel techniques and performance strategies. In much of my work I am particularly focused on the exploration of musical timbre and texture; however, for this project my attention is also directed towards time, flow, pulse, duration, friction, disruption – in short, qualitative rhythms and defamiliarisation.
@inproceedings{nime19-music-Stapleton, author = {Stapleton, Paul}, title = {uncertain rhythms}, pages = {60--62}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music016.pdf} }
-
Çağri Erdem, Katja Henriksen Schia, and Alexander Refsum Jensenius. 2019. Vrengt: A Shared Body-Machine Instrument for Music-Dance Performance. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 63–65.
Download PDFWhat if a musician could step outside the familiar instrumental paradigm and adopt a new embodied language for moving through sound with a dancer in true partnership? And what if a dancer’s body could coalesce with a musician’s skills and intuitively render movements into instrumental actions for active sound-making? Vrengt is a multi-user instrument, specifically developed for music-dance performance, with a particular focus on exploring the boundaries between standstill vs motion, and silence vs sound. We sought to create a work for one hybrid corporeality, in which a dancer and a musician would co-creatively and co-dependently interact with their bodies and a machine. The challenge, then, was how two performers with distinct embodied skills could unite in a continuous entanglement of intentions, senses and experiences to control the same sonic and musical parameters. This was conceptually different from what they had done before in the context of interactive dance performances.
@inproceedings{nime19-music-Erdem, author = {Erdem, Çağri and Schia, Katja Henriksen and Jensenius, Alexander Refsum}, title = {Vrengt: A Shared Body-Machine Instrument for Music-Dance Performance}, pages = {63--65}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music017.pdf} }
-
Paulo Assis Barbosa and Miguel Antar. 2019. We Bass: inter(actions) on a hybrid instrument. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, pp. 66–67.
Download PDFThe key to a collective process of free improvisation is the interaction, dependence and surrender of its parts, so that the resulting sound flux is more than the sum of each individual layer. The We Bass performance is an exploration of the symbiosis of two performers playing the same instrument: their actions have direct consequences on the resulting sound, challenging the other player with instability and interference. From the experiments of the English scientist Thomas Young (1773-1829) on the phenomena of diffraction and interference of light waves, we observe that interferences generated by overlapping light waves can have an annihilating character when they are out of phase (destructive interference), or a reinforcing character when in phase (constructive interference). From this reflection we try to deepen the discussion about the interference of the performers’ inputs in a free improvisation session. We seek a model of connection between the performers that promotes processes of creation in free improvisation, exploring the dialectics between reinforcement actions (processes of interaction that reinforce a certain sonic moment) and movement actions (processes that destabilize and transform the flow). We Bass is a duo performance exploring the interactions between the musicians playing one hybrid machine: an electric upright bass guitar with live electronics processing. The instrument consists of an electric upright bass with movement sensors and a live processing machine with a controller that interacts with the sensors, changing some processing parameters and some controller mapping settings, creating an unstable ground for the musicians.
@inproceedings{nime19-music-Barbosa, author = {Barbosa, Paulo Assis and Antar, Miguel}, title = {We Bass: inter(actions) on a hybrid instrument}, pages = {66--67}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music018.pdf} }
-
Federico Visi (ed.). 2019. NIME 2019 Concert Program. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, p. 5.
Download PDF
@inproceedings{nime19-music-program, title = {NIME 2019 Concert Program}, pages = {5}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_music0II.pdf} }
-
Federico Visi (ed.). 2019. NIME 2019 Program Committee Members. Music Proceedings of the International Conference on New Interfaces for Musical Expression, UFRGS, p. 6.
Download PDF
@inproceedings{nime19-music-PC-members, title = {NIME 2019 Program Committee Members}, pages = {6}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Visi, Federico}, year = {2019}, month = jun, publisher = {UFRGS}, address = {Porto Alegre, Brazil}, url = {http://www.nime.org/proceedings/2019/nime2019_musicIII.pdf} }
2016
-
Jesse Allison. 2016. Causeway. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Allison2016, author = {Allison, Jesse}, title = {Causeway}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
James Andean. 2016. Hyvät matkustajat. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: Hyvät matkustajat (2014) (Finnish for ’Dear Travellers’, but also for ’The Good Travellers’) began life as a "sonic postcard from Finland", using soundscape field recordings from around the country. This turned out to be only the first stop on its journey, however. The original material was later further developed as material for sonic exploration and spectral transformations, with the external spaces of the original version taking a sharp digital turn inwards, to chart internal spectral landscapes, together with the soundmarks and soundscapes of its first incarnation. Everything in Hyvät matkustajat is made from the original field recordings which first gave birth to the piece.
@inproceedings{nime2016-music-Andean2016, author = {Andean, James}, title = {Hyvät matkustajat}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Leah Barclay. 2016. Ground Interference - The Listen(n) Project. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: Ground Interference draws on short recordings from each location I visited in spring 2014 with a particular focus on Joshua Tree National Park, Jornada Biosphere Reserve, Mojave Desert, and Death Valley National Park. These fragile desert environments are inhabited by thousands of species all part of a delicate ecosystem that is in a state of flux induced by changing climates. The transfixing acoustic ecologies of the southwest deserts demand a stillness that encourages a deeper environmental awareness and engagement. In many instances during our field trip we struggled to find locations without human interference. The distant hum of highway traffic and relentless airplanes under the flight path from LAX were expected, yet we also encountered unexpected sounds interfering with the acoustic ecologies of the land. These range from an obscure reverberating vending machine in Death Valley National Park to rattling power lines in the Jornada Biosphere Reserve that were so loud I could feel the vibrations through my feet.
@inproceedings{nime2016-music-Barclay2016, author = {Barclay, Leah}, title = {Ground Interference - The Listen(n) Project}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Stephen Beck. 2016. Quartet for Strings. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Beck2016, author = {Beck, Stephen}, title = {Quartet for Strings}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Stephen David Beck and Scott Smallwood. 2016. "From Uganda" Mara Helmuth. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Beck2017, author = {Beck, Stephen David and Smallwood, Scott}, title = {"From Uganda" Mara Helmuth}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Alice Bennett. 2016. Echolocation Suite. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: Three short pieces for flute and micro-bats (world premiere). This work uses data collected by the Australian environmental scientist Dr. Lindy Lumsden in her research on native Australian micro-bats. It uses data from bat-detecting devices: ultrasonic recording devices that recognize bat calls and transpose them down to the human hearing range. The data is analysed in the form of a spectrogram, and each species of bat is discerned by the shape and range of its calls. This piece uses the pitch and rhythm of bat calls as source material for the structure of each movement, and also uses the transposed calls throughout. The recordings are triggered at certain frequencies and dynamics of the flute via Max MSP, setting bats flying across the room (in 4 channels). The flute mimics different types of bat calls, triggering and reacting to the recordings and using its inherent flexibility to create a different voice in each register. I. Victoria (circa 5’). There are 21 species of native bats in Victoria, all with unique calls above the human hearing range. Like those of birds, these calls occur at different frequency levels so that different species of bat may co-exist without disturbing each other. A bat’s call bounces off the objects around it, allowing it to ‘see’ at night and creating a beautiful cacophony that no one ever notices. II. Melbourne (circa 5’). Did you think that bats only live in the bush? 17 of the 21 species of bats in Victoria can be found in metropolitan Melbourne, roosting in the hollows of our 100+-year-old trees. These fascinating creatures go largely unnoticed by all except the odd cat due to their size (most adult micro-bats fit into a matchbox), speed, and auditory range (only a few species can be heard by humans, including the White-striped Freetail Bat). These bats are insectivorous, and without them we’d be inundated with mosquitos and bugs. III. Southern Bent-Wing Bat (circa 6’). Very little is known about this curious endangered species other than its secretive breeding place in a cave somewhere in South-West Victoria. These bats can be found all over Victoria, but unlike any other species of bat, they travel hundreds of miles to breed in one place. No one knows how the young bats know where to go: they do not fly in flocks like birds, so there is no way for them to follow each other. This is one of the questions that Dr. Lindy Lumsden hopes to answer in her research.
@inproceedings{nime2016-music-Bennett2016, author = {Bennett, Alice}, title = {Echolocation Suite}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Henning Berg. 2016. Improvising with Tango. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Berg2016, author = {Berg, Henning}, title = {Improvising with Tango}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Oliver Bown. 2016. DIADs - The Ford Transit…. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Bown2016, author = {Bown, Oliver}, title = {DIADs - The Ford Transit…}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Brigid Burke. 2016. Coral Bells Movt.2. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: Coral Bells explores the diverse overtone and microtone sounds and origins of the Federation Hand Bells and bass clarinet, carried into the visual domain together with discrete sounds of the coral ecosystems of Fitzroy Island, Northern Australia. The work brings new life to the Federation Hand Bells, providing deepening connections with the Australian landscape. It is the conversation between the audio and the dead coral that accentuates the audio-visual, reflecting the translucent Federation Bell sounds, bass clarinet, glass and dead coral. The acoustic resonators vibrate with the coral and are recreated as visuals of moving glass objects. These sounds transform into acousmatic sounds. The colours and textures within the visuals are layered in white/grey, sepia, hints of pastel colours, burnt reds, yellows and gold, creating a thick timbral texture that forms the video voice. Subtle high-pitched bell sounds and gritty sand sounds dominate throughout, with the bass clarinet periodically joining the drones with discordant multiphonics and flourishes of notes. Subsequent acoustic and visual motifs emerge sonically and visually, creating timbral layers of the interpreted coral and glass reflections.
@inproceedings{nime2016-music-Burke2016, author = {Burke, Brigid}, title = {Coral Bells Movt.2}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
David Burraston. 2016. Rainwire. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: Rainwire encompasses the investigation of rainfall & its application as a medium for artistic, cultural & scientific exchange. The Rainwire project includes the development of a prototype Acoustic Rain Gauge using suspended cables (long wire instruments), subsequently expanded through various collaborations in a range of creative & environmental contexts. Rainwire is an experimental approach to the technological appropriation of agriculture-based objects for art and science, with particular emphasis on climate change issues and agriculture. This performance will present a live laptop mix of environmental sonification recordings from the newly built Rainwire prototype. Previous work on Rainwire has been conducted on shared instruments; this performance will be an opportunity to present the newly built dedicated Rainwire prototype in public for the first time in Australia. Long-wire instruments are made from spans of fencing wire across the open landscape. Rainwire developed from using contact mic recordings of rainfall ‘playing’ the long wire instruments for my music compositions. This enabled a proof-of-concept study, to the extent that the audio recordings demonstrate a wide variety of temporal & spatial rain event complexity. This suggests that environmental sonification has great potential to measure rainfall accurately & address recognized shortcomings of existing equipment & approaches in meteorology. Rain-induced sounds from long wire instruments have a wide range of unique, audibly recognisable features. All of these sonic features exhibit dynamic volume & tonal characteristics, depending on the rain type & environmental conditions. Aside from the vast array of creative possibilities, the high spatial, temporal, volume & tonal resolution could significantly advance knowledge of rainfall event profiles, intensity & microstructure. The challenge lies in identifying distinctive sound patterns & relating them to particular types of rainfall events. Rainwire goes beyond simple sonification of data; it embeds technology & data collection within cultural contexts. With rainfall as a catalyst to draw inspiration from, artists, scientists & cultural groups are key to informing science & inciting new creative modalities.
@inproceedings{nime2016-music-Burraston2016, author = {Burraston, David}, title = {Rainwire}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Travis Thatcher & Peter Bussigel. 2016. Danger Music No. 85. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Bussigel2016, author = {Thatcher, Travis and Bussigel, Peter}, title = {Danger Music No. 85}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Peter Bussigel. 2016. Ndial Performance. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Bussigel2017, author = {Bussigel, Peter}, title = {Ndial Performance}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Joe Cantrell. 2016. Blackbox Loops. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Cantrell2016, author = {Cantrell, Joe}, title = {Blackbox Loops}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Zubin Kanga & Benjiman Carey. 2016. Taking the Auspices. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Carey2016, author = {Kanga, Zubin and Carey, Benjamin}, title = {Taking the Auspices}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Nicole Carroll. 2016. Everything In Its Place. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Carroll2016, author = {Carroll, Nicole}, title = {Everything In Its Place}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Richard Cyngler. 2016. Music for various groups of performers (after Lucier). Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Cyngler2016, author = {Cyngler, Richard}, title = {Music for various groups of performers (after Lucier)}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Jon Drummond. 2016. Light Traces. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Drummond2016, author = {Drummond, Jon}, title = {Light Traces}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Arne Eigenfeldt. 2016. Machine Songs. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Eigenfeldt2016, author = {Eigenfeldt, Arne}, title = {Machine Songs}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Karlheinz Essl. 2016. Lexicon Sonate. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Essl2016, author = {Essl, Karlheinz}, title = {Lexicon Sonate}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Sean Foran. 2016. Improvisations with the other. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Foran2016, author = {Foran, Sean}, title = {Improvisations with the other}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Ben Freeth. 2016. Bio-vortex: Exploring Wet. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Freeth2016, author = {Freeth, Ben}, title = {Bio-vortex: Exploring Wet}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Sam Gillies. 2016. Shelter. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: Working almost exclusively at a very soft volume, Shelter inverts the relationship between the source sound material and its experience in the real world, placing very large sounds (sourced from field recordings) at the threshold of audibility while audio artifacts are brought to the forefront of our focus to act as recognisable musical material. By utilising a soft dynamic, all audience members are able to hear each channel more equally, regardless of their position in the performance space. This new version for bass clarinet, electric guitar, and electronics expands the original electronic composition into something more lively and environmentally focused. The compositional intentions of the original Shelter remain at play here: this version still seeks to address the assumptions of multichannel listening, while affecting an environment of sound in preference to an experience of sound. However, this electroacoustic version adds a little bit of much needed chaos, allowing performers to interact with and manipulate this sonic environment. About the performers: Cat Hope (bass flute), Lindsay Vickery (bass clarinet), Aaron Wyatt (viola).
@inproceedings{nime2016-music-Gillies2016, author = {Gillies, Sam}, title = {Shelter}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Georg Hajdu. 2016. Just Her - Jester - Gesture. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Hajdu2016, author = {Hajdu, Georg}, title = {Just Her - Jester - Gesture}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Vicki Hallett. 2016. Elephant Talk. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: The Elephant Listening Project at Cornell University is the basis of the Elephant Talk/Elephant Listening Project music performances. These present not only logistical difficulties but also musical ones. It took 2-3 years of attempting to confirm the possibility of the project with Cornell University; the researchers and contacts were, of course, deep in Africa recording the sounds for their research. Threats of poaching are a reality, and in one instance, although the researcher reached safety, the elephants weren’t so lucky. Cornell University uses a variety of technological platforms for the research, both for recording and for processing these recordings. The music created also uses a variety of technological and compositional methods, both to utilise the sounds and to create something that is inspiring and innovative and becomes a whole listening experience. By using sounds in different formats, for example infrasonic recordings sampled so that humans can hear them as well as regular files, the aim is to create relationships between the natural environment of the forest elephants and the other recorded acoustic occurrences, while incorporating various instruments to create a conversation between the sonic environment, performer and listener.
@inproceedings{nime2016-music-Hallett2016, author = {Hallett, Vicki}, title = {Elephant Talk}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Cat Hope & Stuart James. 2016. Chunk. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-James2016, author = {Hope, Cat and James, Stuart}, title = {Chunk}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Zubin Kanga. 2016. Morphosis for piano. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Kanga2016, author = {Kanga, Zubin}, title = {Morphosis for piano}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Jonghyun Kim. 2016. Live Performance for Leappmotion. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Kim2016, author = {Kim, Jonghyun}, title = {Live Performance for Leappmotion}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Donna Hewitt & Julian Knowles. 2016. Doppelgänger³ Macrophonics². Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Knowles2016, author = {Hewitt, Donna and Knowles, Julian}, title = {Doppelgänger³ Macrophonics²}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Shawn Lawson. 2016. Owego System Trade Routes. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Lawson2016, author = {Lawson, Shawn}, title = {Owego System Trade Routes}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Sang Won Lee. 2016. Live Writing: Gloomy Streets. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Lee2016, author = {Lee, Sang Won}, title = {Live Writing: Gloomy Streets}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Christos Michalakos. 2016. Augmented Drum-Kit: Path Finder. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Michalakos2016, author = {Michalakos, Christos}, title = {Augmented Drum-Kit: Path Finder}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Amy Alexander & Curt Miller. 2016. Composition #1 for PIGS (Percussive Image Gestural System). Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Miller2016, author = {Alexander, Amy and Miller, Curt}, title = {Composition #1 for PIGS (Percussive Image Gestural System)}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Stephan Moore. 2016. Basaur. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: Basaur is a structured improvisation for software, microphones, and objects, performed through a multichannel sound system. Using simple, readymade household devices as the primary sound source, Basaur unfolds as a guided exploration of the small mechanical drones and noises that occupy the edges of our quotidian sonic awareness. Using both pre-recorded and live-performed sound sources, textures are layered and connected, building to a richly detailed environment of active sounds – background becomes foreground, and the everyday annoyances of modern convenience take on a full-throated presence that is by turns lyrical and menacing.
@inproceedings{nime2016-music-Moore2016, author = {Moore, Stephan}, title = {Basaur}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Johannes Mulder. 2016. On Solo. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: The performance is part of the ongoing research project into Karlheinz Stockhausen’s historic work Solo (Solo, für Melodie-Instrument mit Rückkopplung 1965-6). Together with my colleague Dr. Juan Parra Cancino from ORCIM Ghent we are teasing out the consequences of the (now common) software replacement of the elaborate tape delay system that was used in the time of the work’s inception.
@inproceedings{nime2016-music-Mulder2016, author = {Mulder, Johannes}, title = {On Solo}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Yoshihito Nakanishi. 2016. TRI=NITRO. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Nakanishi2016, author = {Nakanishi, Yoshihito}, title = {TRI=NITRO}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Yoshihito Nakanishi. 2016. Powder Box. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Nakanishi2017, author = {Nakanishi, Yoshihito}, title = {Powder Box}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Kiran Bhumber & Nancy Lee Norah Lorway. 2016. Hollow Vertices. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-NorahLorway2016, author = {Lorway, Norah and Bhumber, Kiran and Lee, Nancy}, title = {Hollow Vertices}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Benjamin O’Brien. 2016. Along the Eaves. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: "along the eaves" is part of a series that focuses on my interest in translational procedures and machine listening. It takes its name from the following line in Franz Kafka’s “A Crossbreed [A Sport]” (1931, trans. 1933): “On the moonlight nights its favourite promenade is along the eaves.” To compose the work, I developed custom software written in the programming languages of C and SuperCollider. I used these programs in different ways to process and sequence my source materials, which, in this case, included audio recordings of water, babies, and string instruments. Like other works in the series, I am interested in fabricating sonic regions of coincidence, where my coordinated mix of carefully selected sounds suggests relationships between the sounds and the illusions they foster.
@inproceedings{nime2016-music-OBrien2016, author = {O'Brien, Benjamin}, title = {Along the Eaves}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Garth Paine. 2016. Becoming Desert - The Listen(n) Project. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: Becoming Desert draws on the experience of sitting or lying down silent in the desert for several hours at a time to make sound recordings. The field recordings I made in four deserts of the American Southwest are the basis of this work. When listening to the desert sounds through headphones at the time of recording, one is aware of a kind of hyper-real sonic environment. The amplified soundfield in the headphones is surreal in its presence and accuracy and multiplies my direct experience of listening many times.
@inproceedings{nime2016-music-Paine2016, author = {Paine, Garth}, title = {Becoming Desert - The Listen(n) Project}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Garth Paine. 2016. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFAbout the performer: Garth is particularly fascinated with sound as an experiential medium, both in musical performance and as an exhibitable object. This passion has led to several interactive responsive environments in which the inhabitant generates the sonic landscape through their presence and behaviour. Garth has composed several music scores for dance generated through video tracking of the choreography, and more recently using bio-sensing on the dancer’s body. His immersive interactive environments have been exhibited in Australia, Europe, Japan, the USA, Canada, the UK, Hong Kong and New Zealand. Garth Paine is internationally regarded as an innovator in the field of interactivity in electronic music and media arts. He gained his PhD in interactive immersive environments from the Royal Melbourne Institute of Technology, Australia in 2003, and completed a Graduate Diploma in software engineering the following year at Swinburne University. All a long way from his Bachelor of classical flute performance at the Conservatorium of Tasmania. Garth is Associate Professor in Digital Sound and Interactive Media at the School of Arts, Media + Engineering at Arizona State University in the USA. His previous post was as Associate Professor of Sound Technologies at the University of Western Sydney, where he established the Virtual, Interactive, Performance Research environment (VIPRe). He is often invited to run workshops on interactivity for musical performance and is commissioned to develop interactive systems for realtime musical composition for dance and theatre performances.
@inproceedings{nime2016-music-Paine2017, author = {Paine, Garth}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Andrew Pfalz. 2016. Of Grating Imperma. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Pfalz2016, author = {Pfalz, Andrew}, title = {Of Grating Imperma}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Romain Michon. 2016. A Minor Chord for BladeAxe. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Romain2016, author = {Michon, Romain}, title = {A Minor Chord for BladeAxe}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Stephan Moore and Scott Smallwood. 2016. Losperus. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Smallwood2016, author = {Moore, Stephan and Smallwood, Scott}, title = {Losperus}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Dj Sniff. 2016. Live performance. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Sniff2016, author = {Sniff, Dj}, title = {Live performance}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Andrew Sorensen. 2016. Barely a Piano. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Sorensen2016, author = {Sorensen, Andrew}, title = {Barely a Piano}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Andrew Sorensen. 2016. Splice. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Sorensen2017, author = {Sorensen, Andrew}, title = {Splice}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Andrew Stewart. 2016. Ritual for Karlax. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Stewart2016, author = {Stewart, Andrew}, title = {Ritual for Karlax}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Atsushi Tadokoro. 2016. Membranes. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Tadokoro2016, author = {Tadokoro, Atsushi}, title = {Membranes}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Juan Carlos Vasquez and Koray Tahiroğlu. 2016. NOISA Étude 2. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: "NOISA Étude 2" is the second set of performance instructions created to showcase the compelling, evolving and complex soundscapes that are only possible when operating the NOISA instruments, integrating the system’s autonomous responses as part of a musical piece. The multi-layered sound interaction design is based on radical transformations of acoustic instruments performing works from the classical music repertoire. This second "étude" is based entirely on interaction with spectrum-complementary phase vocoders. The system is fed with variations of a fixed musical motif, encouraging it to recognise elements of the motif and create its own set of different versions, emulating a human compositional process. The Myo armband is also used creatively as an independent element for dynamic control, drawing on raw data extracted from muscle tension.
@inproceedings{nime2016-music-Tahiroglu2016, author = {Vasquez, Juan Carlos and Tahiroğlu, Koray}, title = {NOISA Étude 2}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Koray Tahiroğlu. 2016. KET Conversations. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Tahiroglu2017, author = {Tahiroğlu, Koray}, title = {KET Conversations}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Paul Vandemast-Bell. 2016. Deformed Electronic Dance Music. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-VandemastBell2016, author = {Vandemast-Bell, Paul}, title = {Deformed Electronic Dance Music}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Lindsay Vickery. 2016. Nature Forms II for Flute, Clarinet, Viola, Percussion, Hybrid Field Recording and Electronics. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDFProgram notes: Nature Forms II is an eco-structuralist work, maintaining what Opie and Brown term the “primary rules” of “environmentally-based musical composition”: that “structures must be derived from natural sound sources” and that “structural data must remain in series”. Nature Forms II explores the possibility of recursive re-interrogation of a field recording through visualization and resonification/resynthesis via machine and performative means. The source field recording is contrasted with artificially generated versions created with additive, subtractive and ring modulation resynthesis. Interaction between the live performers and the electronic components is explored through “spectral freezing” of components of the field recording, creating spectrally derived chords from features of the recording (bird sounds and a rusty gate) that are then transcribed into notation for the instrumentalists, and through temporal manipulation of the recording, allowing complex bird calls to be emulated on a human time-scale. Cat Hope, bass flute; Lindsay Vickery, clarinet; Aaron Wyatt, viola; Vanessa Tomlinson, percussion.
@inproceedings{nime2016-music-Vickery2016, author = {Vickery, Lindsay}, title = {Nature Forms II for Flute, Clarinet, Viola, Percussion, Hybrid Field Recording and Electronics}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Lindsay Vickery. 2016. Detritus (2015). Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Vickery2017, author = {Vickery, Lindsay}, title = {Detritus (2015)}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
-
Bernt Isak Wærstad. 2016. Cosmo Collective. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Griffith University.
Download PDF
@inproceedings{nime2016-music-Waerstad2016, author = {Wærstad, Bernt Isak}, title = {Cosmo Collective}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Brown, Andrew and Gifford, Toby}, year = {2016}, month = jun, publisher = {Griffith University}, address = {Brisbane, Australia} }
2012
-
Robert Alexander, David Biedenbender, Anton Pugh, Suby Raman, Amanda Sari Perez, and Sam L. Richards. 2012. Thought.Projection. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: The MiND Ensemble (Music in Neural Dimensions) is a new-media performance group that utilizes custom interfaces to explore the mind-machine-music connection. The traditional realization of the creative process has been as follows: there is an artist, a thought process, and a fixed medium which actualizes those thoughts. Neurofeedback radically shifts this paradigm. Now there is an artist and a dynamic medium that actively interfaces with the thought processes of the artist himself, drastically reshaping the way we understand the creative process. The MiND Ensemble promotes a rich awareness in which the mind is the creative medium. All projection and audio processing in this piece are driven in real time, with data gathered from the Emotiv EPOC headset. Composer(s) Credits: Robert Alexander, David Biedenbender, Anton Pugh, Suby Raman, Amanda Sari Perez, Sam L. Richards Instrumentalist(s) Credits: Jeremy Crosmer (violoncello), Robert Alexander (MiND Synth / Emotiv), Anton Pugh (MiND Synth / Emotiv) Artist(s) Biography: Robert Alexander is a Sonification Specialist with the Solar Heliospheric Research group at the University of Michigan, where he is pursuing a PhD in Design Science. He was awarded a JPFP Fellowship from NASA, an Outstanding Achievement award from ICAD, and is an Artist in Residence with the Imagine Science Film Festival. He has collaborated with artists such as DJ Spooky, and performed on several international stages. He founded the MiND Ensemble in 2010. David Biedenbender is currently a doctoral student in music composition at the University of Michigan. His first musical collaborations were in rock and jazz bands as an electric bassist and in jazz and wind bands as a bass trombonist and euphonium player. His present interests include working with everyone from classically trained musicians to improvisers, and from fixed electronics to brain data. Anton Pugh is a master's student in Electrical Engineering: Systems (Signal Processing concentration) at the University of Michigan. Presently he is working on expanding his knowledge of the Processing and iOS platforms, especially as they apply to the MiND Ensemble. His primary hobby is designing and building custom electronic instruments and new musical interfaces. He is also an active musician and plays viola in the Campus Symphony Orchestra. Suby Raman is a composer, conductor, polyglot and linguist. His major artistic passion is drawn from language itself: the basic aural and mental components of language, how it determines, separates and unites cultures, and its influence (or lack thereof) on our perception and expression of reality. He has conducted research in brain-computer interface technology, which assists people afflicted by ALS and spinal cord injuries. Amanda Sari Perez is a researcher with the Neural Engineering lab at the University of Michigan. She is currently working with microelectrode arrays to record brain activity from implanted sites. In 2009 she co-founded the Ann Arbor HackerSpace, a DIY community engaged in hands-on learning. For the past 3 years she has created artistic installations for the Burning Man festival, including a performance that deconstructs participants’ notions of the self. Amanda is with the MiND Ensemble to work toward lowering the barrier for creative expression. Sam L. Richards is a composer, artist, and researcher with a penchant for interdisciplinary collaboration and an appetite for creative engagement of unwieldy conceptual problems.
As a composer he has worked with media artists, filmmakers, animators, and choreographers, as well as making music for the concert hall. Although formally trained as a musician, he also produces video installations, visual and aural media, and creative writing, and regularly steps off the beaten path in order to engage new things in new ways. Jeremy Crosmer is a gifted young professional cellist and composer. After completing a double major in music and mathematics at Hendrix College, he went on to receive multiple graduate degrees from the University of Michigan by the age of 23. As a cellist, Crosmer has performed across the country, soloing with orchestras in Arkansas and attending music festivals from the Music Academy of the West to the Tanglewood Music Center. An avid promoter of new music, Crosmer has both commissioned and premiered dozens of works by composers at Michigan and elsewhere. His performance dissertation at the University of Michigan is a study of the music of Paul Hindemith and cello sonatas by French composers during World War I. Concert Venue and Time: Lydia Mendelssohn Theatre, Tuesday May 22, 7:00pm
@incollection{nime2012-music-Alexander2012, author = {Alexander, Robert and Biedenbender, David and Pugh, Anton and Raman, Suby and Perez, Amanda~Sari and Richards, Sam~L.}, title = {Thought.Projection}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Mark Applebaum. 2012. Aphasia. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Aphasia (2010), for solo performer and two-channel tape, was commissioned by the GRM, Paris, and composed for virtuoso singer Nicholas Isherwood. The tape, an idiosyncratic explosion of warped and mangled sounds, is made up exclusively of vocal samples, all provided by Isherwood and subsequently transformed digitally. Against the backdrop of this audio narrative, an elaborate set of hand gestures is performed: an assiduously choreographed sign language of sorts. Each gesture is fastidiously synchronized to the tape in tight rhythmic coordination. In the context of NIME, the piece is noteworthy for its deliberate, if unintentionally political, abstinence from contemporary technology. Ancillary questions arise, such as “What are the present limits of gesture control?”; “Do these limitations present unwelcome pressures on the boundaries of artistic imagination and creative capacity?”; and “How do we learn to recognize when it is artistically prudent to eschew emerging tools?” Composer(s) Credits: Mark Applebaum Instrumentalist(s) Credits: Mark Applebaum Artist(s) Biography: Mark Applebaum is Associate Professor of Composition at Stanford University, where he received the 2003 Walter J. Gores Award for excellence in teaching. He received his Ph.D. in composition from the University of California at San Diego, where he studied principally with Brian Ferneyhough. His solo, chamber, choral, orchestral, operatic, and electroacoustic work has been performed throughout the United States, Europe, Africa, South America, and Asia. Many of his recent works are characterized by challenges to the conventional boundaries of musical ontology. Concert Venue and Time: Lydia Mendelssohn Theatre, Monday May 21, 9:00pm
@incollection{nime2012-music-Applebaum2012, author = {Applebaum, Mark}, title = {Aphasia}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Mercedes Blasco. 2012. The Theremin Orchestra. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: The Theremin Orchestra is a composition for three voices and a modular system of four spheres with built-in Theremin sensors. Two of those spheres control different effects on the voices and the rest are played as Theremin instruments. The performance is presented as a sound event where initially the three voices appear raw and naked and, as the composition unfolds, the voices are increasingly distorted through different effects applied with the Theremin controllers. At the climax of its progression the other two Theremin spheres become audible, merging their sound with the mesh of reshaped vocal sources until it is no longer possible to distinguish where the human ends and the machine begins. Composer(s) Credits: Mercedes Blasco Instrumentalist(s) Credits: Mercedes Blasco (voice, Theremin controllers, EMS synth), Thessia Machado and Sonia Megías (voice, Theremin instrument) Artist(s) Biography: Merche Blasco: Trained as a Telecommunications Engineer, Merche Blasco developed, in parallel to her studies, a more creative path related to music, video, installation and performance. She created her alter ego “Burbuja” as a vehicle for her own musical exploration and, since its conception, has participated and collaborated with various artists, establishing a strong relationship between different mediums of artistic expression and her own musical direction, such as Lucy Orta at the Venice Biennale, Chicks on Speed and Cristian Vogel. Her debut, “burbuja” (station55 records), was presented at Sónar 2007 and has toured different cities in Europe, the USA and Canada in the past years: Mapping Festival (Geneva), Sonic Art Circuits (Washington), Queens Museum of Art (New York). Thanks to a Fulbright grant she is currently an MPS candidate in the Interactive Telecommunications Program (NYU), where she is mainly researching new tools for electronic music performance. Thessia Machado, Brazil/NY, investigates the physicality of sound and its effect on our perception of space. Many of her recent sculptures and installations also function as unorthodox instruments: pieces that have a real-time, live component. The expressive potential is active and changeable as the viewer interacts and performs with it. Thessia’s installations and video pieces have been exhibited in New York, London, Philadelphia, Paris, Amsterdam, Dublin, Berlin and Athens. She has been awarded residencies at the MacDowell Colony, Yaddo, the Atlantic Center for the Arts, the Irish Museum of Modern Art and the Vermont Studio Center, and she is a recipient of fellowships from the New York Foundation for the Arts, The Experimental Television Center and The Bronx Museum. Performing as link, Thessia Machado, a self-avowed noisician, employs a changing line-up of handmade, found and modified instruments to build driving, meditative soundscapes. Sonia Megías was born on June 20th, 1982 in Almansa, a village in the southeast of Spain. Since she was a kid, she has been captivated by the arts, nature and spirituality. Even today, some years later, she tries to interweave these beautiful disciplines, with the goal of transmitting to the world her perception of Beauty or Truth. Thanks to the intensity of her musical production, she has been living in New York since 2010 on Fulbright and NYU Steinhardt grants. There, she combines her studies at New York University with the composition of her latest commissioned pieces.
Her music has been performed in various concert halls and festivals, including the following: Auditorio 400 at the National Museum of Contemporary Art “Queen Sophia” (2012, 2008); Cervantes Institute of New York (2012, 2011); Houston University, at the Opera Vista Festival (2011); Consulate of Argentina in New York, at a tribute to Alfonsina Storni (2009); Embassy of France in Spain (2009); United Nations Headquarters (2008). Concert Venue and Time: Necto, Wednesday May 23, 9:00pm
@incollection{nime2012-music-Blasco2012, author = {Blasco, Mercedes}, title = {The Theremin Orchestra}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Per Bloland. 2012. Of Dust and Sand. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Of Dust and Sand uses the Electromagnetically-Prepared Piano device, a rack of 12 electromagnets which is suspended over the strings of a piano. Each electromagnet is sent an audio signal and in turn excites its respective string, much like a stereo speaker made from piano strings. In this piece a subset of the magnets remains active throughout, the performer physically silencing the strings by pressing down with fingertips. Thus the instrument becomes a kind of anti-piano: lifting a finger frees a string to vibrate, producing sound. In addition, various items, such as paper and a plastic ruler, rest directly on the strings, further altering the timbre. Remember: everything you hear is entirely acoustic. Of Dust and Sand is dedicated to The Kenners. Composer(s) Credits: Per Bloland Instrumentalist(s) Credits: Daniel Graser (alto saxophone), Veena Kulkarni (piano) Artist(s) Biography: Per Bloland is a composer of acoustic and electroacoustic music whose works have been described as having an “incandescent effect” with “dangerous and luscious textures.” His compositions range from short intimate solo pieces to works for large orchestra, and incorporate video, dance, and custom built electronics. He has received awards and recognition from organizations such as SEAMUS/ASCAP, Digital Art Awards of Tokyo, ISCM, and SCI/ASCAP. He is currently a Visiting Assistant Professor of Computer Music at the Oberlin College Conservatory, and serves as the founding director of OINC, the Oberlin Improvisation and Newmusic Collective. For more information, please see: www.perbloland.com. Daniel Graser: Saxophonist Daniel Graser is emerging as one of the most innovative performers and pedagogues of his generation. A recent recipient of the Doctorate of Musical Arts from the University of Michigan, Dan served as Teaching Assistant to legendary saxophone pedagogue Donald Sinta for the past three years and joined the faculty of the Oakland University School of Music, Theater, and Dance in 2011. Previously, Dan earned a master's degree from the University of Michigan in 2008 and bachelor's degrees in music theory/history and saxophone performance as a student of Dr. Timothy McAllister at the Crane School of Music in 2007. As an orchestral performer, Dan has performed as principal saxophonist with the National Wind Ensemble in Carnegie Hall under H. Robert Reynolds, the Detroit Symphony Orchestra under Leonard Slatkin, the New World Symphony under Michael Tilson Thomas and John Adams, the Ann Arbor Symphony under Arie Lipsky, the University of Michigan Symphony Orchestra under Kenneth Kiesler, the Hot Springs Festival Orchestra under Richard Rosenberg, and the Orchestra of Northern New York under Kenneth Andrews. Dan was selected by the University of Michigan to be featured as a recitalist at the Kennedy Center for the Performing Arts in Washington DC as part of the Millennium Stage Series. Recent and forthcoming performances include world premieres at the University of Michigan, orchestral performances with the New World Symphony and the Detroit Symphony Orchestra, as well as chamber music performances at the Navy Band International Saxophone Symposium and the 2012 North American Saxophone Association Biennial Conference. Veena Kulkarni: A regular performer in southeast Michigan, Veena Kulkarni teaches at the Faber Piano Institute and Madonna University. Veena’s performances have taken her throughout the United States and beyond as both a soloist and collaborator.
In October, Veena won Best Liszt Interpretation in the 2011 Liszt-Garrison International Piano Competition. Veena is the pianist for Eero Trio, whose debut CD entitled Wolf Glen was released in 2010. Wolf Glen features the premiere recording of Christopher Dietz’s Fumeux fume, for clarinet, cello & piano. Veena completed her doctorate in Piano Performance and Pedagogy under Logan Skelton and John Ellis at the University of Michigan. Prior to that, she studied at Indiana University with Emile Naoumoff and Professors Brancart, Auer, Gulli and Tocco and at the Royal Academy of Music with Hamish Milne. Concert Venue and Time: Lydia Mendelssohn Theatre, Tuesday May 22, 7:00pm
@incollection{nime2012-music-Bloland2012, author = {Bloland, Per}, title = {Of Dust and Sand}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Daniel Brophy and Colin Labadie. 2012. Munich Eunuch. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Many of the discourses around technological development in music are deeply concerned with aspects of control, i.e. how one exerts control, or “mastery,” over the technology one uses. However, we propose that technological systems with a certain amount of unpredictability and randomness may also be useful, especially for improvisation. As an improvisation duo, our method often involves designing electronic instruments whose behaviors are somewhat unpredictable. As a result, our entire aesthetic is largely based on “riding” the boundary of control. Working in this way creates a situation where we are often forced to react to, and work with, the unexpected. Our improvisation features a number of handmade and hacked electronic instruments, all of which have been designed to behave somewhat unpredictably. Composer(s) Credits: Instrumentalist(s) Credits: Daniel Brophy (electronics), Colin Labadie (electronics) Artist(s) Biography: Daniel Brophy is a composer, performer and improviser of various musical styles and instrumentations ranging from orchestral and chamber music to extreme metal, sound installations, experimental improvisation and noise. He is a recipient of an SSHRC research grant, the 2012 KW Chamber Orchestra composition prize, the University of Alberta’s President’s Award of Distinction, and a Queen Elizabeth II Graduate Scholarship. Daniel currently resides in Edmonton, Alberta, where he is pursuing a Doctor of Music degree in composition under the supervision of Dr. Scott Smallwood. He is a member of the noise duo MUGBAIT and is proud to have worked with a number of other wonderful musicians, dancers and visual artists such as The Enterprise Quartet, junctQin, Digital Prowess, TorQ, Gerry Morita, Werner Friesen and many others. Daniel is currently developing interactive clothing for dancers, utilizing a combination of high and low technology. Colin Labadie is a composer and performer currently based in Edmonton, Alberta. His musical output ranges from solo, chamber, choral, orchestral, and electroacoustic compositions, to sound installations, multimedia collaboration, experimental improvisation, and noise music. His work is shaped by a broad range of musical influences, at times dealing exclusively with repetition, patterns, and subtle variation, while at others exploring chaos and unpredictability. Colin holds a BMus from Wilfrid Laurier University, where he studied with Linda Catlin Smith and Peter Hatch, and an MMus from the University of Alberta, where he studied with Howard Bashaw, Mark Hannesson, Scott Smallwood, and Andriy Talpash. Currently, he is pursuing a Doctoral degree in Composition at the University of Alberta under the supervision of Scott Smallwood. He is the recipient of SSHRC’s Joseph-Armand Bombardier Master’s and Doctoral Scholarships, the University of Alberta Master’s and Doctoral Recruitment Scholarships, and the President’s Doctoral Prize of Distinction. Concert Venue and Time: Necto, Tuesday May 22, 9:00pm
@incollection{nime2012-music-BrophyLabadie2012, author = {Brophy, Daniel and Labadie, Colin}, title = {Munich Eunuch}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Christopher Burns. 2012. Fieldwork. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Fieldwork is a software environment for improvised performance with electronic sound and animation. Two musicians’ sounding performances are fed into the system, and analyzed for pitch, rhythm, and timbral change. When the software recognizes a sharp contrast in one performer’s textures or gestures, it reflects this change by transforming the sound of the other musician’s performance. Not only are the musicians responding to one another as in conventional improvisation, but they are also able to directly modify their duo partner’s sound through the software. Fieldwork emphasizes rapid, glitchy, and polyrhythmic distortions of the musicians’ performances, and establishes unpredictable feedback processes that encourage unexpected improvisational relationships between the performers and the computer. Composer(s) Credits: Christopher Burns Instrumentalist(s) Credits: Christopher Burns, Andrew Bishop Artist(s) Biography: Christopher Burns is a composer, improviser, and multimedia artist. His instrumental chamber works weave energetic gestures into densely layered surfaces. Polyphony and multiplicity also feature in his electroacoustic music, embodied in gritty, rough-hewn textures. As an improviser, Christopher combines an idiosyncratic approach to the electric guitar with a wide variety of custom software instruments. Recent projects emphasize multimedia and motion capture, integrating performance, sound, and animation into a unified experience. Across these disciplines, his work emphasizes trajectory and directionality, superimposing and intercutting a variety of evolving processes to create form. Christopher is an avid archaeologist of electroacoustic music, creating and performing new digital realizations of classic music by composers including Cage, Ligeti, Lucier, Nancarrow, Nono, and Stockhausen. A committed educator, he teaches music composition and technology at the University of Wisconsin-Milwaukee. He has studied composition with Brian Ferneyhough, Jonathan Harvey, Jonathan Berger, Michael Tenzer, and Jan Radzynski. Andrew Bishop is a versatile multi-instrumentalist, composer, improviser, educator and scholar comfortable in a wide variety of musical idioms. He maintains a national and international career and serves as an Assistant Professor of Jazz and Contemporary Improvisation at the University of Michigan in Ann Arbor. Bishop’s two recordings as a leader have received widespread acclaim from The New York Times, Downbeat Magazine, Chicago Reader, All Music Guide, Cadence Magazine, All About Jazz-New York, All About Jazz-Los Angeles, and the Detroit Free Press, among others. As a composer and arranger he has received over 20 commissions, numerous residencies, and awards and recognition from ASCAP, the Chicago Symphony Orchestra, the Andrew W. Mellon Foundation, the National Endowment for the Arts, Chamber Music of America, and a nomination from the American Academy of Arts and Letters. He has performed with artists in virtually every musical genre. He earned five degrees in music, including a D.M.A. in music composition from the University of Michigan. Concert Venue and Time: Necto, Wednesday May 23, 9:00pm
@incollection{nime2012-music-Burns2012, author = {Burns, Christopher}, title = {Fieldwork}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
James Caldwell. 2012. Texturologie 12: Gesture Studies. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Texturologie 12: Gesture Studies (2011) is the most recent of my series of pieces that explore the creation of intricate continuous-field textures (and borrow the name of a series of paintings by Dubuffet). In this piece, I return to my explorations of the potential of the Wii™ remote to control computer music in performance. This time, I tried to treat the physical gesture as the germ or motive for the music. Some of the gestures are abstract, but some are suggestive of familiar activities like petting a cat, ringing a bell, smoothing wallpaper, playing a guiro, scooping, tapping, or vigorous stirring. (Check out the videos of my other Wii™ pieces on YouTube. Search “Caldwell wii.”) Composer(s) Credits: James Caldwell Instrumentalist(s) Credits: James Caldwell (Wii remotes) Artist(s) Biography: James Caldwell (b. 1957) is Professor of Music at Western Illinois University and co-director of the New Music Festival. He was named Outstanding Teacher in the College of Fine Arts and Communication (2005) and received the inaugural Provost’s Award for Excellence in Teaching. He was named the 2009 Distinguished Faculty Lecturer. He holds degrees from Michigan State University and Northwestern University, where he studied composition, theory, and electronic and computer music. Since 2004 he has studied studio art (drawing, lithography, painting, and sculpture) at WIU as a way to stretch creatively and again experience being a student. Concert Venue and Time: Lydia Mendelssohn Theatre, Wednesday May 23, 7:00pm
@incollection{nime2012-music-Caldwell2012, author = {Caldwell, James}, title = {Texturologie 12: Gesture Studies}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Thomas Ciufo. 2012. Fragments. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Fragments is an improvisational performance piece that utilizes physical treatments inside an acoustic piano, as well as digital treatments provided by computer-based digital signal processing. In addition to using a few simple physical controls (foot pedals and a custom iPad interface), this piece also uses the performed audio stream as a gestural control source. The performed audio stream is analyzed and important features are extracted. The current state and trajectory of these audio features are used to influence the behavior of the real-time signal processing environment. This creates a computer-mediated performance system that combines the capabilities of computation and sound processing with the tactile and expressive intimacy of the prepared acoustic piano. Fragments invites the listener into a unique and complex sonic environment where expectation, repetition, spontaneity, and discovery are intertwined. Composer(s) Credits: Thomas Ciufo Instrumentalist(s) Credits: Thomas Ciufo Artist(s) Biography: Thomas Ciufo is a composer, improviser, sound artist, and researcher working primarily in the areas of electroacoustic improvisational performance and hybrid instrument / interactive systems design. He currently serves as Assistant Professor of Recording Arts and Music Technology in the Department of Music at Towson University. He has been active for many years in the areas of composition, performance, interactive installation, and video work, as well as music technology education. Festival performances include the SPARK festival in Minneapolis, the Enaction in Arts conference in Grenoble, the International Society for Improvised Music conference, the NWEAMO festival, the Extensible Electric Guitar Festival, various NIME conferences, and the ICMC / Ear to the Earth conference. Concert Venue and Time: Lydia Mendelssohn Theatre, Wednesday May 23, 7:00pm
@incollection{nime2012-music-Ciufo2012, author = {Ciufo, Thomas}, title = {Fragments}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Palle Dahlstedt. 2012. Pencil Fields. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: An improvised performance on a custom-built instrument, using a simple pencil drawing as a gestural interface for controlling complex analog synthesis. The interface uses the resistive properties of carbon to create a voltage potential field in the graphite/pencil markings on the paper, via custom movable electrodes made from coins. Control voltages are then extracted from other points on the paper, controlling various aspects of the synthesized sound. The design was inspired by my previous research in complex mappings for advanced digital instruments, and provides a similarly dynamic playing environment for analogue synthesis. The interface is very lo-tech, easy to build, and should be possible to use with any analogue modular synthesizer. Here, I use it with a Bugbrand modular, built by Tom Bugs in Bristol, UK. The interface is presented in more detail in a paper presentation at the NIME conference. Composer(s) Credits: Instrumentalist(s) Credits: Palle Dahlstedt (pencil fields interface & modular synthesizer) Artist(s) Biography: Palle Dahlstedt (b. 1971) is a composer, improviser, pianist and researcher from Stockholm, living in Göteborg, Sweden since 1994. With composition degrees from the Academies of Malmö and Göteborg and a PhD from Chalmers University of Technology in evolutionary algorithms for composition, he is currently the main lecturer in electronic music composition at the Academy of Music and Drama, University of Gothenburg, and artistic director of the Lindblad Studios. He is also associate professor in computer-aided creativity at the Department of Applied IT, performing extensive research in novel technology-based performance and improvisation techniques for electronic and acoustic musicians, and in computer models of artistic creative processes. His music has been performed on six continents and has received several awards; in 2001, for example, he was awarded the prestigious Gaudeamus Prize, the first time it was ever given for an electronic work. He also performs on piano, with and without electronics, and in the electronic free-impro duo pantoMorf. Concert Venue and Time: Necto, Tuesday May 22, 9:00pm
@incollection{nime2012-music-Dahlstedt2012, author = {Dahlstedt, Palle}, title = {Pencil Fields}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Nicolas d’Alessandro and Diemo Schwarz. 2012. DaisyLab, a Phonetic Deconstruction of Humankind. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: DaisyLab is a duet performance for two new interfaces for musical expression that have in common the ability to generate versatile vocal material. Diemo Schwarz’s instrument uses a variety of sensors on top of corpus-based concatenative synthesis, which has been fed with voice sounds for this performance. Nicolas d’Alessandro plays the HandSketch interface over the new MAGE speech synthesizer, bringing tangible inputs to an emerging speech synthesis technique. Both systems have been submitted as long papers for this 2012 edition of NIME. Together these two performers explore the boundaries between vocal and non-vocal sonic spaces, aiming at deconstructing humankind’s most ubiquitous communicative channel through a compositionally directed improvisation, a “comprovisation.” Composer(s) Credits: Instrumentalist(s) Credits: Nicolas d’Alessandro (HandSketch, iPad), Diemo Schwarz (CataRT, gestural controllers) Artist(s) Biography: Nicolas d’Alessandro obtained his PhD in Applied Sciences from the University of Mons in 2009. From a lifelong interest in musical instruments and an acquired taste for speech and singing processing, he incrementally shaped a research topic that aims at using gestural control of sound to gain insights into speech and singing production. He worked with Prof. T. Dutoit for his PhD at the University of Mons between 2004 and 2009. In late 2009 he moved to Canada to take a postdoc position with Prof. S. Fels at the MAGIC Lab, University of British Columbia, where he worked on the DiVA project and organized the first p3s workshop. Since December 2011 he has been back at the University of Mons, where he leads the MAGE project. Nicolas is also an active electroacoustic performer in and around Belgium, playing guitar and invented instruments in various performances. Concert Venue and Time: Necto, Wednesday May 23, 9:00pm
@incollection{nime2012-music-dAlessandroSchwarz2012, author = {d'Alessandro, Nicolas and Schwarz, Diemo}, title = {DaisyLab, a Phonetic Deconstruction of Humankind}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Scott Deal. 2012. Jack Walk. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Jack Walk explores notions of ecstatic energy, control and release. The work begins with live and fixed percussion lines, re-processed into a series of electronic representations of specified structure. This provides a compositional framework that a percussionist interacts with, while in another sonic layer, a laptop musician simultaneously samples and re-processes the live percussion while channeling the audio back into the larger environment. A videographer mixes imagery related to the original compositional notions of ecstatic control and release. Layers of sonic material emanating from the drummer’s kit blur the virtual and real, while the music and imagery evoke imaginary lines tracing physical and conceptual flows of energy. The trio of performers for the NIME 2012 performance of Jack Walk (Deal, Drews, and Munson) comprise the group Big Robot, an Indianapolis-based computer-acoustic trio that creates live, interactive, and media-enriched works. Composer(s) Credits: Scott Deal Instrumentalist(s) Credits: Scott Deal (percussion), Michael Drews (audio electronics), Jordan Munson (video) Artist(s) Biography: Scott Deal has premiered solo, chamber and mixed media works throughout North America, Europe, and Asia. An artist who “displays phenomenal virtuosity” (Artsfuse) and presents a “riveting performance” (Sequenza 21), his recording of John Luther Adams’s Four Thousand Holes, for piano, percussion, and electronics, was listed in New Yorker Magazine’s 2011 Top Ten Classical Recordings. In 2011, he and composer Matthew Burtner were awarded the Internet2 IDEA Award for their co-creation of Auksalaq, a telematic opera. Deal is Professor of Music and Director of the Donald Louis Tavel Arts and Technology Research Center at Indiana University Purdue University Indianapolis (IUPUI). He is the founder and director of the Telematic Collective, a multi-disciplinary artistic research group composed of graduate students and professional collaborators. He also serves on the faculty for the Summer Institute for Contemporary Performance Practice at the New England Conservatory. Michael Drews is a composer, sound artist and computer musician. His work explores unconventional narrative strategies created from transforming contextual identity and the expressive power of cultural artifacts found in particular sonic and visual materials. Present throughout Drews’s work is an interest in performance-based computer virtuosity and improvisational applications of computer technology that expand traditional ideas of musical performance and creativity. Drews is a member of the computer-acoustic ensemble Big Robot and the experimental-electronica duo Mana2. Performances of Drews’s compositions have been featured at SEAMUS, Cinesonika, Electronic Music Midwest, NYC Electronic Music Festival, Studio 300, PASIC, Super Computing Global and IASPM-Canada. Drews holds degrees from the University of Illinois at Urbana-Champaign (D.M.A.), Cleveland State University (M.MUS.) and Kent State University (B.A.). He resides with his family in Indianapolis and is Assistant Professor of Music at Indiana University-Indianapolis (IUPUI). For more information: michaeldrews.org or Twitter.com/MICHAEL-DREWS Jordan Munson is a musician, composer, and multimedia artist. He is a Lecturer in Music and Arts Technology, and an associate of the Donald Louis Tavel Arts and Technology Research Center, both at Indiana University Purdue University Indianapolis (IUPUI).
His works for multimedia and music have been premiered at institutions such as the University of Kentucky, the University of Alaska at Fairbanks and the University of California San Diego. As a video artist, he has shown work at the New York City Electro-Acoustic Music Festival and SEAMUS. Munson’s experimental electronic efforts have resulted in performances alongside artists such as Matmos, R. Luke DuBois and Bora Yoon. He is a member of the computer-acoustic ensemble Big Robot, in which his work focuses on live experimental percussion and electronics. Munson holds degrees from Indiana University in Indianapolis (M.S.M.T.) and the University of Kentucky (B.M.). Concert Venue and Time: Lydia Mendelssohn Theatre, Tuesday May 22, 7:00pm
@incollection{nime2012-music-Deal2012, author = {Deal, Scott}, title = {Jack Walk}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Marco Donnarumma. 2012. Music for Flesh II, interactive music for enhanced body. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Composer(s) Credits: Marco Donnarumma Instrumentalist(s) Credits: Marco Donnarumma (Xth Sense) Artist(s) Biography: Marco Donnarumma: A new media and sonic artist, performer and teacher, Marco Donnarumma was born in Italy and is based in Edinburgh, UK. Weaving a thread around biomedia research, musical and theatrical performance, participatory practices and subversive coding, Marco looks at the collision of critical creativity with humanized technologies. He has performed and spoken in 28 countries worldwide at leading art events, specialized festivals and academic conferences. He has been artist in residence at Inspace (UK) and the National School of Theatre and Contemporary Dance (DK). His work has been funded by the European Commission, Creative Scotland and the Danish Arts Council. In February 2012 Marco was awarded the first prize in the Margaret Guthman Musical Instrument Competition (Georgia Tech Center for Music Technology, US) for the Xth Sense, a novel, biophysical interactive system named the “world’s most innovative new musical instrument”. Concert Venue and Time: Necto, Tuesday May 22, 9:00pm
@incollection{nime2012-music-Donnarumma2012, author = {Donnarumma, Marco}, title = {Music for Flesh II, interactive music for enhanced body}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Alexander Dupuis. 2012. Stelaextraction. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Stelaextraction uses the electronic extension capabilities of the Yerbanaut to construct a musical composition through self-reference across different timescales. The Yerbanaut is a custom electro-acoustic kalimba built from a yerba mate gourd, with the tines placed in a circular pattern rather than the usual horizontal arrangement. Its sensors are intended to make use of this new arrangement, with force-sensitive buttons giving the otherwise inert left hand expressive capabilities, and a distance sensor allowing the right hand’s motion to determine aspects of the processing. In Stelaextraction, all acoustic and processed sounds are recorded to a single buffer, the contents of which can be scrubbed through using the right hand’s distance sensor. In this way, past musical gestures can be explored and then re-explored, with the recursive processing developing self-similar musical patterns over the course of the piece. Composer(s) Credits: Alexander Dupuis Instrumentalist(s) Credits: Alexander Dupuis (Yerbanaut) Artist(s) Biography: Alexander Dupuis develops real-time audiovisual feedback systems mediated by performers, sensors, musicians, matrices, bodies, scores, games, and environments. He also composes, arranges and performs sounds for guitars, liturgies, chamber groups, horse duos, microwave cookbooks, and celebrity voices. He graduated from Brown University’s MEME program as an undergraduate in 2010, and is now in his second year of the Digital Musics masters program at Dartmouth College. Concert Venue and Time: Necto, Wednesday May 23, 9:00pm
@incollection{nime2012-music-Dupuis2012, author = {Dupuis, Alexander}, title = {Stelaextraction}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Jenn Figg, Matthew McCormack, and Paul Cox. 2012. Thunderclap For Six Kinetic Light Drums. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: This work merges sound and light to illuminate complex rhythmic motives, polyrhythms and metrical patterns in a visual display generated by three drummers playing six “light” drums. These new instruments bring to life the dreams of 20th century synesthetes such as Wassily Kandinsky, Alexander Scriabin, and others who sought to create an imagined “visual music,” an ideal synthesis of music and visual art. Community Light Beacons are percussion instruments that leverage the potentials of music, analog technology, and human-generated power to visualize sound. These instruments add the dimension of light to the ancient tradition of drumming. The drums are user-powered, and when they are played (banged, hit and tapped), the vibrations from the drumhead are converted to electricity by the internal speaker transducer. The generated energy powers ultra-bright LEDs, which light up with every hit and beam out from the Fresnel lens. Composer(s) Credits: Jenn Figg, Matthew McCormack, Paul Cox Instrumentalist(s) Credits: Ryan Hilty, Samuel Haese, Eric Young (Kinetic Light Drums) Artist(s) Biography: Jenn Figg is an artist investigating the connections between industry, science and art through the transformation of energy, performative objects and constructed ecosystems. She graduated with a BFA in Textiles from the Rhode Island School of Design and an MFA from the University of California at Santa Barbara. She is pursuing her Ph.D. in Media, Art, and Text at Virginia Commonwealth University. She lives in Baltimore and is an Assistant Professor of Art at Towson University in Maryland. Exhibitions include The Print Center in Philadelphia, Pennsylvania, The Art House at the Jones Center in Austin, Texas, Virginia MOCA in Virginia Beach, Virginia, the Columbus Center of Science and Industry in Columbus, Ohio, and the Ingenuity Festival in Cleveland, Ohio. Other awards and residencies include the MacDowell Colony, the Lower Manhattan Cultural Council Residency, the Great Lakes College Association New Directions Initiative, and the University of California Interdisciplinary Humanities Center, Visual, Performing & Media Arts Award. Matthew McCormack explores energy transformation and expression through technology, kinetic sculpture and blown glass. He graduated with a BFA in Glass from The Ohio State University and is now living in Baltimore, Maryland. He is pursuing an Interdisciplinary MFA at Towson University. His research interests include modifying a speaker transducer for optimum energy generation and developing a series of rapid-prototyped Fresnel lens stamps for quartz crystal light instruments. His work has been featured at the Virginia Museum of Contemporary Art in Virginia Beach, Virginia, the Columbus Center of Science and Industry in Columbus, Ohio, the Toledo Museum of Art in Toledo, Ohio, the Rankin Art Gallery at Ferris State University in Big Rapids, Michigan, the National Museum of Glass in Eskisehir, Turkey, the Franklin Park Conservatory in Columbus, Ohio, the Ingenuity Festival in Cleveland, Ohio, and as part of the Lower Manhattan Cultural Council’s Governors Island Residency in New York City. Paul Cox is a scholar, composer and percussionist in Cleveland, Ohio. He currently teaches music history and percussion at Case Western Reserve University (CWRU) and the Oberlin Conservatory of Music, where he is a Visiting Assistant Professor.
He earned a PhD in musicology from CWRU in 2011 after the completion of his dissertation, Collaged Codes: John Cage’s Credo in Us, a study of Cage and Merce Cunningham’s first dance collaboration in 1942. Current projects include composing Just.Are.Same for string quartet, oboe and tape, which weaves together an electronic soundscape of spoken words drawn from victims of genocide with acoustic and electronic sounds; composing an evening-length work for the ensemble NO EXIT, in collaboration with famed world percussionist Jamie Haddad and guitarist Bobby Ferrazza; curating a Cage “Musicircus” for the opening of the new Museum of Contemporary Art in Cleveland; and artistically advising the Sitka Fest in Alaska, a three-month-long festival of arts and culture. Ryan Hilty is a percussionist earning a degree in Music Education from the Case Western Reserve University School of Music in Cleveland, Ohio. He is currently in his second undergraduate year, studying percussion with Matthew Larson. He has performed as a percussionist in numerous ensembles, including the Crestwood Wind Ensemble, Jazz Band, and the Cleveland Youth Wind Symphony. He is the recipient of the 2010 John Philip Sousa Award. After earning his degree in music education, Ryan aspires to become a high school band director. Samuel Haese is a student of music and physics at Case Western Reserve University (CWRU) in Cleveland, OH. He has studied concert percussion with Matthew Bassett, Feza Zweifel, and Matthew Larson, and currently collaborates with Paul Cox in exploring and performing modern percussion music. In the meantime, Sam is pursuing a BA in Music, studying piano with Gerardo Teissonniere through the Cleveland Institute of Music. Sam also intends to receive a degree in Engineering Physics from CWRU, which he hopes will allow him to explore and understand music technologies. Originally from Berkeley, California, his current plans include moving to a sunnier place than Cleveland after graduation within the next two years. Eric Young is a student at Case Western Reserve University majoring in Computer Science and Audio Engineering. He grew up in Kansas City, Missouri. He plans on incorporating his interests into a career developing digital audio software. Eric has been studying general percussion performance since 2003 and specializes in jazz drums. Concert Venue and Time: Necto, Tuesday May 22, 9:00pm
@incollection{nime2012-music-FiggMcCormackCox2012, author = {Figg, Jenn and McCormack, Matthew and Cox, Paul}, title = {Thunderclap For Six Kinetic Light Drums}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Jonathan Golove and Magnus Martensson. 2012. Rachmaninoff-Wilson Medley. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: The most impressive uses of the theremin cello during Theremin’s time in New York are Leopold Stokowski’s inclusion of one in the Philadelphia Orchestra’s low string section and Varese’s composition of two solo parts in Ecuatorial. Even more important, from my perspective, is the fact that the instrument represents the first attempt to harness the human potential to shape and manipulate electronic sound by means of the technical apparatus of the modern player of bowed string instruments. Rachmaninoff’s Vocalise, Op. 34 no. 14, for textless high voice, highlights the hauntingly vocal quality of the theremin cello. Vocalise is the last of a set of 14 songs published in 1912, less than a decade before Theremin’s experiments with musical sounds began to bear fruit. Brian Wilson and the Beach Boys, by virtue of their use of Bob Whitsell’s Electro-Theremin on several recordings, are irrevocably linked to the history of the theremin. Composer(s) Credits: Vocalise, Op.34 no. 14 - Sergei Rachmaninoff Medley (Good Vibrations/God Only Knows) - Brian Wilson Instrumentalist(s) Credits: Jonathan Golove (Theremin cello), Magnus Martensson (piano) Artist(s) Biography: Jonathan Golove, Associate Professor of Music at the University at Buffalo, has been featured as theremin cello soloist with the Asko/Schoenberg Ensemble, London Sinfonietta, and International Contemporary Ensemble; and as cello soloist with the Buffalo Philharmonic Orchestra, Slee Sinfonietta, and New York Virtuoso Singers. He has recorded for the Albany, Centaur, Albuzerque, and Nine Winds labels, and appeared at festivals including the Holland Festival, Festival d’Automne, Lincoln Center Festival, June in Buffalo, and the Festival del Centro Histórico (Mexico City). Golove gave the first performance of Varese’s Ecuatorial using Floyd Engel’s recreation of the legendary early 20th century instrument at the University at Buffalo in 2002. He is also active as an electric cellist, particularly in the field of creative improvised music. An accomplished composer, his works have been performed at venues including the Kennedy Center, Venice Biennale, Festival of Aix-en-Provence, Lincoln Center Chamber Music Society II, and the Kitchen. Magnus Martensson is Music Director of The Scandinavian Chamber Orchestra of New York; between 1996 and 2007 he was Visiting Assistant Professor at SUNY Buffalo and conductor of the Slee Sinfonietta. In 1989, Martensson made his operatic debut in Malmö, Sweden, conducting a production of Offenbach’s Orpheus in the Underworld, and has subsequently conducted operas by Mozart, Puccini, Golove, among others. He has conducted several world premiere recordings, including orchestral music by Jeffrey Stadelman, Roger Reynolds, and David Felder. In the past few seasons Martensson has guest conducted with the New York New Music Ensemble, the Trondheim Soloists, Musica Vitae, ICE, and at the Monday Evening Concert Series (Los Angeles), The Manhattan School of Music, and Teatro San Martin (Buenos Aires). Concert Venue and Time: Lydia Mendelssohn Theatre, Tuesday May 22, 7:00pm
@incollection{nime2012-music-GoloveMartensson2012, author = {Golove, Jonathan and Martensson, Magnus}, title = {Rachmaninoff-Wilson Medley}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Mara Helmuth and Rebecca Danard. 2012. Water Birds. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Water Birds is an interactive and collaborative composition for clarinet, bass clarinet and computer. The sound of the clarinets is processed live by spectral delays with Max/MSP and RTcmix. Space structures the composition, as the particular sound parameters initiated depend on the performer’s location on the stage. The development of the current version of the piece involved a custom wireless infrared sensor network, which responds to the clarinetist’s movements. Currently the piece is performed without the sensor network, but the strategy of that configuration still drives the composition. A score containing five sound-generating ideas, consisting of musical fragments and a Zen poem, allows the performer to improvise, creating his/her own sound pathway through the piece. The pathway is reminiscent of the path of birds in the Zen poem, Dogen’s On the Nondependence of Mind, which reads: “Water birds/going and coming/their traces disappear/but they never/forget their path.” Composer(s) Credits: Mara Helmuth and Rebecca Danard Instrumentalist(s) Credits: Rebecca Danard (B♭ clarinet, bass clarinet), Mara Helmuth (Computer) Artist(s) Biography: Mara Helmuth composes music often involving the computer, and creates multimedia and software for composition and improvisation. Her recordings include Sounding Out! (Everglade, forthcoming 2010), Sound Collaborations (CDCM v.36, Centaur CRC 2903), Implements of Actuation (Electronic Music Foundation EMF 023), and Open Space CD 16, and her work has been performed internationally. She is on the faculty of the College-Conservatory of Music, University of Cincinnati, and director of its Center for Computer Music. She holds a D.M.A. from Columbia University, and earlier degrees from the University of Illinois, Urbana-Champaign. Her software for composition and improvisation has involved granular synthesis, Internet2, and RTcmix instruments. Her writings have appeared in Audible Traces, Analytical Methods of Electroacoustic Music, the Journal of New Music Research and Perspectives of New Music. Installations including Hidden Mountain (2007) were created for the Sino-Nordic Arts Space in Beijing. She is a past president of the International Computer Music Association. Rebecca Danard: Performer, educator, scholar and entrepreneur, Rebecca Danard holds a doctorate in clarinet performance from the University of Cincinnati College-Conservatory of Music. Also an enthusiastic teacher, Rebecca is Adjunct Faculty at Carleton University. She is currently Artistic Director of the Ottawa New Music Creators, a collective of professional composers and performers dedicated to bringing contemporary music to Canada’s capital. Rebecca’s performance career centres on new and experimental music, including interdisciplinary collaborations, working with new technology, organizing events, and commissioning composers. She has worked with filmmakers, dancers, choreographers, actors, poets, lighting designers and visual artists as well as performing musicians and composers. She has been invited to perform at festivals such as Music10 (Hindemith Centre, Switzerland), the Ottawa Chamber Music Festival, the Ottawa Jazz Festival, the Bang on a Can Summer Festival, and the Opera Theatre and Music Festival of Lucca; and at conferences such as Clarinetfest, CLIEC and SEAMUS. Concert Venue and Time: Lydia Mendelssohn Theatre, Monday May 21, 9:00pm
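Editor's sketch: the notes above describe a space-driven mapping in which the performer's stage location selects the spectral-delay parameters applied to the live clarinet sound. The short Python sketch below illustrates that idea only; the zone layout, OSC addresses, port and parameter values are invented for illustration and are not the authors' actual Max/MSP/RTcmix configuration.

# Illustrative only: map a normalized stage position to one of five
# parameter zones (echoing the score's five sound-generating ideas) and
# send the chosen spectral-delay settings to a live-processing patch.
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

ZONES = {
    "front_left":  {"delay_ms": 120,  "feedback": 0.2},
    "front_right": {"delay_ms": 250,  "feedback": 0.35},
    "center":      {"delay_ms": 500,  "feedback": 0.5},
    "back_left":   {"delay_ms": 900,  "feedback": 0.65},
    "back_right":  {"delay_ms": 1500, "feedback": 0.8},
}

def zone_for_position(x, y):
    """Classify a normalized stage position (x, y in 0..1) into a zone."""
    if 0.33 < x < 0.66 and 0.33 < y < 0.66:
        return "center"
    horizontal = "left" if x < 0.5 else "right"
    depth = "front" if y < 0.5 else "back"
    return f"{depth}_{horizontal}"

def send_parameters(client, x, y):
    """Send the zone's spectral-delay settings to the patch (assumed addresses)."""
    params = ZONES[zone_for_position(x, y)]
    client.send_message("/spectraldelay/delay_ms", params["delay_ms"])
    client.send_message("/spectraldelay/feedback", params["feedback"])

if __name__ == "__main__":
    client = SimpleUDPClient("127.0.0.1", 9000)  # assumed patch address/port
    send_parameters(client, 0.2, 0.8)            # performer at back left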
@incollection{nime2012-music-HelmuthDanard2012, author = {Helmuth, Mara and Danard, Rebecca}, title = {Water Birds}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Bill Hsu. 2012. Flue. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Flue is a structured audio-visual improvisation for three musicians, utilizing live acoustic and electronic sound and interactive animations. A physics-based smoke simulation is influenced by the real-time audio from the musicians’ performance. The audio from the performance is analyzed; high-level tempo, spectral and other features are extracted, and sent via Open Sound Control to the animation environment. The smoke trails are also able to coalesce into well-defined symbols and forms, all while moving in a natural-seeming manner consistent with the underlying fluid simulation. Composer(s) Credits: Bill Hsu Instrumentalist(s) Credits: Bill Hsu (electronics, interactive animation), Matt Endahl (piano), Mike Khoury (violin) Artist(s) Biography: Bill Hsu is an Associate Professor of Computer Science at San Francisco State University. He has performed in the US, Asia, and Europe, including NIME 2011 (Oslo), Festival art::archive:architectures (ZKM, Karlsruhe, 2011), SMC 2009 (Porto), Harvestworks Festival 2009 (New York), Fete Quaqua 2008 (London), MIX Festival 2007 and 2009 (New York), NIME 2007 (New York), Stimme+ 2006 (ZKM, Karlsruhe), and the First Hong Kong Improvised Performance Festival 2005. Website: http://userwww.sfsu.edu/ whsu/art.html Matt Endahl (b. 1985) is an improvising pianist based in Ann Arbor, MI. A student of Geri Allen and Ed Sarath at the University of Michigan, Matt is an active performer and organizer, having performed in a wide variety of settings, from Gershwin’s "Rhapsody in Blue" to freeform solo electronic sets. Matt has taught jazz piano at Hillsdale College since 2008. http://www.myspace.com/mattendahl Mike Khoury was born in Mt. Pleasant, Michigan in 1969. As the son of visual artist Sari Khoury, he was exposed to various forms of visual arts and creative musical forms. Khoury is Palestinian. Khoury’s collaborators often include Leyya Tawil (dance), Ben Hall (percussion), Christopher Riggs (guitar), and Andrew Coltrane (sound manipulation). He has performed and recorded with Faruq Z. Bey, Dennis Gonzalez, Luc Houtkamp, Maury Coles, Jack Wright, Graveyards, John Butcher, Gino Robair, Gunda Gottschalk, and Le Quan Ninh. Khoury runs the Entropy Stereo music label where he focuses on issuing new and archival music by challenging artists. His studies include those with John Lindberg, Gerald Cleaver, and composer/violinist David Litven. Khoury is the author of a chapter on Egyptian-American composer Halim El-Dabh in a forthcoming anthology on the Arab avant garde, published by Wesleyan University Press. Website: http://www.myspace.com/michaelkhoury Concert Venue and Time: Lydia Mendelssohn Theatre, Tuesday May 22, 7:00pm
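Editor's sketch: the analysis-to-animation path described above (extract audio features, forward them over Open Sound Control) can be illustrated with a few lines of Python. The feature choice, OSC addresses and port below are assumptions for illustration, not the actual Flue pipeline.

# Minimal sketch: compute coarse features on an audio buffer and forward
# them via OSC to an animation environment (assumed to listen on port 7400).
import numpy as np
import librosa
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

def analyze_and_send(audio_block, sr, client):
    """Send spectral centroid, RMS level and an estimated tempo over OSC."""
    centroid = float(np.mean(librosa.feature.spectral_centroid(y=audio_block, sr=sr)))
    rms = float(np.mean(librosa.feature.rms(y=audio_block)))
    tempo, _ = librosa.beat.beat_track(y=audio_block, sr=sr)
    tempo = float(np.atleast_1d(tempo)[0])
    client.send_message("/flue/centroid", centroid)
    client.send_message("/flue/rms", rms)
    client.send_message("/flue/tempo", tempo)

if __name__ == "__main__":
    client = SimpleUDPClient("127.0.0.1", 7400)   # animation host/port (assumed)
    sr = 22050
    test_signal = librosa.chirp(fmin=110, fmax=880, sr=sr, duration=5.0)  # stand-in for live input
    analyze_and_send(test_signal, sr, client)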
@incollection{nime2012-music-Hsu2012, author = {Hsu, Bill}, title = {Flue}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Mari Kimura and Tomoyuki Kato. 2012. Eigenspace. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Eigenspace (2011) is a collaborative project with Japan’s leading visual artist in new media, Tomoyuki Kato (Movie Director), with Yoshito Onishi (Image Programming), and Chisako Hasegawa (Producer). As Japanese artists, we were deeply touched by the Fukushima nuclear meltdown, the worst man-made catastrophe in the history of humankind, which remains uncontained today and continues to contaminate the globe. Eigenspace is about our love and prayer for humankind and our planet, and for the next generation. The name is also taken from “eigenvalue,” a mathematical quantity used in analyzing the bowing movement, which interacts in real time with Mr. Kato’s software. The musical expression is extracted by IRCAM’s “Augmented Violin” and their newest motion sensor “mini-MO”, custom-fit into a glove designed by Mark Salinas. Special thanks to the Real Time Musical Interactive Team at IRCAM. Eigenspace was commissioned by Harvestworks, and premiered at Roulette in Brooklyn on October 9th, 2011. Composer(s) Credits: Tomoyuki Kato (Movie Director), with Yoshito Onishi (Image Programming), and Chisako Hasegawa (Producer) Instrumentalist(s) Credits: Mari Kimura (violin), Tomoyuki Kato (Interactive graphics) Artist(s) Biography: Mari Kimura: Violinist/composer Mari Kimura is widely admired as the inventor of “Subharmonics” and for her works for interactive computer music. As a composer, Mari has received commissions from the International Computer Music Association, Harvestworks, and Music from Japan, and grants including NYFA, Arts International, Meet The Composer, the Japan Foundation, the Argosy Foundation, and NYSCA. In 2010 Mari won the Guggenheim Fellowship in Composition and was invited as Composer-in-Residence at IRCAM in Paris. In October 2011, the Cassatt String Quartet premiered Mari’s “I-Quadrifoglo”, her string quartet with interactive computer, at Symphony Space in NYC, commissioned through a Fromm Commission Award. Feature articles in the past year include the New York Times (May 13th, written by Matthew Gurewitsch) and Scientific American (May 31st, written by Larry Greenemeier). Mari’s CD, The World Below G and Beyond, features her Subharmonics works and interactive computer music. Mari teaches a course in Interactive Computer Performance at Juilliard. http://www.marikimura.com Tomoyuki Kato is a renowned Japanese visual artist/movie director who works on a wide range of high-tech projects including advertisements, commercials, museum exhibitions and theme parks. Kato’s work is known for its superb quality, high impact, originality and new technical methods. Recently, Kato has been active in creating corporate future visions, such as “concept cars,” incorporating live action, computer graphics and animation on a project basis; his recent exhibitions include the 2010 Shanghai Expo. His highly acclaimed “Grand Odyssey,” created for the 2005 Aichi Expo’s Toshiba/Mitsui pavilion, is now displayed at Nagasaki’s Huis Ten Bosch theme park. In 2010, Kato created “Better Life from Japan,” an exhibit for the Otsuka Pharmaceutical company at the Shanghai Expo, using a 360-degree display. Kato has received and been nominated for numerous awards at international and national festivals, including the Japan Ministry of Culture Media Arts Festival, the Los Angeles International Short Film Festival, the Montreal International Film Festival and the London International Advertising Festival. Concert Venue and Time: Lydia Mendelssohn Theatre, Tuesday May 22, 7:00pm
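Editor's sketch: one common way eigenvalues enter gesture analysis is through an eigen-decomposition (PCA) of motion-sensor data, which yields the dominant direction and spread of a bow stroke. The sketch below is a generic illustration of that idea under those assumptions; it is not the analysis performed by IRCAM's "Augmented Violin" system.

# Generic example: eigen-analysis of 3-axis accelerometer data for one gesture.
import numpy as np

def bowing_eigenanalysis(samples):
    """samples: (N, 3) array of x/y/z accelerometer readings for one gesture."""
    centered = samples - samples.mean(axis=0)
    cov = np.cov(centered, rowvar=False)          # 3x3 covariance of the motion
    eigenvalues, eigenvectors = np.linalg.eigh(cov)
    order = np.argsort(eigenvalues)[::-1]         # largest eigenvalue first
    return eigenvalues[order], eigenvectors[:, order]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "bow stroke": strong motion along one axis plus sensor noise.
    stroke = np.outer(np.sin(np.linspace(0, np.pi, 200)), [1.0, 0.2, 0.05])
    data = stroke + 0.01 * rng.standard_normal((200, 3))
    values, vectors = bowing_eigenanalysis(data)
    print("eigenvalues:", values)           # energy along each principal axis
    print("dominant direction:", vectors[:, 0])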
@incollection{nime2012-music-KimuraKato2012, author = {Kimura, Mari and Kato, Tomoyuki}, title = {Eigenspace}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Bongjun Kim and Woon Seung Yeo. 2012. Where Are You Standing? In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Where Are You Standing? (2012) is a collaborative mobile music piece using the digital compass on mobile phones as an intuitive, interactive musical instrument. The piece features performers on stage making sound by aiming at other performers: the compass-measured orientation of each aiming gesture is mapped to a specific musical note depending on which player is aimed at, and is visualized on screen in real time. The piece begins with three performers playing “harmonic” sounds by taking aim at each other. This consonance is broken by the introduction of the fourth performer, who represents conflict: the notes played by this performer, as well as the notes played by others when they aim at this performer, are dissonant, creating musical tension. Finally, the last performer leaves the stage to resolve the tension, and the piece ends with three performers back in congruity. Composer(s) Credits: Bongjun Kim, Woon Seung Yeo Instrumentalist(s) Credits: Bongjun Kim (operator), Woon Seung Yeo, Jeong-seob Lee, Seunghun Kim, Xuelian Yu (iPhones) Artist(s) Biography: Bongjun Kim (b. 1981, Seoul, Korea) is a Masters student at the Korea Advanced Institute of Science and Technology (KAIST) and a member of the Audio and Interactive Media (AIM) Lab at the Graduate School of Culture Technology (GSCT), KAIST. Kim received his B.S. and M.S. degrees in Industrial and Information Systems Engineering from Ajou University, and has also worked at Doosan Infracore as an R&D researcher. He is also a composer, performer, and system developer of the KAIST Mobile Phone Orchestra (KAMPO), where he has designed interactive mobile music performance systems and composed the piece “Where Are You Standing?”, which features digital compass-based interaction. Currently his research interests are algorithmic composition, music informatics, machine improvisation, and mobile media as a new musical interface. Woon Seung Yeo is a bassist, media artist, and computer music researcher/educator. He is Assistant Professor at the Korea Advanced Institute of Science and Technology (KAIST) and leads the Audio and Interactive Media (AIM) Lab and the KAIST Mobile Phone Orchestra (KAMPO). Yeo has received degrees from Seoul National University (B.S. and M.S. in Electrical Engineering), the University of California at Santa Barbara (M.S. in Media Arts and Technology), and Stanford University (M.A. and Ph.D. in Music). His research interests include digital audio signal processing, musical acoustics, audiovisual art, cross-modal display, physical interaction for music, musical interfaces, mobile media for music, and innovative performance paradigms. Yeo has also curated/produced mobile music concerts, telematic music concerts, and multimedia installations and exhibitions. Jeong-seob Lee is a Ph.D. student at the Graduate School of Culture Technology (GSCT), KAIST, Korea, and a research member of the Audio & Interactive Media Lab. He received his M.S. degree from the same institute, and his undergraduate degree in mechanical engineering from Seoul National University. As an amateur dancer and choreographer, he is interested in various performances involving dance. His experiences on stage and in engineering have led him to conduct research in interactive performance paradigms and multimedia interface technology. He has produced a number of new media performances through collaborations with dancers and musicians, and has worked as an audiovisual interaction designer. He is also interested in acoustic motion detection with off-the-shelf audio devices. Seunghun Kim is a Ph.D. candidate at KAIST and a member of the Audio and Interactive Media (AIM) Lab in the Graduate School of Culture Technology (GSCT). He received his B.S. degree in electrical and communications engineering from the Information and Communications University (ICU), and wrote his Master’s thesis on sound synthesis of the geomungo (a traditional Korean stringed instrument) at KAIST. He has presented several papers on musical interfaces at domestic and international conferences, including the International Conference on New Interfaces for Musical Expression (NIME) and the International Computer Music Conference (ICMC). In addition, he has participated in the development of interactive installations, which were exhibited at the Incheon International Digital Art Festival (INDAF), KT&G SangSang Madang, the Gwangju Design Biennale, and the Seoul Digital Media Content International Festival. He is also a member of the KAIST Mobile Phone Orchestra (KAMPO). Xuelian Yu was born and raised in China and earned her B.S. in engineering in Jiangnan University’s Digital Media Technology program. She joined the Audio and Interactive Media (AIM) Lab at the Graduate School of Culture Technology (GSCT), KAIST in the Fall of 2010, combining her problem-solving skills and creative abilities to build worlds in which people become characters and interact with their surroundings. Xuelian is currently at the Entertainment Technology Center of Carnegie Mellon University in Pittsburgh, gaining experience in projects that produce artifacts intended to entertain, inspire or affect the participants, while also researching the comparison of descriptions of surround sound. Her passion for learning and expanding her experience has strengthened her goal to work in interactive design. Concert Venue and Time: Lydia Mendelssohn Theatre, Tuesday May 22, 7:00pm
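Editor's sketch: the aiming-to-note mapping described above (compass heading matched to the bearing of a co-performer, who has an assigned note) can be reconstructed hypothetically as follows. Stage positions, note assignments and the angular tolerance are invented for illustration and are not the composers' actual implementation.

# Hypothetical reconstruction: each performer has an assigned MIDI note; the
# phone's compass heading is matched to the nearest co-performer's bearing.
import math

PERFORMERS = {
    "A": {"pos": (0.0, 0.0),  "note": 60},  # C
    "B": {"pos": (4.0, 0.0),  "note": 64},  # E
    "C": {"pos": (2.0, 3.0),  "note": 67},  # G
    "D": {"pos": (2.0, -3.0), "note": 61},  # C sharp: the "tension" performer
}

def bearing(src, dst):
    """Compass-style bearing in degrees from src to dst (0 = +y, clockwise)."""
    dx, dy = dst[0] - src[0], dst[1] - src[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def note_for_heading(player, heading, tolerance=20.0):
    """Return the note of the performer the player is aiming at, or None."""
    src = PERFORMERS[player]["pos"]
    best, best_err = None, tolerance
    for name, info in PERFORMERS.items():
        if name == player:
            continue
        err = abs((bearing(src, info["pos"]) - heading + 180.0) % 360.0 - 180.0)
        if err <= best_err:
            best, best_err = info["note"], err
    return best

if __name__ == "__main__":
    # Performer A's phone reports a heading of 90 degrees:
    print(note_for_heading("A", 90.0))   # -> 64, i.e. aiming at performer B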
@incollection{nime2012-music-KimYeo2012, author = {Kim, Bongjun and Yeo, Woon Seung}, title = {Where Are You Standing?}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Hans Leeuw and Diemo Schwarz. 2012. Violent Dreams. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Two typical NIME-related inventions meet in this performance. IRCAM-based Diemo Schwarz and HKU lecturer and Electrumpet player Hans Leeuw met at STEIM in 2010. The extreme sound possibilities of the sensor-driven Electrumpet combine wonderfully with the corpus-based techniques in CataRT. Both Diemo and Hans have played their self-invented instruments for a number of years, over which they have made several iterations and extensions and built up a great deal of performance experience. This experience pays off in the expressive capabilities of both performers, making this a concert that goes far beyond an extended demonstration of new instruments. In Violent Dreams, Hans’s manipulated sounds are recorded into CataRT, from which Diemo chooses specific sonic characters and evolutions via gestural controllers; these are played back and transformed by CataRT, challenging Hans to come up with more extreme sounds surpassing his own originals. Thus we get an interesting and challenging improvisation battle between two players who both fully master their instruments. Composer(s) Credits: Instrumentalist(s) Credits: Hans Leeuw (Electrumpet), Diemo Schwarz (CataRT, gestural controllers) Artist(s) Biography: Hans Leeuw was recognized as one of Holland’s top players, composers and bandleaders in the jazz and improvised music scene even before he started to use electronics and designed his own Electrumpet. He is most noted as the bandleader of the Dutch formation Tetzepi, a 14-piece big band that has existed for 15 years and is structurally funded by the Dutch government. Next to his activities as a performer, Hans teaches at the Utrecht School of the Arts in the Music Technology department and at the Industrial Design faculty of the Eindhoven University of Technology, where he coaches projects on the design of new musical instruments. In 2008 he designed the Electrumpet, a hybrid electroacoustic instrument that differs from similar designs in that it takes the trumpet player’s normal playing position and expression into account, thus creating an instrument that combines acoustic and electronic expression seamlessly (see ‘the electrumpet, additions and revisions’). Diemo Schwarz is a researcher and developer at Ircam, composer of electronic music, and musician on drums and laptop. He holds a PhD in computer science applied to music for his research on corpus-based concatenative musical sound synthesis. His compositions and live performances, under the name of his solo project Mean Time Between Failure, or improvising with musicians such as Frédéric Blondy, Victoria Johnson, Pierre Alexandre Tremblay, Etienne Brunet, Luka Juhart, George Lewis, and Evan Parker, explore the possibilities of corpus-based concatenative synthesis to re-contextualise any sound source by rearranging sound units into a new musical framework, using interactive navigation through a sound space controlled by gestural input devices. His research work includes improving interaction between musician and computer, and exploiting large masses of sound for interactive real-time sound synthesis, collaborating with composers such as Philippe Manoury, Dai Fujikura, Stefano Gervasoni, Aaron Einbond, and Sam Britton. Concert Venue and Time: Lydia Mendelssohn Theatre, Monday May 21, 9:00pm
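Editor's sketch: the selection step at the heart of corpus-based concatenative synthesis (recorded sound units indexed by audio descriptors, with a controller position picking the nearest unit for playback) is illustrated below in generic form. The descriptors and unit names are toy values; this is not CataRT's actual implementation.

# Generic nearest-neighbour unit selection in a 2-D descriptor space.
import numpy as np

class Corpus:
    def __init__(self):
        self.descriptors = []   # e.g. (loudness, spectral centroid) per unit
        self.units = []         # references to the recorded audio segments

    def add_unit(self, descriptor, unit_id):
        self.descriptors.append(descriptor)
        self.units.append(unit_id)

    def select(self, target):
        """Return the unit whose descriptors are closest to the target point."""
        d = np.asarray(self.descriptors, dtype=float)
        t = np.asarray(target, dtype=float)
        index = int(np.argmin(np.linalg.norm(d - t, axis=1)))
        return self.units[index]

if __name__ == "__main__":
    corpus = Corpus()
    # Units recorded live from the trumpet, tagged with toy descriptors.
    corpus.add_unit((0.2, 500.0), "breathy_pad")
    corpus.add_unit((0.8, 3200.0), "overblown_shriek")
    corpus.add_unit((0.5, 1500.0), "muted_growl")
    # A gestural controller sweeps the descriptor space; loud + bright target:
    print(corpus.select((0.9, 3000.0)))   # -> "overblown_shriek"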
@incollection{nime2012-music-LeeuwSchwarz2012, author = {Leeuw, Hans and Schwarz, Diemo}, title = {Violent Dreams}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Levy Lorenzo. 2012. Modified Attack. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: When designing my new electronic instruments, I always keep in mind the relationship between the instrument and performer as a tool and its master. The instrument should be a channel by which the performer can access the dimensions of sound in order to attempt to make music. The music should then originate from the musician’s intention and not the instrument itself. Thus, I design my instruments as intuitive, transparent, and non-idiosyncratic mappings between physical gesture and sound. This new electronic instrument remaps a Logitech Attack 3 joystick to control sound. Through the joystick, the performer can control the volume, rhythm, repetition, and pitch of custom, preprogrammed sounds. Additionally, the joystick can be used to record and play back short audio loops. The product of this design allows for agile and intentional electronic musical gestures where rhythm, volume, and pitch are clear and deliberate. I have been able to reach a wide range of musical expression and I am learning and discovering more as I practice MODIFIED ATTACK. Composer(s) Credits: Levy Lorenzo Instrumentalist(s) Credits: Levy Lorenzo Artist(s) Biography: Levy Lorenzo is an electronics engineer and percussionist living in New York. Specializing in microcontroller-based instrument design, he performs experimental, live-electronic & acoustic music using new, custom electronic musical instruments and percussion. His work has been featured at STEIM in Amsterdam (NL), the Darmstadt School for New Music (DE) and the International Ensemble Moderne Academy (AU). Currently, Levy is a Live Sound Engineer for the International Contemporary Ensemble and Issue Project Room (Brooklyn, NY). Levy holds B.S. and M.Eng. degrees in Electrical & Computer Engineering from Cornell University as well as an M.M. degree in Percussion Performance from Stony Brook University, where he is currently a D.M.A. candidate. [www.levylorenzo.com] Concert Venue and Time: Necto, Tuesday May 22, 9:00pm
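Editor's sketch: a joystick-to-sound mapping of the kind described above can be prototyped with pygame's joystick module, as below. The axis/button assignments, ranges and parameter names are assumptions for illustration; the actual MODIFIED ATTACK mapping and synthesis are not reproduced here.

# Hedged sketch: read stick axes/buttons and map them to control values that
# a synthesizer could consume (here they are just printed).
import pygame

def read_mapping(joystick):
    """Map raw joystick state to musical control values (illustrative)."""
    pygame.event.pump()                        # refresh device state
    x = joystick.get_axis(0)                   # left/right: pitch bend
    y = joystick.get_axis(1)                   # forward/back: volume
    throttle = joystick.get_axis(2)            # throttle: repetition rate
    return {
        "volume": max(0.0, (1.0 - y) / 2.0),               # 0..1
        "pitch_semitones": round(x * 12),                   # -12..+12
        "repeat_hz": 1.0 + 9.0 * (1.0 - throttle) / 2.0,    # 1..10 Hz
        "record_loop": bool(joystick.get_button(1)),        # loop-record trigger
    }

if __name__ == "__main__":
    pygame.init()
    pygame.joystick.init()
    if pygame.joystick.get_count() == 0:
        raise SystemExit("No joystick connected")
    stick = pygame.joystick.Joystick(0)
    print(read_mapping(stick))   # a real version would feed these to a synth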
@incollection{nime2012-music-Lorenzo2012, author = {Lorenzo, Levy}, title = {Modified Attack}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Martin Marier. 2012. Clarinet (Albino Butterfly). In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Clarinet is the third piece in a series of monotimbral works. Like its siblings Piano and Cymbal, it was inspired by the sound qualities of an acoustic instrument. This minimalist and meditative piece is a structured improvisation performed on the sponge, a musical interface designed by the composer. The sponge is basically a cushion equipped with sensors (accelerometers, buttons and force-sensing resistors) which detect when it is squeezed, twisted or shaken. Because the sponge evolves continuously, the piece exists in many versions. Each new version drifts further away from the original compositional intentions and the piece is slowly becoming less meditative. The latest version is subtitled Albino Butterfly. Composer(s) Credits: Martin Marier Instrumentalist(s) Credits: Martin Marier (Sponge) Artist(s) Biography: Martin Marier is a composer and a performer who is mainly interested in live electronic music using new interfaces. He is the inventor of the sponge, a cushion-like musical interface that he uses to perform his pieces. The main goal of this approach is to establish a natural link between gesture and sound in electronic music. He aims at improving the interaction with the audience and at making the process of composing more playful. His research on the sponge is the topic of the doctorate he is pursuing at the Université de Montréal under the supervision of Prof. Jean Piché. He was also supervised by Dr. Garth Paine during an exchange at the University of Western Sydney (Australia) in 2011. Martin has also composed music for theatre, collaborating mostly with the Théâtre I.N.K. company, for whom he wrote the music of three plays: "L’effet du temps sur Matévina" (2012), "Roche, papier... couteau" (2007), and "La cadette" (2006). He sometimes writes music for films and collaborates with the film composer Benoit Charest. He is one of the founders of Point d’écoute (PDE), a collective whose purpose is to promote electroacoustic music. Along with his four colleagues of PDE, he has produced concerts in Montreal, New York and Sydney. Concert Venue and Time: Lydia Mendelssohn Theatre, Monday May 21, 9:00pm
@incollection{nime2012-music-Marier2012, author = {Marier, Martin}, title = {Clarinet (Albino Butterfly)}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Roberto Morales-Manzanares. 2012. Desamor I. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Desamor I is inspired by a model of meditation where primordial awareness, or naturally occurring timeless awareness, is seen as the result of a conversation with my wife Alejandra. This work is for piano, computer and two Wii controllers attached to my forearms. The output is in 4 channels. The gestures of the pianist (movement, timbre and dynamics) are captured in real time via 2 microphones and a set of 2 Wii controllers. The computer languages involved in the development of the project were Escamol, a Prolog environment for algorithmic composition designed by the composer, and SuperCollider. In this piece I share my experience as a performer-composer working within multi-platform programming environments involving signal processing and machine learning techniques. Composer(s) Credits: Roberto Morales-Manzanares Instrumentalist(s) Credits: Roberto Morales-Manzanares (piano, percussion and electronics) Artist(s) Biography: Roberto Morales-Manzanares: Born in Mexico City, Roberto Morales-Manzanares started his musical training in national folkloric music and learned how to play harps and different kinds of guitars and flutes from several regions of the country. His doctorate in music composition was completed at UC Berkeley in 2006. As a composer, he has written music for theater, dance, movies, TV and radio. As an interpreter, Morales-Manzanares has participated on his own and with other composers in forums of jazz, popular and new music, including tours to Europe, the United States and Latin America. As a researcher, he has been invited to various national and international conferences, such as the ICMC, the International Joint Conference on Artificial Intelligence (IJCAI) and the Symposium on Arts and Technology, and has several publications. Currently he is a member of the “Sistema Nacional de Creadores”. His music can be found on ICMC recordings, on the Victo label www.victo.qc.ca (Leyendas, in collaboration with Mari Kimura) and on Irradia/Pocoscocodrilos. Concert Venue and Time: Lydia Mendelssohn Theatre, Tuesday May 22, 7:00pm
@incollection{nime2012-music-Morales-Manzanares2012, author = {Morales-Manzanares, Roberto}, title = {Desamor I}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Alberto Novello. 2012. Fragmentation. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: In this piece we explore the personality of the “post-modern man”. Exposed to aggressive stimulation and overwhelming data streams, he must make important choices to follow a rational “mind path” while his time quickly runs out. The performer, impersonating the post-modern man, wears an electro-encephalographic headset that detects his mind activity. The analysis of its output reveals the power of the performer’s three thoughts, which are connected to moving forward, turning left, and turning right in the virtual maze projected on a screen. Despite the distracting external forces, embodied by the sound and flickering visuals, the performer must remain paradoxically calm to generate the correct states of mind that let him navigate his way out of the maze. Every time the performer crosses a red boundary in the maze, he gets closer to the exit, and a new stochastic musical scene is triggered. The timing and structure of the composition are thus entirely determined by the choices and concentration of the performer. Composer(s) Credits: Alberto Novello Instrumentalist(s) Credits: Alberto Novello (music, EEG analysis, top visuals), Emmanuel Elias Flores (frontal visuals), Honza Svasek (Butoh, EEG control), E. McKinney (photography) Artist(s) Biography: Alberto Novello a.k.a. JesterN studied piano and double bass at the Conservatory of Udine and graduated in Physics from the University of Trieste. In 2004 he completed the master’s programme “Art, Science and Technologies” at the Institut National Polytechnique of Grenoble, France, under the guidance of J.C. Risset and C. Cadoz. He has taught electronic music composition at the Conservatory of Cuneo, Italy. From 2004 to 2009 he worked at Philips Research, Eindhoven, Netherlands, in the field of Music Perception and Music Information Retrieval, with several publications in international conferences and journals. In 2009 he received a PhD degree from the Technische Universiteit Eindhoven. He attended the Master of Sonology under the guidance of Paul Berg, Joel Ryan, and Richard Barrett. Since 2004 he has produced several electronic audiovisual pieces, assisting, among others, Alvin Lucier, Trevor Wishart, and Butch Morris. His pieces can be found on his website: http://dindisalvadi.free.fr/. Honza Svasek was born in 1954 in the Netherlands. After his studies he moved to Copenhagen, where he became a graphic designer. He then worked as a computer professional and became a UNIX/Linux expert. At present he is a visual artist and performer. Honza started his research into Butoh 5 years ago. He has studied with Butoh performers such as Itto Morita, Atsushi Takenouchi, Ken May, Yumiko Yoshioka, Yuko Ota, and Imre Thormann. Currently he is studying with Rhizome Lee at the Himalaya Subbody Butoh School. http://Honz.nl Emmanuel Elias Flores is a media designer and software artist based in the Netherlands. He studied music and cinema in Mexico and Sonology at the Royal Conservatory in The Hague (NL). His work is centered around the idea of exploring different types of cinematic experiences and the enhancement of new narrative forms which bridge technology, art and perception. His work has been presented in a wide range of formats: from audiovisual pieces for electronic music, opera, dance and live cinema sets, to the design of public installations and interactive applications for mobile devices. In parallel to his creative activities he has worked as a developer and IT/video consultant for different commercial and art enterprises and as a programmer for portable devices. www.emmanuelflores.net Concert Venue and Time: Lydia Mendelssohn Theatre, Wednesday May 23, 7:00pm
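Editor's sketch: the control logic described above (three "thought power" values steering a maze, with each boundary crossing triggering a new stochastic musical scene) can be summarized in a few lines. The thresholds, maze representation and scene names below are invented for illustration; the actual EEG analysis is not shown.

# Simplified illustration of mapping mental-command powers to maze moves.
import random

COMMANDS = ("forward", "turn_left", "turn_right")

def choose_command(powers, threshold=0.6):
    """powers: dict command -> detected power in 0..1. Require a clear winner."""
    best = max(COMMANDS, key=lambda c: powers.get(c, 0.0))
    return best if powers.get(best, 0.0) >= threshold else None

def step(state, command, boundaries):
    """Advance through the maze; return a scene trigger if a red boundary is crossed."""
    if command == "turn_left":
        state["heading"] = (state["heading"] - 90) % 360
    elif command == "turn_right":
        state["heading"] = (state["heading"] + 90) % 360
    elif command == "forward":
        state["cell"] += 1
        if state["cell"] in boundaries:
            return f"scene_{random.randint(1, 8)}"   # stochastic musical scene
    return None

if __name__ == "__main__":
    state = {"cell": 0, "heading": 0}
    boundaries = {3, 7, 12}
    powers = {"forward": 0.8, "turn_left": 0.1, "turn_right": 0.2}
    for _ in range(4):
        scene = step(state, choose_command(powers), boundaries)
        if scene:
            print("crossed boundary, triggering", scene)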
@incollection{nime2012-music-Novello2012, author = {Novello, Alberto}, title = {Fragmentation}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Gascia Ouzounian, R. Benjamin Knapp, Eric Lyon, and R. Luke DuBois. 2012. Music for Sleeping & Waking Minds. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Music for Sleeping & Waking Minds (2011-2012) is an overnight event in which four performers fall asleep while wearing custom designed EEG sensors, which monitor their brainwave activity over the course of one night. The data gathered from the EEG sensors is applied in real time to different audio and image signal processing functions, resulting in a continuously evolving multi-channel sound environment and visual projection. This material serves as an audiovisual description of the individual and collective neurophysiological state of the ensemble, with sounds and images evolving according to changes in brainwave activity. Audiences, who are invited to bring anything that they need to ensure comfortable sleep, can experience the work in different states of attention: while alert and sleeping, resting and awakening. Composer(s) Credits: Gascia Ouzounian (composition & production), R. Benjamin Knapp (physiological interface & interaction design), Eric Lyon (audio interface & interaction design), R. Luke DuBois (visual interface & interaction design) Instrumentalist(s) Credits: Artist(s) Biography: Gascia Ouzounian is a violinist, musicologist, and composer. She has performed with such varied ensembles as Yo-Yo Ma and the Silk Road Ensemble at Carnegie Hall, Bang On A Can All-Stars at the Mass MOCA, Sinfonia Toronto at the Toronto Centre for the Arts, and the Theatre of Eternal Music Strings Ensemble at the Dream House. Gascia’s recent projects include two compositions that are intended for overnight listening: EDEN EDEN EDEN with filmmaker Chloe Griffin, and Music for Sleeping & Waking Minds with R. Benjamin Knapp, Eric Lyon and R. Luke DuBois. In the latter, an ensemble of sleeping performers generates an audiovisual environment through their neurophysiological activity over the course of one night. Gascia teaches at Queen’s University Belfast, where she leads the performance programme in the School of Creative Arts. Her writings on experimental music and sound art appear in numerous academic journals and the book Paul DeMarinis: Buried in Noise. R. Benjamin Knapp is the founding director of the Institute for Creativity, Arts, and Technology at Virginia Tech, where he is Professor of Computer Science. Ben has led the Music, Sensors and Emotion (MuSE) group, whose research focuses on the understanding and measurement of the physical gestures and emotional states of musical performers and their audience. For over 20 years, Ben has been researching and developing user-interfaces and software that enable composers and performers to augment the physical control of a musical instrument with more direct neural interaction. From the invention of the Biomuse with Hugh Lusted in 1987 to the introduction of the concept of an Integral Music Controller (a generic class of controllers that use the direct measurement of motion and emotion to augment traditional methods of musical instrument control) in 2005, Ben has focused on creating a user-aware interface based on the acquisition and real-time analysis of biometric signals. Eric Lyon is a composer and computer music researcher.
During the 1980s and 1990s, his fixed media computer music focused on spectral and algorithmic processing of audio, with a tendency toward extreme modifications of samples, variously sourced. From the early 1990s, Lyon became involved with live computer music, performing solo, and in the Japanese band Psychedelic Bumpo, with the Kyma system. Later in the 1990s, he gravitated toward software-based live processing, starting to develop Max/MSP externals in 1999. This work resulted in his LyonPotpourri collection of Max/MSP externals, and the FFTease spectral package, developed in collaboration with Christopher Penrose. In recent years, Lyon has focused on computer chamber music, which integrates live, iterative DSP strategies into the creation of traditionally notated instrumental scores. Other interests include spatial orchestration, and articulated noise composition. Lyon teaches computer music in the School of Music and Sonic Art at Queen’s University Belfast. R. Luke DuBois is a composer, artist, and performer who explores the temporal, verbal, and visual structures of cultural and personal ephemera. He has collaborated on interactive performance, installation, and music production work with many artists and organizations including Toni Dove, Matthew Ritchie, Todd Reynolds, Jamie Jewett, Bora Yoon, Michael Joaquin Grey, Elliott Sharp, Michael Gordon, Maya Lin, Bang on a Can, Engine27, Harvestworks, and LEMUR, and was the director of the Princeton Laptop Orchestra for its 2007 season. Stemming from his investigations of “time-lapse phonography,” his recent work is a sonic and encyclopedic relative to time-lapse photography. Just as a long camera exposure fuses motion into a single image, his work reveals the average sonority, visual language, and vocabulary in music, film, text, or cultural information. He teaches at the Brooklyn Experimental Media Center at the Polytechnic Institute of NYU, and is on the Board of Directors of Issue Project Room. Concert Venue and Time: North Quad Space 2435, Monday May 21, 11:00pm
@incollection{nime2012-music-OuzounianKnappLyonDuBois2012, author = {Ouzounian, Gascia and Knapp, R.~Benjamin and Lyon, Eric and DuBois, R.~Luke}, title = {Music for Sleeping \& Waking Minds}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Kevin Patton and Butch Rovan. 2012. the ellipsis catalog. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: the ellipsis catalog features new instruments designed by Kevin Patton and Butch Rovan. Patton’s instrument, the “Fossil”, is a wireless sensor-based musical instrument that is played with the entire gestural range of arm movement as well as finger pressure. Four FSRs, a momentary button, and a two-dimensional accelerometer are used to control the parameters of a custom software environment built in Max/MSP/Jitter. It is part of a group of four hand-carved wood instruments called the Digital Poplar Consort. Rovan’s “Banshee” is an analog electronic musical instrument. Modeled after a wind instrument, the design uses six finger pads to control the pitch of an array of interrelated oscillators, and a mouth sensor that allows the performer to control volume. The Banshee also features a tilt-sensor that allows motion to change the voicing circuitry and resulting timbre. Battery-powered, the instrument can plug into any amplifier or mixing console, much like an electric guitar. Composer(s) Credits: Instrumentalist(s) Credits: Kevin Patton (Fossil), Butch Rovan (Banshee) Artist(s) Biography: Kevin Patton is a musician, scholar, and technologist active in the fields of experimental music and multimedia theatre, whose work explores the intersection of technology and performance. The design of new musical instruments as well as interfaces and computer systems for analysis, improvisation, installation and projection is at the center of his practice. His collaboration with visual artist Maria del Carmen Montoya was recognized with the 2009 Rhizome commission for the piece I Sky You. Patton is an assistant professor of music and performance technologies at Oregon State University. He holds a Ph.D. and M.A. from Brown University in electronic music and multimedia composition. He also holds a Master of Music degree in jazz studies and composition from the University of North Texas. He was an Invited Researcher at the Sorbonne, University of Paris IV, for the Spring of 2009. Butch Rovan is a media artist and performer at Brown University, where he co-directs MEME (Multimedia & Electronic Music Experiments @ Brown). Rovan has received prizes from the Bourges International Electroacoustic Music Competition and the Berlin Transmediale International Media Arts Festival, and his work has appeared throughout Europe and the U.S. Most recently his interactive installation Let us imagine a straight line was featured in the 14th WRO International Media Art Biennale, Poland. Rovan’s research includes new sensor hardware design and wireless microcontroller systems. His research into gestural control and interactivity has been featured in IRCAM’s journal Resonance, Electronic Musician, the Computer Music Journal, the Japanese magazine SoundArts, the CD-ROM Trends in Gestural Control of Music (IRCAM 2000), and in the book Mapping Landscapes for Performance as Research: Scholarly Acts and Creative Cartographies (Palgrave Macmillan, 2009). Concert Venue and Time: Lydia Mendelssohn Theatre, Monday May 21, 9:00pm
@incollection{nime2012-music-PattonRovan2012, author = {Patton, Kevin and Rovan, Butch}, title = {the ellipsis catalog}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Isaac Schankler, Alexandre François, and Elaine Chew. 2012. Mimi: Multi-modal Interaction for Musical Improvisation. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Mimi, designed by Alexandre François with input from Elaine Chew and Isaac Schankler, is a multi-modal interactive musical improvisation system that explores the impact of visual feedback in performer-machine interaction. The Mimi system enables the performer to experiment with a unique blend of improvisation-like on-the-fly invention, composition-like planning and choreography, and expressive performance. Mimi’s improvisations are created through a factor oracle. The visual interface gives the performer and the audience instantaneous and continuous information on the state of the oracle, its recombination strategy, the music to come, and that recently played. The performer controls when the system starts, stops, and learns, the playback volume, and the recombination rate. Mimi is not only an effective improvisation partner, it also provides a platform through which to interrogate the mental models necessary for successful improvisation. This performance also features custom synths and mechanisms for inter-oracle interaction created for Mimi by Isaac Schankler. Composer(s) Credits: Isaac Schankler, Alexandre François, Elaine Chew Instrumentalist(s) Credits: Isaac Schankler (keyboard & electronics), Mimi (keyboard & electronics) Artist(s) Biography: Isaac Schankler is a Los Angeles-based composer-improviser. His recent honors include a grant from Meet the Composer for his opera Light and Power, selection as finalist in the ASCAP/SEAMUS Composition Competition, and the Damien Top Prize in the ASCAP/Lotte Lehmann Foundation Art Song Competition. He is the Artist in Residence of the Music Computation and Cognition Laboratory (MuCoaCo) at the USC Viterbi School of Engineering, and an Artistic Director of the concert series People Inside Electronics. Isaac holds degrees in composition from the USC Thornton School of Music (DMA) and the University of Michigan (MM, BM). Elaine Chew is Professor of Digital Media at Queen Mary, University of London, and Director of Music Initiatives at the Centre for Digital Music. An operations researcher and pianist by training, her research goal is to de-mystify music and its performance through the use of formal scientific methods; as a performer, she collaborates with composers to present eclectic post-tonal music. She received PhD and SM degrees in Operations Research from MIT and a BAS in Music and Mathematical & Computational Sciences from Stanford. She is the recipient of NSF Career and PECASE awards, and a Radcliffe Institute for Advanced Studies fellowship. Alexandre R.J. François’s research focuses on the modeling and design of interactive (software) systems, as an enabling step towards the understanding of perception and cognition. He was a 2007-2008 Fellow of the Radcliffe Institute for Advanced Study at Harvard University, where he co-led a music research cluster on Analytical Listening Through Interactive Visualization. François received the Diplôme d’Ingénieur from the Institut National Agronomique Paris-Grignon in 1993, the Diplôme d’Etudes Approfondies (M.S.) from the University Paris IX - Dauphine in 1994, and the M.S. and Ph.D. degrees in Computer Science from the University of Southern California in 1997 and 2000 respectively. Concert Venue and Time: Lydia Mendelssohn Theatre, Wednesday May 23, 7:00pm
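Editor's sketch: the factor oracle mentioned above is a standard online automaton (Allauzen, Crochemore and Raffinot) widely used for machine improvisation. The construction below is the textbook incremental algorithm; the small random walk at the end only illustrates recombination and is not Mimi's actual strategy, synthesis or visual interface.

# Factor oracle construction plus a toy recombination walk.
import random

def build_factor_oracle(sequence):
    """Return (transitions, suffix_links) for the input symbol sequence."""
    transitions = [dict()]   # transitions[i][symbol] -> target state
    suffix = [-1]            # suffix link of each state
    for i, symbol in enumerate(sequence, start=1):
        transitions.append(dict())
        suffix.append(0)
        transitions[i - 1][symbol] = i       # forward transition
        k = suffix[i - 1]
        while k > -1 and symbol not in transitions[k]:
            transitions[k][symbol] = i       # extra (factor) transition
            k = suffix[k]
        suffix[i] = 0 if k == -1 else transitions[k][symbol]
    return transitions, suffix

def improvise(sequence, length, continuity=0.7, seed=None):
    """Walk the oracle: mostly continue forward, sometimes jump via suffix links."""
    rng = random.Random(seed)
    transitions, suffix = build_factor_oracle(sequence)
    state, output = 0, []
    for _ in range(length):
        if state < len(sequence) and (rng.random() < continuity or suffix[state] <= 0):
            state += 1                        # replay the next original symbol
        else:
            state = min(suffix[state] + 1, len(sequence))   # recombine via suffix link
        output.append(sequence[state - 1])
    return output

if __name__ == "__main__":
    melody = ["C", "E", "G", "E", "C", "G", "A", "G"]
    print(improvise(melody, 16, seed=1))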
@incollection{nime2012-music-SchanklerFrancoisChew2012, author = {Schankler, Isaac and Fran\c{c}ois, Alexandre and Chew, Elaine}, title = {Mimi: Multi-modal Interaction for Musical Improvisation}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Matthias Schneiderbanger and Michael Vierling. 2012. Floating Points II. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: The piece Floating Points II is the result of the continued work of the instrument makers and performers Michael Vierling and Matthias Schneiderbanger within their self-developed system for collaborative performance, including the digital musical instruments Sensor-table and Chirotron. These instruments use several sensors to transform the movements and gestures of their players into data for sound generation, placement and movement of the sound in the room. The performances with Sensor-table and Chirotron emphasize the connection between the performer and the digital musical instruments by using the basic noise of the sensors as a notable characteristic in the sound synthesis, to accentuate the technical boundaries in an aesthetic way. The network is the core of the common setup: it offers the ability to connect two physically separated instruments into one common signal chain for sound processing and spatialisation. Composer(s) Credits: Instrumentalist(s) Credits: Matthias Schneiderbanger (Chirotron), Michael Vierling (Sensor-table) Artist(s) Biography: Matthias Schneiderbanger (*1987), musician and sonic artist, has studied at the Karlsruhe University of Music, Germany, since 2007 and is currently a master’s student in music informatics with an emphasis in composition and sonic arts. His main focus is the development of digital musical instruments, sound installations, contemporary music and live coding. Since 2010 he has also collaborated with M. Vierling on the development of digital musical instruments. Their instruments were presented in 2011 at the Music and Sonic Arts Symposium in Baden-Baden; performances include the Network Music Festival in Birmingham and the ZKM in Karlsruhe. He is a member of the laptop ensemble Benoît and the Mandelbrots, with performances, along with numerous other concerts, at the BEAM Festival in Uxbridge, the SuperCollider Symposium 2012 in London, the Laptops Meet Musicians Festival 2011 in Venice and the next-generation 4.0 Festival at the ZKM in Karlsruhe. He is a member of the Karlsruhe artist collective nil. Michael Vierling is a master’s student in music informatics at the Karlsruhe University of Music, Germany. He is a drummer in several band projects and teaches a drum class at the School for Music and Performing Arts in Bühl, Germany. His main interests, besides producing and performing music, are the sonic arts, especially live electronics, creating digital musical instruments and building sound installations using sensor technologies. Since 2010 he has collaborated with M. Schneiderbanger on the development of digital musical instruments. Their instruments were presented in 2011 at the Music and Sonic Arts Symposium in Baden-Baden; performances include NIME 2012 in Michigan and the Network Music Festival 2012 in Birmingham. His works have been exhibited at various festivals, e.g. ton:art 2010/11, UND 6/7, Sommerloch 2011, the Beyond 3D-Festival in Karlsruhe and the Next Level Conference in Cologne. He is a member of the Karlsruhe artist collective nil. Concert Venue and Time: Lydia Mendelssohn Theatre, Monday May 21, 9:00pm
@incollection{nime2012-music-SchneiderbangerVierling2012, author = {Schneiderbanger, Matthias and Vierling, Michael}, title = {Floating Points II}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Paul Stapleton and Tom Davis. 2012. Ambiguous Devices. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: This performance explores notions of presence and absence, technologically mediated communication and audience perception through the staging of intentionally ambiguous but repeatable sonic interactions taking place across two geographically separate locations. Thanks to SARC, CCRMA & Bournemouth University for support during the development of this project. Composer(s) Credits: Paul Stapleton and Tom Davis Instrumentalist(s) Credits: Paul Stapleton (Networked Instrument), Tom Davis (Networked Instrument) Artist(s) Biography: Paul Stapleton is a sound artist, improviser and writer originally from Southern California, currently based in Belfast, Northern Ireland. Paul designs and performs with a variety of modular metallic sound sculptures, custom made electronics, found objects and electric guitars in locations ranging from experimental music clubs in Berlin to remote beaches on Vancouver Island. He is currently involved in a diverse range of artistic collaborations including: performance duo ABODE with vocalist Caroline Pugh, interdisciplinary arts group theybreakinpieces, improvisation duo with saxophonist Simon Rose, Eric Lyon’s Noise Quartet, and the DIY quartet E=MCHammer. Since 2007, Paul has been on the faculty at the Sonic Arts Research Centre where he teaches and supervises Master’s and PhD research in performance technologies, interaction design and site-specific art. Tom Davis is a digital artist working mainly in the medium of sound installation. His practice and theory based output involves the creation of technology-led environments for interaction. He performs regularly as part of JDTJDJ with Jason Dixon and as part of the Jackson4s. He has performed and exhibited across Europe and in the US. Davis is currently a lecturer at the University of Bournemouth and holds a PhD from the Sonic Arts Research Centre. Concert Venue and Time: Lydia Mendelssohn Theatre, Wednesday May 23, 7:00pm
@incollection{nime2012-music-StapletonDavis2012, author = {Stapleton, Paul and Davis, Tom}, title = {Ambiguous Devices}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Eli Stine. 2012. Motion-Influenced Composition. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: This piece consists of a partially pre-composed acousmatic composition actualized in real time by hand motion. The audio generated by the hand motions is analyzed, colorized and projected beside the performer during the performance. The motions and content of this piece are inspired by the late Merce Cunningham, and this performance is dedicated to him. Composer(s) Credits: Eli Stine Instrumentalist(s) Credits: Eli Stine Artist(s) Biography: Eli Stine (born 1991 in Greenville, NC) is a composer, programmer, and sound designer currently pursuing a Double Degree at Oberlin College, studying Technology In Music And Related Arts and composition in the conservatory and Computer Science in the college. Winner of the undergraduate award from the Society for Electro-Acoustic Music in the United States (SEAMUS) in 2011, Eli has studied with Tom Lopez, Lewis Nielson, and Per Bloland at Oberlin, focusing on electroacoustic and acoustic music, as well as live performance with electronics. While at Oberlin Eli has performed with Oberlin’s Contemporary Music Ensemble, had works played in concert by Oberlin’s Society of Composers, Inc. ensemble and the student ensemble ACADEMY, and collaborated with students and faculty across disciplines on multimedia projects. More information about Eli’s work can be found at www.oberlin.edu/student/estine/. Concert Venue and Time: Lydia Mendelssohn Theatre, Wednesday May 23, 7:00pm
@incollection{nime2012-music-Stine2012, author = {Stine, Eli}, title = {Motion-Influenced Composition}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Koray Tahiroğlu. 2012. InHands: Improvisation for Mobile Phones. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: InHands, an audiovisual real-time improvisation for mobile phones, explores alternative options for musical interactions with two mobile instruments in live performances. In this improvisation piece, the sound output of each mobile phone instrument becomes a sound input for the other, to be processed further in an act of immediate, spontaneous response. A granular synthesis module captures audio in real time and creates grains based on the texture of the sounds. Magnitude, roll and pitch values of the acceleration are mapped to the control parameters. In the control layer of the sub-synthesis module, the change in direction of a touch position is tracked on the mobile surface, and the distances from that touch position to four fixed points on the touchscreen are used as a source for creating frequency values. This mapping model generates four control parameters from two-dimensional input layers. Hannah Drayson created the abstract visual layers of this piece. Composer(s) Credits: Instrumentalist(s) Credits: Koray Tahiroğlu (mobile phones) Artist(s) Biography: Koray Tahiroğlu is a musician, postdoctoral researcher and lecturer in the Department of Media, Aalto University. He practices art as a researcher focusing on embodied approaches to sonic interaction in participative music experience, as well as performing live electronic music. He has conducted artistic research focused on studying and practicing human musical interaction. Tahiroğlu completed the degree of Doctor of Arts in 2008 with the dissertation entitled "Interactive Performance Systems: Experimenting with Human Musical Interaction". He has developed interactive performance systems and experimental musical instruments, which were used in his live performances. Since 2004 he has also been teaching workshops and courses introducing artistic strategies and methodologies for creating computational art works. Tahiroğlu has performed experimental music in collaboration as well as in solo performances in Europe and North America. Concert Venue and Time: Necto, Tuesday May 22, 9:00pm
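The mapping described above can be pictured with a short sketch: distances from the current touch position to four fixed screen points become four frequency values, and accelerometer magnitude, roll and pitch become granular controls. The anchor points, ranges and scaling below are illustrative assumptions, not the constants used in InHands.

```python
# Illustrative sketch of a touch/accelerometer control mapping; the anchor
# points, ranges and scaling are assumptions, not the piece's actual values.
import math

SCREEN_W, SCREEN_H = 320.0, 480.0
ANCHORS = [(0.0, 0.0), (SCREEN_W, 0.0), (0.0, SCREEN_H), (SCREEN_W, SCREEN_H)]
MAX_DIST = math.hypot(SCREEN_W, SCREEN_H)


def touch_to_frequencies(x, y, f_min=80.0, f_max=2000.0):
    """Map one 2-D touch position to four frequency values."""
    freqs = []
    for ax, ay in ANCHORS:
        d = math.hypot(x - ax, y - ay) / MAX_DIST   # normalised distance, 0..1
        freqs.append(f_min + (f_max - f_min) * d)   # linear scaling to Hz
    return freqs


def accel_to_grain_params(ax, ay, az):
    """Map accelerometer magnitude, roll and pitch to granular controls."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return {
        "grain_size_ms": 20 + 180 * min(magnitude / 2.0, 1.0),  # assumed range
        "density": 0.5 + 0.5 * (roll / math.pi),                # 0..1
        "playback_rate": 1.0 + pitch / math.pi,                 # roughly 0.5..1.5
    }


print(touch_to_frequencies(160, 240))
print(accel_to_grain_params(0.1, 0.3, 0.95))
```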
@incollection{nime2012-music-Tahiroglu2012, author = {Tahiro\u{g}lu, Koray}, title = {InHands: Improvisation for Mobile Phones}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Atau Tanaka and Adam Parkinson. 2012. 4 Hands iPhone. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Adam & Atau exploit a commonly available consumer electronics device, a smartphone, as an expressive, gestural musical instrument. The device is well known as an iconic object of desire in our society of consumption, playing music as a fixed commodity. The performers re-appropriate the mobile phone and transform the consumer object into an instrument for concert performance. As a duo, with one in each hand, they create a chamber music: 4-hands iPhone. The accelerometers allow high-precision capture of the performers’ free-space gestures. This drives a granular synthesis patch in Pure Data (PD), where one patch becomes the process by which a range of sounds from the natural world are stretched, frozen, scattered, and restitched. The fact that all system components (sensor input, signal processing and sound synthesis, and audio output) are embodied in a single device makes it a self-contained, expressive musical instrument. Composer(s) Credits: Atau Tanaka and Adam Parkinson Instrumentalist(s) Credits: Artist(s) Biography: Atau Tanaka’s first inspirations came upon meeting John Cage during his Norton Lectures at Harvard; he would go on to re-create Cage’s Variations VII with Matt Wand and :zoviet*france:, performing it in Newcastle upon Tyne, Berlin, and Paris. In the ’90s he formed Sensorband with Zbigniew Karkowski and Edwin van der Heide, then moved to Japan and came in contact with the noise music scene, playing with Merzbow, Otomo, KK Null and others. Atau has released solo, group, and compilation CDs on labels such as Sub Rosa, Bip-hop, Caipirinha Music, Touch/Ash, Sonoris, and Sirr-ecords. His work has been presented at ICC in Japan, Ars Electronica, DEAF/V2, IRCAM, and Transmediale in Europe, and Eyebeam, Wood Street Gallery, and SFMOMA in the U.S. He has been artistic ambassador for Apple, researcher for Sony CSL, artistic co-director of STEIM, and director of Culture Lab Newcastle. He is currently a European Research Council (ERC) fellow at Goldsmiths Digital Studios in London. Adam Parkinson is an electronic musician based in Newcastle, England. He has recently completed a PhD, with much of his research looking at mobile music and performing with iPhones. He has worked alongside various improvisers such as Rhodri Davies, Klaus Filip, Robin Hayward and Dominic Lash, and has been involved in collaborations to create sound installations with Kaffe Matthews and Caroline Bergvall. He also dabbles in making dance music, and is trying to write a perfect pop song. Atau & Adam have been performing as a duo since 2008: first as a laptop/biomuse duo, then in the current iPhone formation. 4-Hands iPhone has so far been performed across Europe and North America, including the FutureEverything Festival (Manchester), Passos Manuel (Porto), Charm of Sound Festival (Helsinki), Electron Festival (Geneva), Mois Multi (Quebec), and Music With A View (New York). Concert Venue and Time: Lydia Mendelssohn Theatre, Monday May 21, 9:00pm
@incollection{nime2012-music-TanakaParkinson2012, author = {Tanaka, Atau and Parkinson, Adam}, title = {4 Hands iPhone}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Shawn Trail, Thor Kell, and Gabrielle Odowichuk. 2012. Måne Havn (mounhoun): An Exploration of Gestural Language for Pitched Percussion. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: Måne Havn (mounhoun) is an improvisational multi-media performance system for extended vibraphone with accompanying custom LED sculptures and projected visuals. The music draws specifically from NYC free jazz, the funeral music of the Lobi people of northern Ghana, Dub, psych rock and minimalism. Abstract interactive light sculptures actuated from the instrument’s audio and controller data will accompany the performance, creating a visually shifting immersive space. The sculptures, named ‘Takete’ and ‘Maluma’, reference Gestalt psychology and the known correlation between our perceptions of sound and light. Mappings will reflect this phenomenon. The piece uses a pitched percussion tool suite developed by the Music Intelligence & Sound Technology Collective at the University of Victoria, including: Magic Eyes (3D gesture controller), Ghost Hands (control data looper), MSTR DRMMR++ (rhythm template as control switches), Fantom Faders (vibraphone bars as control faders) and Gyil Gourd (physical modeling of the Lobi xylophone’s gourd resonator). Composer(s) Credits: Shawn Trail, Thor Kell, Gabrielle Odowichuk (Artistic Director) Instrumentalist(s) Credits: Shawn Trail (extended vibraphone, Notomoton robotic drum, suspended cymbal) Artist(s) Biography: Shawn Trail: Electro-acoustic percussionist Shawn Trail designs and builds new performance technologies for acoustic pitched percussion instruments, integrating musical robotics, physical modeling synthesis, and HCI. He was Control Interface and Robotics Technician for Pat Metheny’s Orchestrion World Tour (2010), a Fulbright Scholar at Medialogy, Aalborg University Copenhagen, researching DSP, synthesis, and HCI (2009), and composer-in-residence with the League of Electronic Musical Urban Robots (2008). In 2002 he conducted field research in Ghana on the Gyil (traditional xylophone). He has a Master of Music in Studio Composition from the Purchase Conservatory of Music and a BA in percussion performance and music technology. He is an Interdisciplinary PhD candidate in Computer Science, Electrical Engineering, and Music with MISTIC at the University of Victoria. Performing solo under the moniker TXTED, his multimedia performance works revolve around minimal, textural, evolving polyrhythmic and melodic ostinati, propelled by a sense of urgency intrinsic to cultural music rituals informed by specific traditions. Thor Kell: As a composer, programmer, and DJ, Thor Kell likes combining interesting things in unique ways. A recent graduate of the University of Victoria’s Music / Computer Science program, he will begin his MA at McGill University in the fall, focusing on interactions between performer, interface, and software. While at UVic, he received a Jamie Cassels Undergraduate Research Award: his research involved prototyping and composing for a gestural control mapping system for extending the marimba. His traditional compositions are all clockwork riffs and hidden structures, based on mathematical constants or time-stretched quotes from the English folk music canon: he has written for everything from full orchestra to solo piano. He has programmed for The Echo Nest and SoundCloud. In his secret life as a DJ and techno maven, he has released chart-toppers on Kompakt, impossibly deep jams on Fade, and hour-long remix / video symphonies on his own label, Tide Pool. Gabrielle Odowichuk is a graduate student in Electrical Engineering at the University of Victoria, working in the MISTIC research lab. 
A specialist in DSP and MIR, she has focused her research on sound spatialization and gesture-based control of sound and music, developing a variety of prototypes, including Fantom Faders and Magic Eyes, the mallet tracking and gesture control applications used in this performance. For Måne Havn (mounhoun), she draws on previous experience in art direction and stage design to produce unique real-time gesture-controlled visualizations. She designed, built, and developed the interactive LED sculptures, Takete and Maluma, used in this piece, as well as the projections. Her work has been published by ICMC, IEEE, and NIME. Concert Venue and Time: Lydia Mendelssohn Theatre, Wednesday May 23, 7:00pm
@incollection{nime2012-music-TrailKellOdowichuk2012, author = {Trail, Shawn and Kell, Thor and Odowichuk, Gabrielle}, title = {M\aa ne Havn (mounhoun): An Exploration of Gestural Language for Pitched Percussion}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Pierre Alexandre Tremblay. 2012. Sandbox#3.6. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: A bass guitar and a laptop. No sequence, no set list, no programme, no gizmo, no intention, no fireworks, no meaning, no feature, no beat, no argument, no nothing. Just this very moment with my meta-instrument: a third sandbox in which I play in public for the sixth time, here, whatever happens. Composer(s) Credits: Instrumentalist(s) Credits: Pierre Alexandre Tremblay Artist(s) Biography: Pierre Alexandre Tremblay (Montréal, 1975) is a composer and a performer on bass guitar and sound processing devices, in solo and within the groups ars circa musicæ (Paris, France), de type inconnu (Montréal, Québec), and Splice (London, UK). His music is mainly released by Empreintes DIGITALes and Ora. He is Reader in Composition and Improvisation at the University of Huddersfield (UK) where he also is Director of the Electronic Music Studios. He previously worked in popular music as producer and bassist, and is interested in videomusic and coding. He likes oolong tea, reading, and walking. As a founding member of the no-tv collective, he does not own a working television set. www.pierrealexandretremblay.com Concert Venue and Time: Necto, Wednesday May 23, 9:00pm
@incollection{nime2012-music-Tremblay2012, author = {Tremblay, Pierre~Alexandre}, title = {Sandbox\#3.6}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
-
Yuta Uozumi, Keisuke Oyama, Jun Tomioka, Hiromi Okamoto, and Takayuki Kimura. 2012. four fragments—A Performance for Swarming Robotics. In Music Proceedings of the International Conference on New Interfaces for Musical Expression, Georg Essl, Brent Gillespie, Michael Gurevich and Sile O’Modhrain (eds.). Electrical Engineering & Computer Science and Performing Arts Technology, University of Michigan, Ann Arbor, Michigan, U.S.A.
Download PDFProgram notes: This performance aims to approach the next style of “mashup” and/or “cut-up” via a fusion of artificial-life and turntable paradigms. We developed a system named “SoniCell” to realize this. SoniCell employs four robots called “cells”. Each cell behaves as a metaphor for life, based on a simple interaction model with a prey-predator relationship. Each cell is assigned a music track in the manner of a turntable. The system therefore reconstructs and mixes the music tracks via the cells’ interactions and the performers’ interventions. In this framework, the interactions between the system and the performers, together with the cells’ internal states, create structures of sound and music from the different tracks. Composer(s) Credits: Yuta Uozumi, Keisuke Oyama, Jun Tomioka, Hiromi Okamoto, Takayuki Kimura Instrumentalist(s) Credits: Artist(s) Biography: Yuta Uozumi is a sound artist and agent-based composer born in the suburbs of Osaka, Japan. He started computer music at the age of fifteen. He received his Ph.D. from the Keio University SFC Graduate School of Media and Governance. He researches and teaches at the Tokyo University of Technology. He studies multi-agent-based dynamic composition with computer or human ensembles. In 2002 his CD “meme?” was released by Cubicmusic Japan (under the name of SamuraiJazz). In 2003 the agent-based musical interface “Chase”, a collaborative project by a system designer, a DSP engineer and a performer, was accepted by NIME. In 2005 an application for agent-based composition, “Gismo”, and a piece created with the system, “Chain” (early version), were accepted by the ICMC (International Computer Music Conference). Keisuke Oyama was born in Kumamoto, Japan, on September 19, 1986. He played various instruments freely in childhood. When he was 18, he moved to Tokyo to study jazz theory. After starting his career as a jazz musician, he participated in various sessions as a guitarist. Furthermore, his interests came to cover electroacoustic music over the course of his career. He enrolled at Keio University Shonan Fujisawa Campus (SFC) in 2009 to learn the methods and techniques of computer music and media art. He is exploring new forms of musical expression. Concert Venue and Time: Necto, Wednesday May 23, 9:00pm
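As a rough illustration of how a prey-predator interaction model can drive a mix, the sketch below steps a Lotka-Volterra-style pair of cell states and maps them to track gains. The equations, constants and gain mapping are generic assumptions, not SoniCell’s actual model.

```python
# A generic prey-predator sketch of how cell interactions could drive track
# mix levels; the constants and the gain mapping are assumptions, not
# SoniCell's implementation.

def step(prey, predator, dt=0.05, a=1.0, b=0.6, c=0.8, d=0.4):
    """One Euler step of a simple prey-predator interaction."""
    prey_new = prey + dt * (a * prey - b * prey * predator)
    predator_new = predator + dt * (d * prey * predator - c * predator)
    return max(prey_new, 0.0), max(predator_new, 0.0)


def states_to_gains(states, ceiling=3.0):
    """Map each cell's internal state to a 0..1 playback gain for its track."""
    return [min(s / ceiling, 1.0) for s in states]


prey, predator = 1.0, 0.5
for _ in range(10):
    prey, predator = step(prey, predator)
    print([round(g, 2) for g in states_to_gains([prey, predator])])
```

A performer intervention would simply perturb one of the states between steps, which then ripples through the coupled dynamics and hence through the mix.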
@incollection{nime2012-music-Uozumi2012, author = {Uozumi, Yuta and Oyama, Keisuke and Tomioka, Jun and Okamoto, Hiromi and Kimura, Takayuki}, title = {four fragments---A Performance for Swarming Robotics}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Essl, Georg and Gillespie, Brent and Gurevich, Michael and O'Modhrain, Sile}, year = {2012}, month = may, day = {21-23}, publisher = {Electrical Engineering \& Computer Science and Performing Arts Technology, University of Michigan}, address = {Ann Arbor, Michigan, U.S.A.} }
2011
-
Sarah Taylor, Maurizio Goina, and Pietro Polotti. 2011. Body Jockey. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFAbout the performers: Sarah Taylor: Dancer and choreographer trained at the Australian Ballet School (Degree in Dance) in Classical, Cunningham and Graham technique, and a scholarship student at the Martha Graham School in New York. Currently working with Cesc Gelabert for the 2011 Grec Festival, Barcelona. Maurizio Goina: Viola player and audio-visual composer. He is currently affiliated with the School of Music and New Technologies of the Conservatory of Trieste, where he is developing, together with Pietro Polotti and with the collaboration of Sarah Taylor, the EGGS system for gesture sonification. Pietro Polotti: Studied piano, composition and electronic music. He has a degree in physics from the University of Trieste. In 2002, he obtained a Ph.D. in Communication Systems from the EPFL, Switzerland. Presently, he teaches Electronic Music at the Conservatory Tartini of Trieste, Italy. He has been part of the EGGS project since 2008.
@inproceedings{nime2011-music-SarahTaylor2011, author = {Taylor, Sarah and Goina, Maurizio and Polotti, Pietro}, title = {Body Jockey}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {http://www.nime.org/proceedings/2019/nime2019_music001.pdf} }
-
Victor Zappi and Dario Mazzanti. 2011. Dissonance. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: Dissonance is an audio/visual performance in which a progressive soundtrack is created along with the exploration of an interactive virtual environment. While real instrument–generated music animates the projected worlds, the two performers are allowed to physically interact with virtual objects, changing their position, shape and color to control music and create new sounds. As the journey continues and the environment introduces new elements and new metaphors, the performers are driven to explore the sonic laws that rule each scenario. Spectators wearing 3D glasses perceive the virtual environment as moving out of the screen and embracing the artists, in choreographies where the real and virtual worlds literally overlap. About the performers: Victor Zappi: PhD student and new media artist. His research focuses on Virtual Reality and its applications in art and live performances. Dario Mazzanti: Computer science engineer and multi-instrumentalist composer. He enjoys writing, recording and playing music, combining his artistic streak with his interest in technology.
@inproceedings{nime2011-music-Zappi2011, author = {Zappi, Victor and Mazzanti, Dario}, title = {Dissonance}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26616186} }
-
Christopher Alden. 2011. REMI Sings. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: REMI Sings is an electroacoustic performance for the bio-inspired Rhizomatic Experimental Musical Interface (REMI) and accordion. REMI is an interactive networked musical organism that receives sonic input from its environment, processes it based on the ever-changing structure of its interior network, and generates a unique musical output. This rhizomatic network is a software structure modelled after the functioning and growth patterns of biological rhizomes, specifically the mycorrhizal associations that form vital nutrient pathways for the majority of the planet’s land-plant ecosystems. The performance REMI Sings highlights this interface’s interactive nature, creating a dialogue between human performer and non-human musical intelligence. About the performer: Christopher Alden: Composer, programmer, and instrumentalist currently studying at New York University’s Interactive Telecommunications Program, where his research focuses on interactive music systems for composition and performance. Before ITP, he received his undergraduate degree in Music Theory and Composition at NYU, where he studied composition under Marc Antonio-Consoli.
@inproceedings{nime2011-music-Alden2011, author = {Alden, Christopher}, title = {REMI Sings}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26619152} }
-
Julien Guillamat, Charles Céleste Hutchins, Shelly Knotts, Norah Lorway, Jorge Garcia Moncada, and Chris Tarren. 2011. BiLE (Birmingham Laptop Ensemble). Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: An open playground for laptop improvisation and performance. BiLE’s performance will focus on semi-structured improvisation, with players creating and manipulating sound using a variety of motion capture devices - iPhones, Wiimotes, and Xbox Kinect. The data captured by each device, along with analysed musical parameters, will be sent out over the shared network, to be used by each performer as they see fit. The aim is to allow players to latch onto other members of the group by mapping the shared data to their own software parameters, creating moments of convergence between the ensemble. BiLE takes an ‘instrumental’ approach to performance, with each performer having their own speaker, sonic identity and spatial location. About the performers: BiLE (Birmingham Laptop Ensemble): A collaborative group of six composers, brought together through their shared interest in live performance and improvisation. BiLE has an open and inclusive attitude towards experimentation with sound, and draws on the members’ wide-ranging musical backgrounds.
@inproceedings{nime2011-music-Guillamat2011, author = {Guillamat, Julien and Hutchins, Charles Céleste and Knotts, Shelly and Lorway, Norah and Moncada, Jorge Garcia and Tarren, Chris}, title = {BiLE (Birmingham Laptop Ensemble)}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26619928} }
-
Yago de Quay and Ståle Skogstad. 2011. Where Art Thou?: Dance Jockey. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: As artists, we have learned that throughout the history of mankind music and technology have co-evolved, shaping — and being shaped by — human expression and creativity. The variety and intricacy of these recombination processes contribute profoundly to the current diversity of performative structures and aesthetics within the arts. Where Art Thou? is a 15-minute theatrical performance where sounds are controlled by sensors on the dancer’s body. Blending a mixture of electronic music and sound effects with dance and acting, this novel act refocuses sensors from simplistic action-to-sound mappings to contextualized aesthetic and dramatic expression. The name reflects the itinerant quality of the stage character as he travels through a world of sounds. About the performers: Yago de Quay: Interactive media artist, musician and researcher based in Porto. His numerous installations and performances focus on user participation that modifies the art piece itself. They always have a strong sonic component and combine technologies to help create new modes of expression. Yago is currently finishing his M.Sc. in Sound Design and Interactive Music at the Faculty of Engineering, University of Porto. Ståle Skogstad: PhD student in the fourMs group at the University of Oslo. His research is focused on using real-time full-body motion capture technology for musical interaction. This includes real-time feature extraction from full-body motion capture data and technical studies of motion capture technologies. He is currently working with the Xsens MVN inertial sensor suit.
@inproceedings{nime2011-music-Quay2011, author = {de Quay, Yago and Skogstad, Ståle}, title = {Where Art Thou?: Dance Jockey}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26619980} }
-
Paul Stapleton, Caroline Pugh, Adnan Marquez-Borbon, and Cavan Fyans. 2011. E=MCH. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFAbout the performers: E=MCH is a recently formed quartet featuring Belfast-based improvisers Paul Stapleton (BoSS & Postcard Weevil), Caroline Pugh (Voice & Analogue Cassette Decks, Zero-input Mixer), Adnan Marquez-Borbon (Feedback Bass Clarinet, Recording Modules & Delay Lines) and Cavan Fyans (DIY Electronics). Memories, distortions of time and place, echoes from analogue delay lengths, solid state samplers, and modified vinyl all help shape the fabric of the music in response to its larger ecology. “Okay so making instruments and playing on them is not new, can’t really see that there is any new thought about how why and what here, but the sound sculpture looks nice.” — Cosmopolitan Paul Stapleton: Sound artist, improviser and writer originally from Southern California, currently lecturing at the Sonic Arts Research Centre in Belfast (SARC). Paul designs and performs with a variety of custom made metallic sound sculptures, electronics and found objects in locations ranging from impro clubs in Cork to abandoned beaches on Vancouver Island. Caroline Pugh: Scottish vocalist and performance artist. She deviously borrows analogue technologies and oral histories to create performances that present imagined constructions of traditional and popular culture. With a background in both folk music and improvisation, she collaborates with people from any discipline and performs in a wide variety of venues including folk clubs, arts venues and cinemas. Adnan Marquez-Borbon: Saxophonist, improviser, computer musician, and composer, currently a PhD student at SARC. His research emphasis is on the roles of learning models and skill development in the design of digital musical instruments. As a musician, his music focuses on improvisation and the electronic manipulation of sounds in real-time. Cavan Fyans: PhD research student, instrument builder, noise maker & improviser. Currently located at SARC, Cavan’s research examines the spectator’s cognition of interaction and performance in communicative interactions with technology. Cavan also devotes time to developing new and innovative ways of breaking cheap electronic toys (Circuit Bending) and (re)constructing circuitry for sonic creation (Hardware Hacking).
@inproceedings{nime2011-music-PaulStapleton2011, author = {Stapleton, Paul and Pugh, Caroline and Marquez-Borbon, Adnan and Fyans, Cavan}, title = {E=MCH}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26620232} }
-
Lauren Sarah Hayes and Christos Michalakos. 2011. Socks and Ammo. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFPerformer notes: Socks and Ammo, for piano, percussion and live electronics, is a new work investigating novel methods of communication between laptop and performer, as well as performer and performer, in an improvisational setting. To enhance traditional aural and visual cues, a network is established between the laptops, providing direction and suggestion to and between performers. Tactile feedback is provided to the performers in the form of tiny vibrations on the skin, opening up a further, yet covert, channel of information to transmit signals and cues, allowing for a more informed and focused performance. About the performers: Lauren Sarah Hayes: Composer and performer from Glasgow. Her recent practice focuses on realizing compositions for piano and live electronics, which unify extended technique, bespoke software and instrument augmentation. Undertaken at the University of Edinburgh, her research investigates audio-haptic relationships as performance strategies for performers of digital music. Christos Michalakos: Composer and improviser from northern Greece. Working predominantly with percussion and live electronics, his music explores relationships between acoustic and electronic sound worlds, through an examination of methods for developing and augmenting his drum kit, forming part of his PhD research at the University of Edinburgh. === Recorded at: 11th International Conference on New Interfaces for Musical Expression. 30 May - 1 June 2011, Oslo, Norway. http://www.nime2011.org
@inproceedings{nime2011-music-Hayes2011, author = {Hayes, Lauren Sarah and Michalakos, Christos}, title = {Socks and Ammo}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26629807} }
-
Bill Hsu and Alain Crevoisier. 2011. Interstices AP. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: Interstices AP is a structured audio-visual solo improvisation, using the multitouch Airplane Controller to manipulate live electronic sound and interactive animations. During the piece, Bill Hsu will be using the Airplane Controller in combination with his PSHIVA particle system software to synthesize and interact with generative sound and animations. The visual component of Interstices AP is a physics-based simulation of a particle system. Particles, images and other components interact with physical gestures in a fluid-like system; the results resemble asymmetric, constantly evolving Rorschach blots that open up a wide range of visual associations. For more details, see Bill Hsu’s poster in the conference proceedings. About the performers: Bill Hsu: Associate Professor of Computer Science at San Francisco State University. His work with real-time audiovisual performance systems has been presented at (among others) SMC 2009 (Porto), Harvestworks Festival 2009 (New York), Fete Quaqua 2008 (London), MIX Festival 2007 and 2009 (New York), and Stimme+ 2006 (Karlsruhe). Alain Crevoisier: Senior researcher at the Music Conservatory of Geneva, Switzerland. He is the founder of Future-instruments.net, a collaborative research network active in the field of new musical interfaces and interactive technologies. Its latest realization is the Airplane Controller, a portable system that makes it possible to transform any flat surface into a multitouch interface.
@inproceedings{nime2011-music-Hsu2011, author = {Hsu, Bill and Crevoisier, Alain}, title = {Interstices AP}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26629820} }
-
Bill Hsu, Håvard Skaset, and Guro Skumsnes Moe. 2011. Flayed/Flock. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFPerformer notes: Flayed/Flock is a structured audio-visual improvisation for three musicians, utilizing live acoustic and electronic sound and interactive animations. The visual component of Flayed/Flock is an enhanced flocking simulation that interacts with real-time audio from the performance of improvising musicians. Abstract patterns develop out of the flocking behavior; the flocks are also able to coalesce into well-defined symbols and forms such as crescents and stars, all while moving in a natural-looking manner consistent with flocking behavior. For more details, see Bill Hsu’s poster in the conference proceedings. About the performers: Bill Hsu: Associate Professor of Computer Science at San Francisco State University. His work with real-time audiovisual performance systems has been presented at (among others) SMC 2009 (Porto), Harvestworks Festival 2009 (New York), Fete Quaqua 2008 (London), MIX Festival 2007 and 2009 (New York), and Stimme+ 2006 (Karlsruhe). Håvard Skaset (guitar) and Guro Skumsnes Moe (bass): The Oslo-based duo works intensively on the borders between improv, noise and rock. Skaset and Moe play in bands including Bluefaced People, Art Directors, Sult, Mirror Trio, SEKSTETT, Telling Stories About Trees and MOE. They have been working with Christian Wolff, Pauline Oliveros, Fred Frith, Ikue Mori, Okkyung Lee, Frode Gjerstad and many more.
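A flocking simulation of the kind described above typically combines cohesion, alignment and separation rules, with an audio feature modulating one of them. The following sketch is a minimal boids-style illustration in which a live amplitude value tightens cohesion; the weights and the audio coupling are assumptions, not the Flayed/Flock implementation.

```python
# Minimal boids-style flocking with an audio-reactive cohesion term; weights
# and the audio coupling are illustrative assumptions.
import random

N = 30
boids = [{"pos": [random.uniform(0, 100), random.uniform(0, 100)],
          "vel": [random.uniform(-1, 1), random.uniform(-1, 1)]} for _ in range(N)]


def update(boids, audio_level, cohesion_w=0.01, align_w=0.05, separate_w=0.1):
    cx = sum(b["pos"][0] for b in boids) / len(boids)
    cy = sum(b["pos"][1] for b in boids) / len(boids)
    avx = sum(b["vel"][0] for b in boids) / len(boids)
    avy = sum(b["vel"][1] for b in boids) / len(boids)
    cohesion = cohesion_w * (1.0 + 4.0 * audio_level)   # louder audio -> tighter flock
    for b in boids:
        # cohesion: steer toward the flock centre
        b["vel"][0] += cohesion * (cx - b["pos"][0])
        b["vel"][1] += cohesion * (cy - b["pos"][1])
        # alignment: match the average velocity
        b["vel"][0] += align_w * (avx - b["vel"][0])
        b["vel"][1] += align_w * (avy - b["vel"][1])
        # separation: push away from very close neighbours
        for other in boids:
            dx = b["pos"][0] - other["pos"][0]
            dy = b["pos"][1] - other["pos"][1]
            if other is not b and dx * dx + dy * dy < 4.0:
                b["vel"][0] += separate_w * dx
                b["vel"][1] += separate_w * dy
        b["pos"][0] += b["vel"][0]
        b["pos"][1] += b["vel"][1]


for frame in range(5):
    update(boids, audio_level=0.3)   # audio_level would come from live amplitude analysis
```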
@inproceedings{nime2011-music-Hsu2011a, author = {Hsu, Bill and Skaset, Håvard and Moe, Guro Skumsnes}, title = {Flayed/Flock}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26629835} }
-
Alex Nowitz. 2011. The Shells. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: Since 2008 I have been performing and composing music for voice and live-electronics using two Wii-remotes as gestural controllers. The live-electronics function in two ways: as an extension of my voice and as an instrument as well. The music creation is mainly based on live-sampling the voice. I also use pre-recorded sounds and my own compositions. In addition, since the beginning of 2010 we have been developing a new instrument, which goes beyond the technical possibilities of the Wii-controllers. I call this instrument the Shells. Besides motion sensors there are three more continuous controllers available: a pressure sensor, a joystick control and ultrasound for distance measurement. About the performers: Alex Nowitz: Composer of vocal, chamber and electronic music as well as music for dance, theatre and opera. Furthermore, he is a voice artist, whistling and singing virtuoso who is classically trained as tenor and countertenor and presents a wide array of diverse and extended techniques. He has been artist in residence at STEIM, Amsterdam, since 2010.
@inproceedings{nime2011-music-Nowitz2011, author = {Nowitz, Alex}, title = {The Shells}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26661484} }
-
Dan Overholt and Lars Graugaard. 2011. Study No. 1 for Overtone Fiddle. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: This generative / improvisatory work uses an iPod Touch and a tactile sound transducer attached to the Overtone Fiddle’s resonant body as a mobile system to lay out a variety of animated and transformed sound sources over time. About the performers: Dan Overholt: Associate Professor in the Department of Architecture, Design and Media Technology at Aalborg University, Denmark. He received a PhD in Media Arts and Technology from the University of California, Santa Barbara, an M.S. from the MIT Media Lab, and studied Music and Electronics Engineering at CSU, Chico. As a musician, he composes and performs internationally with experimental human-computer interfaces and musical signal processing algorithms. Lars Graugaard: Free-lance composer, laptop performer and researcher. He holds a PhD in Artistic and Technological Challenges of Interactive Music from Oxford Brookes University and an MS in flute performance from the Royal Danish Academy of Music. His main interest is the systematic study of music’s expressive capacity, applied to score composing, real-time interactive performance, and generative and emergent music.
@inproceedings{nime2011-music-Overholt2011, author = {Overholt, Dan and Graugaard, Lars}, title = {Study No. 1 for Overtone Fiddle}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26661494} }
-
Ivica Ico Bukvic, John Elder, Hillary Guilliams, et al. 2011. L2Ork. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: 13 (Ivica Ico Bukvic) is a game of prime numbers and primal instincts pitting timbre against rhythm. Driven by the conductor’s oversight over an array of performer-specific and ensemble-wide parameters, a networked ensemble acts as one large meta-tracker where each individual performer contributes their own gesture-driven motives or tracks. The ensuing meta-tracker texture is superimposed against improvised acoustic percussion in search of a meaningful discourse and ultimately musical synergy. Serene (Ivica Ico Bukvic): ...the one moment in the day when the world melts away and we catch a glimpse of life that just is... a celebration of this moment through the juxtaposition of Taiji (Tai Chi Chuan) choreography and music... Citadel for soprano and L2Ork (Ivica Ico Bukvic) draws inspiration from the famous poem "Himna Slobodi" (Hymn to Freedom) by the 17th-century Croatian poet Ivan Gundulic. As the first piece ever written for the newly founded ensemble, it relies upon pervasive tonality, in many ways posing as an electronic counterpart to a traditional string ensemble. Using the infinite-bow metaphor to create lush tonal harmonies, the piece forms a compelling aural foundation for a lyrical showcase of the soloist’s vocal talent. === About the performers: L2Ork: Founded by Dr. Ivica Ico Bukvic in May 2009, L2Ork is part of the latest interdisciplinary initiative by the Virginia Tech Music Department’s Digital Interactive Sound & Intermedia Studio (DISIS). As an emerging contemporary intermedia ensemble with a uniquely open design, L2Ork thrives upon the quintessential form of collaboration found in the western classical orchestra and its cross-pollination with increasingly accessible human-computer interaction technologies, for the purpose of exploring the expressive power of gesture, communal interaction, a discipline-agnostic environment, and the multidimensionality of the arts. Members: Ivica Ico Bukvic (Director), John Elder, Hillary Guilliams, Bennett Layman, David Mudre, Steven Querry, Philip Seward, Andrew Street, Elizabeth Ullrich and Adam Wirdzek === Recorded at: 11th International Conference on New Interfaces for Musical Expression. 30 May - 1 June 2011, Oslo, Norway. http://www.nime2011.org
@inproceedings{nime2011-music-IvicaIcoBukvicDirector2011, author = {Bukvic, Ivica Ico and Elder, John and Guilliams, Hillary and Layman, Bennett and Mudre, David and Querry, Steven and Seward, Philip and Street, Andrew and Ullrich, Elizabeth and Wirdzek, Adam}, title = {L2Ork}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26678669}, url2 = {https://vimeo.com/26678662}, url3 = {https://vimeo.com/26643771} }
-
Akito van Troyer, Jason Freeman, Avinash Sastry, Sang Won Lee, and Shannon Yao. 2011. LOLC. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: In LOLC, the musicians in the laptop orchestra use a textual performance interface, developed specifically for this piece, to create and share rhythmic motives based on a collection of recorded sounds. The environment encourages musicians to share their code with each other, developing an improvisational conversation over time as material is looped, borrowed, and transformed. LOLC was originally created by Akito van Troyer and Jason Freeman and is in active development at the Georgia Tech Center for Music Technology by Jason Freeman, Andrew Colella, Sang Won Lee and Shannon Yao. LOLC is supported by a grant from the National Science Foundation as part of a larger research project on musical improvisation in performance and education (NSF CreativeIT#0855758). About the performers: Aaron Albin, Andrew Colella, Sertan Sentürk and Sang Won Lee are current degree candidates or alumni from the Georgia Tech Center for Music Technology. All are focused on exploring new methods of musical interactivity through projects that involve current technology such as the Kinect, swarm robots, creative video games, and current MIR techniques.
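The kind of textual performance interface described above can be pictured with a toy sketch: short text patterns are parsed into timed sample events that can be looped and shared between players. The syntax, sample names and timing below are invented for illustration and are not LOLC’s actual language.

```python
# A toy illustration of a text-based pattern language in the spirit of the
# description above; the syntax and scheduling model are invented for this
# sketch and are not LOLC's actual language.
SAMPLES = {"k": "kick.wav", "s": "snare.wav", "h": "hat.wav"}


def parse_pattern(pattern, step_ms=250):
    """Turn a space-separated pattern string into (time_ms, sample) events."""
    events = []
    for i, token in enumerate(pattern.split()):
        if token in SAMPLES:                     # '.' or unknown tokens are rests
            events.append((i * step_ms, SAMPLES[token]))
    return events


# a motive a player might type, loop, and then share with the ensemble
motive = "k . s . k k s h"
for t, sample in parse_pattern(motive):
    print(f"{t:4d} ms -> {sample}")
```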
@inproceedings{nime2011-music-Troyer2011, author = {van Troyer, Akito and Freeman, Jason and Sastry, Avinash and Lee, Sang Won and Yao, Shannon}, title = {LOLC}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26678685} }
-
Carles López. 2011. Reactable. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: The Reactable was conceived in 2003 and was first presented at the International Computer Music Conference (ICMC) 2005 in Barcelona. Since then, the Reactable team has given more than 300 presentations and concerts in 40 countries, turning it into one of the most widely acclaimed new musical instruments of the 21st century. Since 2009, the Barcelona spin-off company Reactable Systems has been producing several Reactable models, such as the Reactable Live for traveling musicians and DJs, and its latest incarnation, Reactable Mobile for Apple’s iPhones and iPads. About the performers: Carles López: Musician, producer and DJ born in Barcelona. López has been playing the Reactable for the last three years. With this instrument he has performed in more than 40 countries, at all kinds of events, clubs and festivals. López also works as a composer for films and contemporary dance.
@inproceedings{nime2011-music-Lopez2011, author = {López, Carles}, title = {Reactable}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26678704} }
-
Sarah Nicolls and Nick Gillian. 2011. Improvisation for piano + motion capture system. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: SN: I wanted to get at the closest relationship possible between my hands and the resulting sound. Having worked with sampling and complex processing and various sensors such as EMG, motion capture with live sound as the source seemed a way to really get inside an improvisation system that was really live and really intuitive. You can judge for yourselves! NG: Sarah’s movements are sensed using a Kinect 3D motion capture device and the gestures are recognised in real time using the SEC, a machine learning toolbox that has been specifically developed for musician-computer interaction. About the performers: Sarah Nicolls: UK-based experimental pianist and inventor of the ‘Inside-out piano’; collaborative researcher with e.g. Atau Tanaka, PA Tremblay; concerts e.g. world premieres of Larry Goves’ Piano Concerto, Richard Barrett’s Mesopotamia/London Sinfonietta/BBC Radio; article in LMJ20; Senior Lecturer at Brunel University; funding: Arts and Humanities Research Council (AHRC), Brunel Research Initiative and Enterprise Fund (BRIEF), Arts Council England. Nick Gillian: Post-doctoral researcher currently working on an E.U. project entitled SIEMPRE at the Sonic Arts Research Centre, Belfast. Nick recently completed his PhD in Gesture Recognition for Musician-Computer Interaction under the supervision of R. Benjamin Knapp and Sile O’Modhrain. His interests are in machine learning and pattern recognition and applying these techniques to enable real-time musician-computer interaction.
@inproceedings{nime2011-music-Nicolls2011, author = {Nicolls, Sarah and Gillian, Nick}, title = {Improvisation for piano + motion capture system}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26678719} }
-
Diemo Schwarz and Victoria Johnson. 2011. Suspended Beginnings. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: The performance between electric violinist Victoria Johnson and Diemo Schwarz, playing his interactive corpus-based concatenative synthesis software CataRT, is an improvisation with two brains and four hands controlling one shared symbolic instrument: the sound space, built up from nothing, nourished in unplanned ways by the sound of the instrument, and explored and consumed with whatever the live instant fills it with. It creates a symbiotic relationship between the player of the instrument and the player of the software. Live corpus-based concatenative synthesis permits a new approach to improvisation, in which sound from an instrument is recontextualised by interactive, gesture-controlled software. Not knowing what can happen is an integral part of the performance. (An illustrative code sketch of corpus-based selection follows the citation record below.) About the performers: Victoria Johnson works with electric violin, live electronics, improvisation and music-technological issues in her artistic work. Trained as a classical violinist in Oslo, Vienna and London, she gave her debut recital in Oslo in 1995. She has established herself internationally as a soloist, chamber musician and improviser in contemporary, improvised and experimental, cross-disciplinary music and art. Diemo Schwarz: Researcher and developer at Ircam, composer of electronic music, and musician on drums and laptop with gestural controllers. His compositions and live performances, solo as Mean Time Between Failure or improvising with other musicians, explore the possibilities of corpus-based concatenative synthesis to recontextualise any sound source by rearranging sound units into a new musical framework through interactive navigation of a timbral space.
@inproceedings{nime2011-music-Schwarz2011, author = {Schwarz, Diemo and Johnson, Victoria}, title = {Suspended Beginnings}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26679877} }
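The corpus-based approach described in the programme note can be pictured with a small sketch: incoming sound is chopped into units, each unit is summarised by a few descriptors, and playback selects the stored unit nearest a target point in that descriptor space. The sketch below is not CataRT itself; the segment length, descriptors and selection rule are assumptions made only for illustration.

# Minimal, illustrative sketch of corpus-based concatenative selection.
# This is NOT CataRT (an IRCAM Max/MSP system); it only shows the core idea:
# segment incoming audio into units, describe each unit by a few features,
# and play back the stored unit nearest to a target in descriptor space.
# Segment length and descriptors are arbitrary assumptions.
import numpy as np

SR = 44100
GRAIN = 4096  # samples per unit (assumption)

def describe(grain):
    """Tiny descriptor vector: (loudness, spectral centroid in Hz).
    A real system would normalise descriptors before comparing them."""
    windowed = grain * np.hanning(len(grain))
    mag = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(grain), 1.0 / SR)
    loudness = float(np.sqrt(np.mean(grain ** 2)))
    centroid = float(np.sum(freqs * mag) / (np.sum(mag) + 1e-12))
    return np.array([loudness, centroid])

class Corpus:
    def __init__(self):
        self.units, self.descs = [], []

    def add_live_audio(self, signal):
        """Segment incoming audio into units and store their descriptors."""
        for start in range(0, len(signal) - GRAIN, GRAIN):
            unit = signal[start:start + GRAIN]
            self.units.append(unit)
            self.descs.append(describe(unit))

    def select(self, target_desc):
        """Return the stored unit closest to the requested descriptor point."""
        distances = np.linalg.norm(np.array(self.descs) - target_desc, axis=1)
        return self.units[int(np.argmin(distances))]

# Usage: fill the corpus from the live instrument, then navigate the
# descriptor space (here a fixed target stands in for a gestural controller).
corpus = Corpus()
corpus.add_live_audio(np.random.randn(SR))            # stand-in for live input
out = corpus.select(np.array([0.5, 2000.0]))          # loudness, centroid target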
-
Domenico Sciajno. 2011. Sonolume. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: In this AV performance, images and sound interact: the basic elements of the images (brightness, color, saturation, hue, dislocation and relocation) are sensitive to the fundamental parameters of the sound being generated at that moment. Sound waves (also controlled by light waves during the performance) cross the physical world and alter the data stream that gives life to the digital video, just as molecules are transformed by sound contracting and expanding the air particles in space. About the performer: Domenico Sciajno: Double bass player and composer of acoustic and electronic music. Thanks to his interest in improvisation and the influence of academic education, his research currently focuses on the creative possibilities provided by the interaction between acoustic instruments, indeterminacy factors and live processing by electronic devices or computers.
@inproceedings{nime2011-music-Sciajno2011, author = {Sciajno, Domenico}, title = {Sonolume}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26679879} }
-
Jason Dixon, Tom Davis, Jason Geistweidt, and Alain B. Renaud. 2011. The Loop. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: The Loop explores the possibilities of co-located performance, decentralized composition, and the acoustics of the network. The performance begins with a brief improvisation presenting acoustic sources to excite the network. This material is shared, transformed, and reintroduced into the composition, and the process continues through successive generations until a predetermined time, or a point at which the composition naturally concludes. The result is an integrated meta-instrument and an emergent composition, with no one artist being the sole performer or composer. Remote participants are represented locally by a mono speaker, enabling the audience to hear the transformation of audio through the networked instrument. About the performers: Jason Dixon: Irish composer currently based in Norwich, where he is completing his PhD in composition. His work explores issues of language, perception and memory in music. More recently he has been focusing on the Irish storytelling tradition and its place in contemporary Ireland. Tom Davis: Digital artist working mainly in the medium of sound installation. His practice- and theory-based output involves the creation of technology-led environments for interaction. Davis is currently a lecturer at Bournemouth University and holds a PhD from the Sonic Arts Research Centre, Belfast. Jason Geistweidt: Sound artist based at the University of Tromsø, Norway, researching mixed-reality stages and performance systems. He is a former faculty member of the Interactive Arts and Media department at Columbia College Chicago. He holds a PhD in electro-acoustic composition from the Sonic Arts Research Centre, Queen's University Belfast. Alain B. Renaud: Alain's research focuses on networked music performance systems, with an emphasis on strategies for interacting musically over a network and the notion of shared networked acoustic spaces. He is a lecturer at Bournemouth University, England, and holds a PhD from the Sonic Arts Research Centre.
@inproceedings{nime2011-music-JasonDixon2011, author = {Dixon, Jason and Davis, Tom and Geistweidt, Jason and Renaud, Alain B.}, title = {The Loop}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26679893} }
-
Tone Åse, Siri Gjære, Live Maria Roggen, et al. 2011. Trondheim Voices. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: In this performance Trondheim Voices explores a new tool in its work with vocal sound and improvisation. The ensemble is working with a tracking system for sound positioning, so that a given singer's position on stage directly influences the sound processing, both spatialisation and effects. Through their improvisations and compositions they are exploring: a) the effect of the sound "following" the singers' movements on stage; b) the flexible use of processed voice sound within the large vocal ensemble, through the control each singer gains over the sound output by moving on stage; c) the visualization of choices and changes regarding sound, both for the performer and the audience, through the movements of each singer on stage. (An illustrative sketch of such a position-to-sound mapping follows the citation record below.) About the performers: Trondheim Voices: Professional ensemble working with the endless possibilities of vocal improvisation to find new expressions and new music. Consisting of individual soloists, Trondheim Voices aims to develop what happens when the unique soloistic qualities of the singers interact, and to find a collective sound and feeling. All of the singers were educated at NTNU, Trondheim, Norway. Sound: Asle Karstad. Tracking system: John Torger Skjelstad
@inproceedings{nime2011-music-Aase2011, author = {Åse, Tone and Gjære, Siri and Roggen, Live Maria and Skjerve, Heidi and Lode, Ingrid and Huke, Kirsti and Kaasbøll, Anita and Karlsen, Silje R.}, title = {Trondheim Voices}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26680007} }
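As a rough illustration of the position-to-sound mapping described in the programme note, a tracked stage position can be turned into a pan position and an effect amount. The sketch below is an assumption-laden toy, not the tracking system built for Trondheim Voices; stage dimensions, parameter names and mapping curves are all invented for illustration.

# Illustrative sketch only: map a singer's tracked stage position (x, y in
# metres, origin at the centre of the downstage edge) to a pan position and
# an effect-send level. Stage dimensions and mapping are assumptions, not
# the ensemble's actual tracking system.

STAGE_HALF_WIDTH = 4.0   # metres (assumed)
STAGE_DEPTH = 6.0        # metres (assumed)

def clamp(value, low=0.0, high=1.0):
    return min(max(value, low), high)

def position_to_controls(x, y):
    """Return (pan, reverb_send) for one singer from stage coordinates."""
    # Left/right position becomes a pan value in [0, 1].
    pan = clamp((x + STAGE_HALF_WIDTH) / (2 * STAGE_HALF_WIDTH))
    # Moving upstage brings in more processing; downstage stays drier.
    reverb_send = clamp(y / STAGE_DEPTH)
    return pan, reverb_send

# Example: a singer standing slightly left of centre, mid-stage.
pan, send = position_to_controls(-1.0, 3.0)   # pan = 0.375, send = 0.5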
-
Øyvind Brandtsegg and Carl Haakon Waadeland. 2011. Little Soldier Joe. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: The duo Little Soldier Joe uses percussion and live processing to explore thematic and textural ideas that arise in the improvised interplay between the two performers. LSJ uses live sampling and manipulation matter-of-factly, as an established manner of music making. The audio manipulation techniques used are based on recent developments in particle synthesis. About the performers: Øyvind Brandtsegg: Composer, musician and professor in music technology at NTNU. His focus lies in Compositionally Enabled Instruments, Particle Synthesis and sound installations. Øyvind has performed with the groups Krøyt and Motorpsycho, written music for interactive dance, theatre and TV, and worked as a programmer for other artists. His latest effort in music software programming is the "Hadron Particle Synthesizer", to be released as a device for Ableton Live and as a VST plug-in. Carl Haakon Waadeland: Musician, composer and professor in music at NTNU. His main scientific interest lies within empirical rhythm research and the construction of models that simulate rhythm performance. Waadeland has performed and recorded with, among others, Gary Holton & Casino Steel, Warne Marsh, Siris Svale Band, Mikis Theodorakis & Arja Saijonmaa, Dadafon, and Rasmus og Verdens Beste Band. He published a book and CD on the Norwegian folk drum tradition in 2008.
@inproceedings{nime2011-music-Brandtsegg2011, author = {Brandtsegg, Øyvind and Waadeland, Carl Haakon}, title = {Little Soldier Joe}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/26680018} }
-
Jacob Selle and Stefan Weinzierl. 2011. Licht & Hiebe. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: Licht & Hiebe (2010) is the first concert piece for a new instrument: the "Hexenkessel" (witch's cauldron), a modified 22" timpani that uses LLP technology to turn the drumhead into an intuitive multitouch interface for the control of live electronics and DMX stage lights. The multitouch technique enters into symbiosis with a traditional instrument, keeping its acoustic qualities but opening it to the vast possibilities of interactive multimedia. Besides controlling the live electronics, the instrument interfaces with DMX-controlled stage lights to create intense intermedial fireworks entirely controlled by the performer. The parts needed for this non-destructive timpani hack cost less than $500. About the performers: Jacob Sello (1976, Hamburg, Germany) studied Audio Engineering, Systematic Musicology and Multimedia Composition in Hamburg. He is highly interested in the exciting possibilities that arise from the conjunction of traditional acoustic instruments and state-of-the-art technology; pieces for clarinet-controlled RC helicopters and a DJ-driven, pneumatically prepared Disklavier are among the outcomes. Stefan Weinzierl (1985, Günzburg, Germany) is constantly searching for fascinating challenges beyond genre boundaries, as a drummer in contemporary solo performances, classical ensembles and orchestras as well as in jazz and rock/pop bands. He graduated in educational sciences in Regensburg and completed the Percussion Master programme at the HfMT Hamburg in 2010.
@inproceedings{nime2011-music-Selle2011, author = {Selle, Jacob and Weinzierl, Stefan}, title = {Licht \& Hiebe}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/27687788} }
-
Joshua Clayton. 2011. ROYGBIV. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: Refraction of Your Gaze by Indeterminate Variables (ROYGBIV) is an effort to interface sound and the visible spectrum with digital and analog media. A collage of field recording, synth pads and mechanical noise, ROYGBIV unfolds as wavelengths of light are read with discrete color sensors. Data is communicated through microcontrollers to custom audio software, and a slide projector reproduces images of the natural world. ROYGBIV is concerned with fundamental properties of sensing and perception, and with the technologies that mediate such experience. Metaphysical dimensions of color and sound are implied as the projected image and the rainbow form a dialectic between reflection and refraction. About the performer: Joshua Clayton: New York-based artist whose work occupies a hybrid space of media art and language. His recent projects explore semiotics, mysticism, architecture and the urban landscape, and research-based forms of practice. Joshua has just completed a master's degree in Interactive Telecommunications at New York University.
@inproceedings{nime2011-music-Clayton2011, author = {Clayton, Joshua}, title = {ROYGBIV}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/27690118} }
-
Alexander Dupuis. 2011. All Hail the Dawn. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: An interactive audiovisual feedback loop forms the basis of All Hail the Dawn. The instrument contains two simple light-sensitive oscillators. A crude spectral analysis in Max/MSP is used to filter the oscillators as well as looped buffers recorded from the instrument. A matrix derived from the spectral analysis, interactively altered in Jitter using audio data, is projected back onto the instrument and performer as a series of shifting patterns. This setup allows the graphics and sound to drive each other, creating an evolving audiovisual relationship sensitive to slight changes in position, sound or processing. About the performer: Alexander Dupuis: Composer, performer and multimedia artist. His work involves live electronics and guitar, real-time graphics and 3D animation, feedback systems and audiovisual installations. He graduated from Brown University in 2010 and is currently working towards his master's degree in the Digital Musics program at Dartmouth College.
@inproceedings{nime2011-music-Dupuis2011, author = {Dupuis, Alexander}, title = {All Hail the Dawn}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/27691545} }
-
Doug Van Nort, Pauline Oliveros, and Jonas Braasch. 2011. Distributed Composition #1. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: This piece is written in consideration of two distinct paradigms: telematic music performance and human-machine improvisation. Specifically, the work is a structured improvisation for three humans and one intelligent agent, constrained by sections that determine which pairings (duos, trios) of performers are active. Instrumentation also changes between sections in a way that blurs the line of agency and intent between the acoustic human performers, the laptop/tablet-based human performer, and the agent improviser, as the two remote (NY, Stanford) acoustic performers (V-Accordion, soprano saxophone) engage with the on-stage laptop performer (GREIS system) and the ambient presence of the agent performer (spatialization, loops, textures). About the performers: Doug Van Nort: Experimental musician and digital music researcher whose work includes composition, improvisation, interactive system design and cross-disciplinary collaboration. His writings can be found in Organised Sound and Leonardo Music Journal, among other publications, and his music is documented on Deep Listening, Pogus and other labels. Pauline Oliveros (b. 1932): Composer and improviser who teaches at RPI and plays a Roland V-Accordion in solo and ensemble improvisations. Her works are available as downloads and on cassette, CD, DVD and vinyl. Oliveros founded the Deep Listening Institute, Ltd., based in Kingston, NY. Jonas Braasch: Experimental soprano saxophonist and acoustician with interests in telematic music and intelligent music systems. He has performed with Curtis Bahn, Chris Chafe, Michael Century, Mark Dresser, Pauline Oliveros, Doug Van Nort and Stuart Dempster, among others. He currently directs the Communication Acoustics and Aural Architecture Research Laboratory at RPI.
@inproceedings{nime2011-music-DougVanNort2011, author = {Nort, Doug Van and Oliveros, Pauline and Braasch, Jonas}, title = {Distributed Composition #1}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/27691551} }
-
Satoshi Shiraishi and Alo Allik. 2011. mikro:strukt. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: mikro:strukt is a collaborative performance in which the custom-built e-clambone provides the acoustic source for the ensuing audiovisual environment. The e-clambone is a custom-built electronic instrument consisting of an aerophone fitted with haptic sensors and digital signal processing algorithms. The performance seeks to integrate elements of electro-acoustic improvisation, timbre composition and an artificial-intelligence-based approach to autonomous audiovisual composition, and to explore micro-level timbre composition in real time. About the performers: Satoshi Shiraishi: Electro-acoustic instrument designer/performer from Japan, currently living in The Hague, The Netherlands. He originally started his musical career as a rock guitarist; after encountering computer music, he moved to the Netherlands to pursue his own way of performing computer-generated sound on stage. Alo Allik (Estonia) has a musically and geographically restless lifestyle, which has taken him through diverse musical worlds including DJing and producing electronic dance music, live laptop jams, electroacoustic composition, free improvisation, and audiovisual installations and performances.
@inproceedings{nime2011-music-Shiraishi2011, author = {Shiraishi, Satoshi and Allik, Alo}, title = {mikro:strukt}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/27694202} }
-
Mark Bokowiec and Julie Wilson-Bokowiec. 2011. V’Oct(Ritual). Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: V'Oct(Ritual) places the audience inside a circular liminal space of sonic evocation and features the Bodycoder System©, the first generation of which was developed by the artists in 1995. The Bodycoder interface is a flexible sensor array worn on the body of a performer that sends data generated by movement to an MSP environment via radio. All vocalisations, decision making, navigation of the MSP environment and qualities of expressivity are selected, initiated and manipulated by the performer; uniquely, this also includes gestural control of live 8-channel spatialisation. The piece is fully scored, with few moments of improvisation. About the performers: Julie Wilson-Bokowiec has created new works in opera/music theatre, contemporary dance and theatre, and has worked with Lindsey Kemp, Genesis P-Orridge, Psychic TV and Hermann Nitsch. Julie is a Research Fellow at CeReNem (Centre for Research in New Music) at the University of Huddersfield. Mark Bokowiec is the manager of the electroacoustic music studios and the Spatialisation and Interactive Research Lab at the University of Huddersfield, where he also lectures in interactive performance, interface design and composition. Mark began creating work with interactive technologies in 1995.
@inproceedings{nime2011-music-Bokowiec2011, author = {Bokowiec, Mark and Wilson-Bokowiec, Julie}, title = {V'Oct(Ritual)}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/27694214} }
-
Daniel Schorno and Haraldur Karlsson. 2011. 7-of-12 dialectologies. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: The formalistic identity of "7-of-12" is a showcase format for "penta digit instrumental inventions", diffused in quadraphonic audio and 3D interactive video projection. The dialectic intertwining of Karlsson's abstract art and Schorno's sonic world extends into a composition of 12" duration. The eponymous instrument group "EIG" consists of two former classmates in Sonology, where, among other things, they studied the making of alternative electronic instruments. The performance "7-of-12 dialectologies" is an outcome of collaborative teaching and methodology in dialogue with past performances. About the performers: Daniel Schorno: Composer, born in Zurich in 1963. He studied composition in London with Melanie Daiken, and electronic and computer music in The Hague with Joel Ryan and Clarence Barlow. Invited by Michel Waisvisz, he led STEIM (the renowned Dutch Studio for Electro Instrumental Music and home of "New Instruments") as Artistic Director until 2005. He is currently STEIM's composer-in-research and creative project advisor. Haraldur Karlsson: Visual artist, born in Reykjavik in 1967. Haraldur studied multimedia at the art academy in Iceland, media art at AKI in Enschede, and Sonology at the Royal Conservatoire in The Hague. He focuses mainly on interactive audio/video/3D installations and performances, and on instrumental computer controllers. His fire instrument "TFI" is part of the Little Solarsystem ("LSS") navigation system, an audio/video/3D performance.
@inproceedings{nime2011-music-Schorno2011, author = {Schorno, Daniel and Karlsson, Haraldur}, title = {7-of-12 dialectologies}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/27694220} }
-
Luke Dahl and Carr Wilkerson. 2011. TweetDreams. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: TweetDreams uses real-time Twitter data to generate music and visuals. During the performance, tweets containing specific search terms are retrieved from Twitter. Each tweet is displayed and plays a short melody. Tweets are grouped into trees of related tweets, which are given similar melodies. We invite the audience to participate in TweetDreams by tweeting during the performance with the term #Nime2011; this term is used to identify tweets from the audience and performers. Global search terms are used to bring the world into the performance: any tweet containing these terms, anywhere in the world, becomes part of the piece. (An illustrative sketch of one possible tweet-to-melody mapping follows the citation record below.) About the performers: Luke Dahl: Musician and engineer currently pursuing a PhD at Stanford University's CCRMA. His research interests include new musical instruments and performance ensembles, musical gesture, rhythm perception, and MIR. He has composed works for the Stanford Laptop and Mobile Phone Orchestras and also creates electronic dance music. Carr Wilkerson: System administrator at CCRMA specializing in Linux and Mac OS systems. He is a controller and software-system builder and sometime performer/impresario, instructor and researcher.
@inproceedings{nime2011-music-Dahl2011, author = {Dahl, Luke and Wilkerson, Carr}, title = {TweetDreams}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/27694232} }
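The note's mapping from tweets to short melodies, with related tweets receiving related melodies, could work along the following illustrative lines. This is not the actual TweetDreams implementation: the hashing scheme, scale and "variation" rule are assumptions chosen only to make the idea concrete.

# Illustrative sketch (not the actual TweetDreams code): turn a tweet into
# a short melody, and give tweets that share a search term related melodies
# by deriving a base contour from the term and a variation from the text.
import hashlib

PENTATONIC = [0, 2, 4, 7, 9]   # scale degrees over a root (assumption)
ROOT = 60                      # MIDI middle C (assumption)

def digest_ints(text, n):
    """Deterministically map text to n small integers via a hash."""
    h = hashlib.sha256(text.encode("utf-8")).digest()
    return [b % len(PENTATONIC) for b in h[:n]]

def tweet_to_melody(tweet, search_term, length=8):
    """The base contour comes from the shared search term and the ornament
    from the tweet text, so tweets in the same 'tree' sound like variations
    of one another."""
    base = digest_ints(search_term, length)
    variation = digest_ints(tweet, length)
    degrees = [(b + (v % 2)) % len(PENTATONIC) for b, v in zip(base, variation)]
    return [ROOT + PENTATONIC[d] for d in degrees]

# Example: an audience tweet containing the performance hashtag.
print(tweet_to_melody("Enjoying the concert! #Nime2011", "#Nime2011"))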
-
Yoichi Nagashima. 2011. Ural Power. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: A live computer music (multimedia) work, composed in 2010 and premiered in Russia. For this work the composer developed a new interface system for musical expression, with 8 channels of infrared distance sensors, set up on two mic stands on the stage. The performer also wears a specially developed instrument called MiniBioMuse-III, a 16-channel EMG sensor, during the performance. The graphic part of the work is real-time OpenGL 3D graphics, live-controlled by the performance. The work is programmed in the Max/MSP/Jitter environment. About the performer: Yoichi Nagashima: Composer/researcher/PE, born in 1958 in Japan. Since 1991 he has been the director of the Art & Science Laboratory in Hamamatsu, Japan. He is a professor at Shizuoka University of Art and Culture, Faculty of Design, Department of Art and Science. He was the General Chair of NIME04.
@inproceedings{nime2011-music-Nagashima2011, author = {Nagashima, Yoichi}, title = {Ural Power}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/27731875} }
-
Andrew Stewart. 2011. With Winds (for soprano t-stick). Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: The t-sticks grew out of a collaborative project by Joseph Malloch and composer D. Andrew Stewart at McGill University; the first prototype was completed in 2006. The t-sticks form a family of tubular digital musical instruments, ranging in length from 0.6 metres (soprano) to 1.2 metres (tenor). They have been designed and constructed to allow a large variety of unique interaction techniques, so a significant emphasis is placed on the gestural vocabulary required to manipulate and manoeuvre the instrument. The musical experience for both the performer and the audience is characterised by a unique engagement between the performer's body and the instrument. About the performer: D. Andrew Stewart (Hexagram-MATRALAB, Concordia University, Montreal, Canada): Composer, pianist, clarinettist and digital musical instrumentalist. Stewart has been working in the field of music composition since 1994. Since 2000 he has been pursuing a career in live, gesture-controlled electronics performance, after developing his own sensor-suit.
@inproceedings{nime2011-music-Stewart2011, author = {Stewart, Andrew}, title = {With Winds (for soprano t-stick)}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/28226070} }
-
Tom Mays. 2011. L’instant. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Norwegian Academy of Music.
Download PDFProgram notes: "L’instant" (2011) : Solo performance for Karlax instrument and laptop. Composed and performed by Tom Mays. Originally an 8 channel tape piece, it was completely re-constructed as a live solo for the composer performing on a Karlax instrument – a gestural controller developed by Da Fact in France (see http://www.dafact.com/). Musically, "L’instant" is a musical interpretation of subatomic instantons, employing rotation and layering of parts who’s rhythms and timbres are built out of the combining and crossing of series of numbers... The scenario is roughly “from the big bang to entropy”, and a “surround sound” 5.1 diffusion space is critical to the sense of immersion within the rotating sound objects and textures. About the performer: Tom Mays: composer, computer musician and teacher, teaches at the National Superior Conservatory of Music in Paris, and is currently working on PhD at the University of Paris 8 with Horacio Vaggione. He is especially interested in gestural performance of real-time computer systems for both written and improvised music, as well as in interaction between music and video.
@inproceedings{nime2011-music-Mays2011, author = {Mays, Tom}, title = {L'instant}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Innervik, Kjell Tore and Frounberg, Ivar}, year = {2011}, month = jun, publisher = {Norwegian Academy of Music}, address = {Oslo, Norway}, url = {https://vimeo.com/28238543} }
2010
-
Stephen Barrass and Diane Whitmer. 2010. Baroque Basso Continuo for Cello, Heart (ECG) and Mind (EEG). Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDFProgram notes: The cellist's brain signals and pulse figure against the Baroque basso continuo that they are playing. The cellist wears the state-of-the-art Enobio system, which transmits EEG, ECG and EOG signals from brain activity, eye movements, muscle contractions and pulse to a laptop computer. These signals are mapped into sound in real time with specially designed sonification algorithms. About the performers: Stephen Barrass teaches and researches Digital Design and Media Arts at the University of Canberra. Diane Whitmer is a neuroscientist at Starlab Pty. Ltd. in Barcelona. Geoffrey Gartner is a cellist with Ensemble Offspring in Sydney.
@inproceedings{nime2010-music-Barrass2010, author = {Barrass, Stephen and Whitmer, Diane}, title = {Baroque Basso Continuo for Cello, Heart (ECG) and Mind (EEG)}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Andrew Brown. 2010. A Live Coding Performance. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Brown2010, author = {Brown, Andrew}, title = {A Live Coding Performance}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Nicolas Collins. 2010. Salvage. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Collins2010, author = {Collins, Nicolas}, title = {Salvage}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Nick Collins. 2010. Kinesics. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDFProgram notes: Kinesics is a structure for improvisation; the computer uses extensive machine-listening technology to track the pianist and generates feature-based effects. The computer also guides the pianist to explore actions from a catalogue of gestures, some of which are heavy-handed. A feedback loop is established between the interpretation of sounding and physical gesture. The Computer was born in China in 2009, but eventually found its way to England and into the ownership of a grubby-handed computer musician. Though ostensibly based somewhere near Brighton, it went on to have many adventures around the world, and is grateful to its owner at least for never putting it in hold luggage. Though it suffered an alarming logic-board failure of cataclysmic proportions before even reaching its first birthday, replacement surgery by qualified though over-familiar service personnel saved its life. Philosophical questions remain about the extent to which its current personality is contiguous with the old, as evidenced by various proprietary programs temporarily refusing to believe in their host brain anymore. But it is just happy it can be here tonight to play for you. There will also be a dispensable human being on stage. About the performers: Computer: electronics. Nick Collins: piano.
@inproceedings{nime2010-music-Collins2011, author = {Collins, Nick}, title = {Kinesics}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Jon Drummond. 2010. Jet Stream. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDFProgram notes: The interactive electroacoustics in Jet Stream are created using an underlying virtual model of a flute. This hybrid virtual instrument is controlled through parameters such as bore length, blow intensity, pressure, canal width and labium position. Lamorna's "real" flute sounds are analysed with respect to tone colour, volume envelopes, frequency and spectral content, and these sonic gestures are then mapped to performance parameters for the computer's virtual flute sonification. (An illustrative sketch of such a feature-to-parameter mapping follows the citation record below.) Of course, the virtual flute doesn't have to conform to the physical constraints of the "real world". About the performer: Jon Drummond is a Sydney-based composer, sound artist, programmer, academic and researcher. His creative work spans the fields of instrumental music, electroacoustic, interactive, sound and new media arts. Jon's electroacoustic and interactive work has been presented widely, including at the International Computer Music Conferences (Denmark 1994, Canada 1995, Greece 1997, China 1999, Singapore 2003), Electrofringe, the Totally Huge New Music Festival, the Darwin International Guitar Festival and the Adelaide Festival of Arts. Jon is currently employed as a researcher at MARCS Auditory Laboratories, the University of Western Sydney.
@inproceedings{nime2010-music-Drummond2010, author = {Drummond, Jon}, title = {Jet Stream}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
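One way to picture the mapping described in the programme note, in which analysis of the live flute drives the parameters of a virtual flute model, is the sketch below. The parameter names, ranges and mapping functions are assumptions for illustration, not Drummond's actual patch.

# Illustrative sketch only: map analysed features of the live flute signal
# (level, brightness, detected pitch) onto control parameters of a virtual
# flute physical model. Parameter names and ranges are assumptions.

def map_features_to_virtual_flute(rms, spectral_centroid_hz, f0_hz):
    """Return a parameter dict for a hypothetical physical-model flute."""
    return {
        # Louder playing pushes the virtual blowing intensity harder.
        "blow_intensity": min(rms * 4.0, 1.0),
        # A brighter tone narrows the virtual air canal slightly.
        "canal_width": max(0.2, 1.0 - spectral_centroid_hz / 8000.0),
        # Detected pitch sets the virtual bore length (open-pipe
        # approximation: length = c / (2 * f), with c = 343 m/s).
        "bore_length_m": 343.0 / (2.0 * max(f0_hz, 50.0)),
        # Brightness also nudges the labium position.
        "labium_position": min(spectral_centroid_hz / 8000.0, 1.0),
    }

# Example: a moderately loud A4 with a fairly bright tone.
print(map_features_to_virtual_flute(rms=0.1, spectral_centroid_hz=1800.0, f0_hz=440.0))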
-
Mei-Ling Dubrau and Mark Havryliv. 2010. P[r]o[pri]et[a]ry in[ternet] [Ad]mo[ni]tion[s]. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Dubrau2010, author = {Dubrau, Mei-Ling and Havryliv, Mark}, title = {P[r]o[pri]et[a]ry in[ternet] [Ad]mo[ni]tion[s]}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Charisma Ensemble and Kirsty Beilharz. 2010. Diamond Quills Hyper-Ensemble. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Ensemble2010, author = {Ensemble, Charisma and Beilharz, Kirsty}, title = {Diamond Quills Hyper-Ensemble}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Georg Essl. 2010. Mobile Phone Orchestras presents... Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Essl2010, author = {Essl, Georg}, title = {Mobile Phone Orchestras presents...}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Bukhchuluun Ganburged, Martin Slawig, and Roger Mills. 2010. Ethernet Orchestra - Remote Networked Performance. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDFPerformers: Bukhchuluun Ganburged (fiddle and throat singing), Martin Slawig, Roger Mills
@inproceedings{nime2010-music-Ganburged2010, author = {Ganburged, Bukhchuluun and Slawig, Martin and Mills, Roger}, title = {Ethernet Orchestra - Remote Networked Performance}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Ge Wang. 2010. Stanford Mobile Phone Orchestra (MoPhO). Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-GeWang2010, author = {Wang, Ge}, title = {Stanford Mobile Phone Orchestra (MoPhO)}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Christian Haines. 2010. SOMETHING TO GO HEAR #4. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Haines2010, author = {Haines, Christian}, title = {SOMETHING TO GO HEAR #4}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Mark Havryliv. 2010. Warming for Blackall. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDFProgram notes: This performance draws on a natural feature of a particular class of chaotic oscillators described by Julien Sprott, namely that they require a driving force in order to behave as chaotic attractors. In the unmodified equations the driving forces are introduced mathematically; however, because we calculate the chaotic systems in real time, we can instead use a performer's audio signal as the input force. This class of oscillator exhibits interesting behaviour in response to different input frequencies; in particular, the systems are sensitive to changes in low-frequency tones. This encourages the use of just intonation as a method of determining tuning systems with easily defined difference tones; the scale developed by Kraig Grady features many difference tones in a range that excites the chaotic oscillators. (An illustrative sketch of a driven oscillator of this kind follows the citation record below.) About the performers: Mark Havryliv is a doctoral student developing a haptic musical instrument at the University of Wollongong. Aside from that research, he is interested in the musical possibilities of integrating real-time sonification with other disciplines such as game design and creative writing. Kraig Grady, an Anaphorian now living in Australia, composes almost exclusively for acoustic instruments of his own making or modification, tuned in just intonation. His work is often combined with his Shadow Theatre productions, and has been presented at Ballhaus Naunyn Berlin (Germany), the Chateau de la Napoule (France), the Norton Simon Museum of Art, the UCLA Armand Hammer Museum, the Pacific Asia Museum, the Los Angeles Philharmonic's American Music Weekend and New Music America 1985. He was chosen by Buzz Magazine as one of the "100 coolest people in Los Angeles". Kraig Grady: just-intonation-tuned marimba. Mark Havryliv: saxophone.
@inproceedings{nime2010-music-Havryliv2010, author = {Havryliv, Mark}, title = {Warming for Blackall}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
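The idea of driving a Sprott-style chaotic oscillator with a performer's audio signal rather than a mathematical forcing term can be sketched as follows. The particular jerk equation, its coefficients and the time scaling are assumptions made for illustration (and may or may not behave chaotically for these values); they are not the system used in the piece.

# Illustrative sketch: integrate a driven nonlinear "jerk" oscillator,
# replacing the usual mathematical forcing term with an audio input signal.
# The equation form, coefficients and time scale are assumptions chosen
# for illustration; they are not the system used in the piece.
import numpy as np

def run_driven_jerk(audio, dt=0.01, a=0.6, gain=2.0):
    """Integrate x''' = -a*x'' - x' + x - x**3 + gain*audio(t) with Euler steps.
    dt sets how fast the oscillator evolves per audio sample (tuned by ear).
    Returns a bounded audio-rate output derived from x(t)."""
    x, v, w = 0.1, 0.0, 0.0          # position, velocity, acceleration
    out = np.zeros(len(audio))
    for n, drive in enumerate(audio):
        jerk = -a * w - v + x - x ** 3 + gain * drive
        w += jerk * dt
        v += w * dt
        x += v * dt
        out[n] = np.tanh(x)          # soft clip to keep the output in audio range
    return out

# Example: a low-frequency tone as the driving force (the programme note
# says the systems respond strongly to low-frequency difference tones).
sr = 44100
t = np.arange(sr) / sr
y = run_driven_jerk(0.3 * np.sin(2 * np.pi * 55.0 * t))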
-
Donna Hewitt and Avril Huddy. 2010. Idol. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Hewitt2010, author = {Hewitt, Donna and Huddy, Avril}, title = {Idol}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Holland Hopson. 2010. Life on (Planet). Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDFProgram notes: Life on (Planet) is a work for two rocks and interactive computer processing. The performer clicks and rubs the rocks together in front of a stereo microphone. The computer responds to how the rocks are played, with particular regard to changes in tempo, articulation, volume, and the position of the rocks relative to the left/right stereo field of the microphone. (An illustrative sketch of this kind of stereo-field analysis follows the citation record below.) Complex combinations of (somewhat) controllable sounds arise from the accretion of input sound combined with feedback from the space. About the performer: Holland Hopson is a composer, improviser, and electronic artist. As an instrumentalist he performs on soprano saxophone, clawhammer banjo and electronics. He has held residencies at STEIM, Amsterdam; the Experimental Music Studios, Krakow and Katowice, Poland; the Sonic Arts Research Studio, Vancouver, Canada; LEMURPlex, Brooklyn; and Harvestworks Digital Media Arts, New York, where he developed a sound installation based on Marcel Duchamp's sculpture With Hidden Noise. An avid phonographer, Holland has recorded sounds on four continents and in over a dozen countries. His latest recording is With Hidden Noises, released on Grab Rare Arts (www.grabrarearts.com).
@inproceedings{nime2010-music-Hopson2010, author = {Hopson, Holland}, title = {Life on (Planet)}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
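The piece's sensitivity to where the rocks sit in the microphone's left/right field can be illustrated with a simple level-comparison sketch; the window size, pan formula and onset rule below are assumptions, not Hopson's patch.

# Illustrative sketch only: estimate where a sound source sits in the
# left/right field of a stereo microphone by comparing channel levels, plus
# a crude onset flag that could feed a tempo estimate. Window size and the
# pan formula are assumptions, not the actual performance patch.
import numpy as np

WINDOW = 2048   # analysis window in samples (assumption)

def rms(x):
    return float(np.sqrt(np.mean(x ** 2) + 1e-12))

def analyse_window(left, right, prev_level, threshold=0.05):
    """Return (pan, level, onset) for one stereo analysis window.
    pan runs from -1 (hard left) to +1 (hard right)."""
    l, r = rms(left), rms(right)
    level = 0.5 * (l + r)
    pan = (r - l) / (r + l + 1e-12)
    onset = level > prev_level + threshold
    return pan, level, onset

# Example: a burst of noise that is louder in the right channel.
left = 0.2 * np.random.randn(WINDOW)
right = 0.6 * np.random.randn(WINDOW)
print(analyse_window(left, right, prev_level=0.0))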
-
Holland Hopson. 2010. Banjo & Electronics. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Hopson2011, author = {Hopson, Holland}, title = {Banjo & Electronics}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Andrew Johnston. 2010. Touching Dialogue. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDFProgram notes: This audiovisual work for acoustic instruments and interactive software uses simple models of physical structures to mediate between acoustic sounds and computer-generated sound and visuals. Musicians use their acoustic instruments to playfully interact with a physically modelled virtual sound sculpture projected onto the screen: the sounds they produce reach into the virtual world and grasp, push and hit the sculpture, and in response the structure glows, spins, bounces around and generates its own sounds. The pitch and timbre of the live acoustic sounds are captured and transformed by the virtual sculpture, which sings back in its own way. Each individual object (or mass) in the physical model is linked to a synthesis engine that uses additive and subtractive synthesis techniques to produce a wide range of sonic textures. The oscillator frequencies of the synthesis engines are set by the sounds played by the acoustic musicians, and the volume of sound produced is controlled by the movement of the masses, so the sound sculpture produces evocative sounds clearly linked to the sonic gestures of the performers and the movement of the on-screen sculpture. (An illustrative sketch of this coupling follows the citation record below.) During the performance the physical structure and characteristics of the sculpture are altered: links between masses are cut, the spring tension of the links is changed, and damping is ramped up and down. Thus, while transparency of operation is maintained, the complexity of the interaction between the acoustic and electronic performers and the sound sculpture itself leads to rich, conversational musical interactions. About the performers: Andrew Johnston is a musician and software developer living in Sydney, Australia. He completed a music performance degree at the Victorian College of the Arts in 1995 and has performed with several Australian symphony orchestras and a number of other ensembles. He subsequently completed a Master's degree in Information Technology, and in 2009 a PhD investigating the design and use of software to support an experimental, exploratory approach to live music making. Andrew currently holds the position of Lecturer in the Faculty of Engineering and IT at the University of Technology, Sydney. Phil Slater: trumpet. Jason Noble: clarinet.
@inproceedings{nime2010-music-Johnston2010, author = {Johnston, Andrew}, title = {Touching Dialogue}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
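A stripped-down version of the coupling described in the programme note, in which each mass of a physically modelled structure drives the amplitude of its own oscillator while the analysed pitch sets the oscillator frequencies, might look like the sketch below. The model size, constants and harmonic mapping are assumptions, not Johnston's software.

# Illustrative sketch only: a small mass-spring chain in which each mass
# drives the amplitude of one oscillator, while the analysed pitch of the
# acoustic instrument sets the oscillator frequencies. The model size,
# constants and harmonic mapping are assumptions, not the work's software.
import numpy as np

class SculptureModel:
    def __init__(self, n_masses=8, k=40.0, damping=0.02, dt=1.0 / 200.0):
        self.k, self.damping, self.dt = k, damping, dt
        self.pos = np.zeros(n_masses)
        self.vel = np.zeros(n_masses)

    def step(self, excitation):
        """Advance one control step; `excitation` is a force on the first
        mass, e.g. derived from the level of the acoustic input."""
        left = np.roll(self.pos, 1);  left[0] = 0.0      # fixed end
        right = np.roll(self.pos, -1); right[-1] = 0.0   # fixed end
        force = (self.k * (left - self.pos) + self.k * (right - self.pos)
                 - self.damping * self.vel)
        force[0] += excitation
        self.vel += force * self.dt
        self.pos += self.vel * self.dt
        return np.abs(self.vel)   # per-mass speed

def oscillator_controls(model, detected_f0, excitation):
    """Mass m gets frequency m*f0 (a harmonic stack, an assumption) and an
    amplitude taken from how fast that mass is currently moving."""
    amps = np.clip(model.step(excitation), 0.0, 1.0)
    freqs = detected_f0 * np.arange(1, len(amps) + 1)
    return list(zip(freqs, amps))

# Example control step: a 220 Hz note played at moderate level.
model = SculptureModel()
print(oscillator_controls(model, detected_f0=220.0, excitation=0.5))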
-
Ryo Kanda. 2010. Tennendai no 0m0s. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Kanda2010, author = {Kanda, Ryo}, title = {Tennendai no 0m0s}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Chi-Hsia Lai and Charles Martin. 2010. Strike On Stage. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDFInteractivity between performer and technology is a crucial part of media performance. This is made possible through creative integration of software and hardware devices. The performance work, Strike On Stage, uses such an integration to bridge three aspects of performance: performers’ body movements, audio processing, and projected video. For the NIME 2010 concert performance, we are proposing to present a media work composed for the Strike On Stage instrument demonstrating a wide variety of interactions between the two performers, the instrument itself and the video projection. The instrument for Strike On Stage is a large performance surface for multiple players to control computer based musical instruments and visuals. This concept is a new perspective on Chi-Hsia Lai’s MPhil research project, Hands On Stage (video documentation can be found at <http://www.laichihsia.com/project>), which was a solo, audiovisual performance work created during 2007 and 2008.
@inproceedings{nime2010-music-Lai2010, author = {Lai, Chi-Hsia and Martin, Charles}, title = {Strike On Stage}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Somaya Langley. 2010. ID-i/o. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Langley2010, author = {Langley, Somaya}, title = {ID-i/o}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Eric Lyon and Ben Knapp. 2010. Stem Cells. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Lyon2010, author = {Lyon, Eric and Knapp, Ben}, title = {Stem Cells}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Robert Mackay. 2010. Altered Landscapes. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDFProgram notes: This piece was written for Duo Contour, and uses the following poem 'Under the Slates' by Martin Daws as its inspiration: We are earth people Long have we hidden In the rock heavy heart And harboured our strengths Among the agonies of stone Ours is the granite Wind withered to pinnacles And the whispered secret Passed behind its scream And the dark slate blasted Into fragments of its nature Shattered forgotten bodies Patterned random Heaped on houses Dropped on churches Silenced hymns On buried villages lost to light We mourn our eagles Count our sheep Lay our seed on crusted bed spring Spines shrunk with the gravity Dreams pulled out of star flight Driven back to earth to bone To wakeful vision raw with piling rock Against the sun We are the subjects of a skyline Held in hard embrace Its dark love a sanctuary For our healing. The poem reflects Daws' response to the altered landscape formed by slate quarrying in the village of Bethesda in North Wales. The role of the instrumentalists in this piece is to create a textural accompaniment to the words. Through different sound transformation techniques, the sounds of the instruments are altered in real time to create word-painting effects. Video sequences of the poet himself are juxtaposed against images of the area in question. These images are manipulated in real time by the sound of the instruments themselves. The video imagery, like the music, is intended to reflect the meaning of the text. For this piece, I created my own software tools in Max/MSP/Jitter for live audio/video interaction. About the performer: Rob Mackay is a composer, sound artist and performer. He obtained a degree in Geology and Music at the University of Keele, studying composition there with Mike Vaughan, before going on to complete a Master's and PhD with Andrew Lewis at the University of Wales, Bangor. Currently he is a lecturer in Creative Music Technology at the University of Hull, Scarborough Campus, and is the course director. Recent projects have moved towards a cross-disciplinary approach, including theatre, audio/visual installation work, and human/computer interaction. Prizes and honours include: IMEB Bourges (1997 and 2001); EAR99 from Hungarian Radio (1999); Confluencias (2003); La Muse en Circuit (2004 and 2006). His work has received over 100 performances in 16 countries (including several performances on BBC Radio 3). He has held composer residencies at Slovak Radio (Bratislava), La Muse en Circuit (Paris), and the Tyrone Guthrie Arts Centre (Ireland). He has played, written and produced in a number of bands and ensembles, including the Welsh hip-hop collective "Tystion", with whom he collaborated alongside John Cale on the film 'A Beautiful Mistake', as well as recording two John Peel sessions on BBC Radio 1 and supporting PJ Harvey. More recently, he has done session work for Gowel Owen and Euros Childs. Six CDs including his compositions are available. More information and pieces at: www.myspace.com/robflute www.digital-music-archives.com
@inproceedings{nime2010-music-Mackay2010, author = {Mackay, Robert}, title = {Altered Landscapes}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Thor Magnusson. 2010. Ixi Lang Performance. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Magnusson2010, author = {Magnusson, Thor}, title = {Ixi Lang Performance}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Christopher Martinez. 2010. Radio Healer. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Martinez2010, author = {Martinez, Christopher}, title = {Radio Healer}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Chikashi Miyama. 2010. Black Vox. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Miyama2010, author = {Miyama, Chikashi}, title = {Black Vox}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Garth Paine. 2010. Grace Space. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDFProgram notes: Grace Space is a new work for clarinet and real-time electronic transformation. It plays with sonic space and non-space: the grace note is used to define relationships of transition from a forgotten or distant space to a familiar space, a known pitch. The piece contemplates memory, the experience of snapping out of a daydream, from distant imaginings or recollections to the real space, the events right in front of you. The real-time electronic transformation makes the space fluid and introduces a heightened depth of perspective. Surround-sound spatialization techniques are also used to bring the sound of the clarinet off the stage and around the audience, subverting the audience-as-spectator relationship to one where the audience is at the core of the work, the position the performer usually occupies. About the performers: Garth Paine has exhibited immersive interactive environments in Australia, Europe, Japan, the USA, Hong Kong and New Zealand. He is on the organizing and peer review panels for the International Conference on New Interfaces for Musical Expression (NIME) and the International Computer Music Conference. He has twice been guest editor of the journal Organised Sound (Cambridge University Press) for special editions on interactive systems in music and sound installation. He is often invited to run workshops on interactivity for musical performance and commissioned to develop interactive systems for real-time musical composition for dance and theatre performances. He was selected as one of ten creative professionals internationally for exhibition in the 10th New York Digital Salon: DesignX Critical Reflections, and as a millennium leader of innovation by the German Keyboard Magazine in 2000. Dr Paine was awarded the Australia Council for the Arts New Media Arts Fellowship in 2000 and the RMIT Innovation Research Award in 2002. He is a member of the advisory panel for the Electronic Music Foundation and one of 17 advisors to the UNESCO-funded Symposium on the Future, which is developing a taxonomy/design space of electronic musical instruments. Recently Dr Paine has been invited to perform at the Agora Festival, Centre Pompidou, Paris (2006) and the New York Electronic Arts Festival (2007), and in 2009 will perform in Sydney, Melbourne, Perth, Limerick (Ireland), New York City, Montreal and Quebec (Canada), and Phoenix, Arizona. In 2008 Dr Paine received the UWS Vice-Chancellor's Excellence Award for Postgraduate Research Training and Supervision. Jason Noble: clarinet.
@inproceedings{nime2010-music-Paine2010, author = {Paine, Garth}, title = {Grace Space}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Stéphane Perrin and Utako Shibatsuji. 2010. The Ningen Dogs Orchestra. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Perrin2010, author = {Perrin, Stéphane and Shibatsuji, Utako}, title = {The Ningen Dogs Orchestra}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Bob Pritchard and Marguerite Witvoet. 2010. What Does A Body Know? Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Pritchard2010, author = {Pritchard, Bob and Witvoet, Marguerite}, title = {What Does A Body Know?}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Anthony Ptak. 2010. Live Bar-Coding. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Ptak2010, author = {Ptak, Anthony}, title = {Live Bar-Coding}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Robert Ratcliffe and Jon Weinel. 2010. Mutations. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF Program notes: "Mutations" is an interactive work exploring notions of the DJ set and the remix through the integration of various streams of piano-based material in live performance. Incorporating human and machine-generated material, a “realization” of the piece involves the management of a pool of audio files, MIDI files, and score fragments, which are drawn upon during performance. In this way, the performer is required to control and shape the various streams of material in the same way that a DJ would select and combine records during the structuring of a live set (an alternative realization of ‘Mutations’ may involve the playback of mixed material, in which the trajectory of the narrative has been determined in advance). The supply of audio files, MIDI files, and score fragments used in the construction of the piece takes existing works from the piano repertoire as source material, both transformed and quoted intact, resulting in a spectrum of recognizability ranging from the easily identifiable, to the ambiguous, to the non-referential. The integration of this borrowed material within the three strands of the piece highlights various connections between traditional forms of musical borrowing, transformative imitation, improvisation, electroacoustic sound transformation and quotation, EDM sampling practices, remix practices and DJ performance. This version of ‘Mutations’ features a pre-recorded electronic part, realized using a software application created by Jon Weinel. About the performers: Robert Ratcliffe is currently completing a PhD in composition (New Forms of Hybrid Musical Discourse) at Keele University (UK). He is the first composer to develop a musical language based on the cross-fertilization of contemporary art music and electronic dance music (EDM). http://www.myspace.com/visionfugitive. Zubin Kanga - Piano; Robert Ratcliffe - Electronics; Jon Weinel - Software Author
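For readers unfamiliar with this kind of "DJ set" structuring, the sketch below is a loose illustration of how a pool of pre-prepared fragments could be cued and layered during performance; the categories and file names are hypothetical and are not taken from Ratcliffe and Weinel's software.

import random

# Hypothetical pool of pre-prepared material, grouped by recognisability,
# purely to illustrate the "DJ set" structuring described above.
POOL = {
    "identifiable":    ["chopin_quote.wav", "satie_fragment.mid"],
    "ambiguous":       ["stretched_chords.wav", "prepared_piano_loop.wav"],
    "non-referential": ["granular_texture.wav", "noise_bed.wav"],
}

class FragmentDeck:
    def __init__(self, pool):
        self.pool = pool
        self.playing = []                    # fragments currently layered

    def cue(self, category):
        """Select a fragment from one stream of material, like pulling a record."""
        fragment = random.choice(self.pool[category])
        self.playing.append(fragment)
        return fragment

    def drop(self, fragment):
        """Remove a fragment from the running mix."""
        self.playing.remove(fragment)

deck = FragmentDeck(POOL)
deck.cue("identifiable")
deck.cue("non-referential")
print(deck.playing)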
@inproceedings{nime2010-music-Ratcliffe2010, author = {Ratcliffe, Robert and Weinel, Jon}, title = {Mutations}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Robert Sazdov and Giuseppe Torre. 2010. MOLITVA. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Sazdov2010, author = {Sazdov, Robert and Torre, Giuseppe}, title = {MOLITVA}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Greg Schiemer. 2010. Mandala 9. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Schiemer2010, author = {Schiemer, Greg}, title = {Mandala 9}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Alexander Schubert. 2010. Laplace Tiger. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Schubert2010, author = {Schubert, Alexander}, title = {Laplace Tiger}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Michael Schunior. 2010. HAITIAN HAARPS. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Schunior2010, author = {Schunior, Michael}, title = {{HAITIAN HAARPS}}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Andrew Sorensen. 2010. Live Coding Improvisation. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Sorensen2010, author = {Sorensen, Andrew}, title = {Live Coding Improvisation}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
-
Sebastian Tomczak and Poppi Doser. 2010. Antia. Music Proceedings of the International Conference on New Interfaces for Musical Expression, University of Technology Sydney.
Download PDF
@inproceedings{nime2010-music-Tomczak2010, author = {Tomczak, Sebastian and Doser, Poppi}, title = {Antia}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {}, year = {2010}, month = jun, publisher = {University of Technology Sydney}, address = {Sydney, Australia} }
2008
-
Pascal Baltazar. 2008. Pyrogenesis. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: The composition of Pyrogenesis took inspiration from several aspects of blacksmithing, not in a literal way, but rather as a set of correspondences. First, the gesture, by which the blacksmith models the matter continuously; striking, heating, twisting, soaking metals to gradually print a form into them. Then, the tool: just like the blacksmith manufactures his own tools, I work on developing my own electro-acoustic instrument: an instrument to write sound, in space and with a gestural input. Lastly, the organic construction of the form: Gilles Deleuze says "Why is the blacksmith a musician? It is not simply because the forging mill makes noise, it is because music and metallurgy are haunted by the same problem: metallurgy puts matter in a state of continuous variation, just as music is haunted by putting sound in a state of continuous variation, and by founding in the sound world a continuous development of form and a continuous variation of matter". From a more technical point of view, the interaction with the performer uses two interfaces: a Wacom tablet, and a set of force-sensing resistors (through an analog-to-digital converter), whose common point is that they both allow control by the pressure of the hands, and thus offer a very “physical” mode of control. The composition/performance environment consists of a set of generative audio modules, fully addressable and presettable, including a mapping engine that allows a quick yet powerful set of mapping strategies from controller inputs and volume envelopes to any parameter, including those of the mappers themselves, allowing a very precise, flexible, and evolving sound/gesture relationship in time. The composition was realized through a constant dialogue between improvisations along a pre-determined trajectory and subsequent listening to the produced result. Thus, most of the details of the composition were generated by an improvisation/learning-through-repetition process, without any visual support, emphasizing expressivity while keeping a very direct relationship to the musical gesture. About the performer: Pascal Baltazar is a composer and research coordinator at GMEA, National Center for Musical Creation in Albi, France. His research focuses on spatial and temporal perception of sound, and its relationship to the body and musical gesture. He coordinates the Virage research platform on control and scripting of novel interfaces for artistic creation and the entertainment industries, funded by the French Research Agency within its Audiovisual and Multimedia program for the 2008-2009 period. He is an active member of the Jamoma collective. He studied Aesthetics (Master of Philosophy thesis The sonic image: material and sensation, 2001, Toulouse III, France) and electroacoustic composition at the National Conservatoire of Toulouse. He has since been involved as a composer or interactive designer in diverse artistic projects: concerts, performing arts shows and interactive installations. He has been commissioned for musical works by several institutions, such as the French State, INA-GRM, GMEA, IMEB... and has participated in international festivals (Présences Électroniques, Paris / Radio France Festival, Montpellier / Synthèse, Bourges / Videomedeja, Novi Sad / Space + Place, Berlin...).
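As a rough illustration of the kind of mapping engine described in the note above, the sketch below routes normalised pressure values from a tablet or force-sensing resistors to arbitrary synthesis parameters; the parameter names, ranges and transfer curve are assumptions, not Baltazar's actual environment.

# Hypothetical sketch of a mapping layer from pressure controllers
# (Wacom tablet, force-sensing resistors) to synthesis parameters.

class Mapping:
    def __init__(self, source, target, lo, hi, curve=1.0):
        self.source, self.target = source, target
        self.lo, self.hi, self.curve = lo, hi, curve

    def apply(self, value):                 # value is normalised 0..1
        shaped = value ** self.curve        # simple transfer curve
        return self.target, self.lo + shaped * (self.hi - self.lo)

class MappingEngine:
    def __init__(self):
        self.mappings = []                  # the active mapping preset

    def add(self, mapping):
        self.mappings.append(mapping)

    def update(self, source, value):
        """Route one controller value to every parameter it is mapped to."""
        return [m.apply(value) for m in self.mappings if m.source == source]

engine = MappingEngine()
engine.add(Mapping("tablet_pressure", "grain_density", 5.0, 200.0, curve=2.0))
engine.add(Mapping("fsr_1", "filter_cutoff_hz", 80.0, 8000.0))

print(engine.update("tablet_pressure", 0.5))   # [('grain_density', 53.75)]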
@inproceedings{nime2008-music-Baltazar2008, author = {Baltazar, Pascal}, title = {Pyrogenesis}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Mark A. Bokowiec and Julie Wilson-Bokowiec. 2008. The Suicided Voice. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: The Suicided Voice is the second piece in the Vox Circuit Trilogy, a series of interactive vocal works completed in 2007. In this piece the acoustic voice of the performer is “suicided” and given up to digital processing and physical re-embodiment. Dialogues are created between acoustic and digital voices. Gender-specific registers are willfully subverted and fractured. Extended vocal techniques make available unusual acoustic resonances that generate rich processing textures and spiral into new acoustic and physical trajectories that traverse culturally specific boundaries, crossing from the human into the virtual, from the real into the mythical. The piece is fully scored; there are no pre-recorded soundfiles used and no sound manipulation external to the performer’s control. In The Suicided Voice the sensor interface of the Bodycoder System is located on the upper part of the torso. Movement data is mapped to live processing and manipulation of sound and images. The Bodycoder also provides the performer with real-time access to processing parameters and patches within the MSP environment. All vocalisations, decisive navigation of the MSP environment and Kinaesonic expressivity are selected, initiated and manipulated by the performer. The primary expressive functionality of the Bodycoder System is Kinaesonic. The term Kinaesonic is derived from the compound of two words: Kinaesthetic, meaning the movement principles of the body, and Sonic, meaning sound. In terms of interactive technology the term Kinaesonic refers to the one-to-one mapping of sonic effects to bodily movements. In our practice this is usually executed in real-time. The Suicided Voice was created in residency at the Banff Centre, Canada and completed in the electro-acoustic music facilities of the University of Huddersfield. About the performers: Mark Bokowiec (Composer, Electronics & Software Designer): Mark is the manager of the electro-acoustic music studios and the new Spatialization and Interactive Research Lab at the University of Huddersfield. Mark lectures in interactive performance, interface design and composition. Composition credits include: Tricorder, a work for two quarter-tone recorders and live MSP, commissioned by Ensemble QTR. Commissions for interactive instruments include: the LiteHarp for the London Science Museum and A Passage To India, an interactive sound sculpture commissioned by Wakefield City Art Gallery. CD releases include: Route (2001), the complete soundtrack, on MPS and Ghosts (2000) on Sonic Art from Aberdeen, Glasgow, Huddersfield and Newcastle, also on the MPS label. Mark is currently working on an interactive hydro-acoustic installation. Julie Wilson-Bokowiec (vocalist/performer, video and computer graphics): Julie has created new works in opera/music theatre, contemporary dance and theatre including: Salome (Hammersmith Odeon – Harvey Goldsmith/Enid production), Suspended Sentences (ICA & touring), Figure Three (ICA) for Julia Bardsley, Dorian Grey (LBT/Opera North), Alice (LBT) and a variety of large-scale site-specific and Body Art works. As a performer and collaborator Julie has worked with such luminaries as Lindsay Kemp, Genesis P-Orridge and Psychic TV and the notorious Austrian artist Hermann Nitsch. Julie and Mark began creating work with interactive technologies in 1995, developing the first generation of the Bodycoder System in 1996.
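A much-reduced sketch of what a "Kinaesonic" one-to-one mapping with patch navigation could look like is given below; the sensor names, effects and patch names are hypothetical and do not describe the actual Bodycoder protocols.

# Hypothetical one-to-one "Kinaesonic" mapping: each body sensor drives
# exactly one sonic effect, and a switch gesture steps through patches.

EFFECT_MAP = {                       # continuous sensors -> one effect each
    "left_elbow_bend":  "ring_mod_depth",
    "right_elbow_bend": "granular_pitch_shift",
    "chest_stretch":    "reverb_mix",
}

PATCHES = ["dialogue", "fracture", "resolution"]

class Bodysuit:
    def __init__(self):
        self.patch_index = 0

    def continuous(self, sensor, value):
        """Return the (effect, value) pair a movement controls, one-to-one."""
        return EFFECT_MAP[sensor], value

    def next_patch(self):
        """A switch gesture steps to the next processing patch."""
        self.patch_index = (self.patch_index + 1) % len(PATCHES)
        return PATCHES[self.patch_index]

suit = Bodysuit()
print(suit.continuous("left_elbow_bend", 0.42))   # ('ring_mod_depth', 0.42)
print(suit.next_patch())                          # 'fracture'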
@inproceedings{nime2008-music-Bokowiec2008, author = {Bokowiec, Mark A. and Wilson-Bokowiec, Julie}, title = {The Suicided Voice}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Mark A. Bokowiec and Julie Wilson-Bokowiec. 2008. Etch. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: Etch is the third work in the Vox Circuit Trilogy (2007). In Etch, extended vocal techniques, Yakut and Bel Canto singing, are coupled with live interactive sound processing and manipulation. Etch calls forth fauna, building soundscapes of glitch infestations, howler tones, clustering sonic-amphibians, and swirling flocks of synthetic granular flyers. All sounds are derived from the live acoustic voice of the performer. There are no pre-recorded soundfiles used in this piece and no sound manipulation external to the performer’s control. The ability to initiate, embody and manipulate both the acoustic sound and multiple layers of processed sound simultaneously on the limbs requires a unique kind of perceptual, physical and aural precision. This is particularly evident at moments when the source vocal articulations of the performer, unheard in the diffused soundscape, enter as seemingly phantom sound cells, pitch-changed, fractured and heavily processed. In such instances the sung score and the diffused, physically manipulated soundscape seem to separate, and the performer is seen working in counterpoint, articulating an unheard score. Etch is punctuated by such separations and correlations, by choric expansions, intricate micro-constructions and moments when the acoustic voice of the performer soars over and through the soundscape. Although the Bodycoder interface configuration for Etch is similar to that of The Suicided Voice, located on the upper torso, the functional protocols and qualities of physical expressivity are completely different. Interface flexibility is a key feature of the Bodycoder System and allows for the development of interactive works unrestrained by interface limitations or fixed protocols. The flexibility of the interface does, however, present a number of challenges for the performer, who must be able to adapt to new protocols and adjust and temper her physical expressivity to the requirements of each piece. The visual content of both Etch and The Suicided Voice was created in a variety of 2D and 3D packages using original photographic and video material. Images are processed and manipulated using the same interactive protocols that govern sound manipulation. Content and processing are mapped to the physical gestures of the performer. As the performer conjures extraordinary voices out of the digital realm, so she weaves a multi-layered visual environment combining sound, gesture and image to form a powerful ’linguistic intent’. Etch was created in residency at the Confederation Centre for the Arts on Prince Edward Island, Canada, in June 2007. About the performers: Mark Bokowiec (Composer, Electronics & Software Designer). Mark is the manager of the electro-acoustic music studios and the new Spatialization and Interactive Research Lab at the University of Huddersfield. Mark lectures in interactive performance, interface design and composition. Composition credits include: Tricorder, a work for two quarter-tone recorders and live MSP, commissioned by Ensemble QTR. Commissions for interactive instruments include: the LiteHarp for the London Science Museum and A Passage To India, an interactive sound sculpture commissioned by Wakefield City Art Gallery. CD releases include: Route (2001), the complete soundtrack, on MPS and Ghosts (2000) on Sonic Art from Aberdeen, Glasgow, Huddersfield and Newcastle, also on the MPS label. Mark is currently working on an interactive hydro-acoustic installation.
Julie Wilson-Bokowiec (vocalist/performer, video and computer graphics). Julie has created new works in opera/music theatre, contemporary dance and theatre including: Salome (Hammersmith Odeon – Harvey Goldsmith/Enid production), Suspended Sentences (ICA & touring), Figure Three (ICA) for Julia Bardsley, The Red Room (Canal Café Theatre), nominated for the Whitbread London Fringe Theatre Award, Dorian Grey (LBT/Opera North), Alice (LBT) and a variety of large-scale site-specific and Body Art works. As a performer and collaborator Julie has worked with such luminaries as Lindsay Kemp, Genesis P-Orridge and Psychic TV and the notorious Austrian artist Hermann Nitsch. She guest lectures in digital performance at a number of university centres and, together with Mark, regularly publishes articles on interactive performance practice. Julie and Mark began creating work with interactive technologies in 1995, developing the first generation of the Bodycoder System, an on-the-body sensor interface that uses radio to transmit data, in 1996. They have created and performed work with the Bodycoder System at various events and venues across Europe, the US and Canada and at artist gatherings including ISEA and ICMC. Major works include Spiral Fiction (2002), commissioned by Digital Summer (the cultural programme of the Commonwealth Games, Manchester); Cyborg Dreaming (2000/1), commissioned by the Science Museum, London; Zeitgeist at the KlangArt Festival; and Lifting Bodies (1999) at the Trafo, Budapest, as featured artists at the Hungarian Computer Music Foundation Festival NEW WAVES, supported by the British Council.
@inproceedings{nime2008-music-Bokowiec2009, author = {Bokowiec, Mark A. and Wilson-Bokowiec, Julie}, title = {Etch}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Thomas Ciufo. 2008. Silent Movies: an Improvisational Sound / Image Performance. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: Silent Movies is an attempt to explore and confront some of the possible relationships / interdependencies between visual and sonic perception. In collaboration with a variety of moving image artists, this performance piece complicates visual engagement through performed / improvised sound. In a sense, Silent Movies plays with the live soundtrack idea, but from a somewhat different vantage point. Or maybe it is an inversion; a visual accompaniment to an improvised sonic landscape? For this performance, I will use a hybrid extended electric guitar / computer performance system, which allows me to explore extended playing techniques and sonic transformations provided by sensor-controlled interactive digital signal processing. For tonight’s performance, the moving image composition is by Mark Domino (fieldform.com). For more information, please refer to the online documentation: Guitar performance system: http://ciufo.org/eighth_nerve_guitar.html Performance documentation: http://ciufo.org/silent_movies.html About the performer: Thomas Ciufo is an improviser, sound / media artist, and researcher working primarily in the areas of electroacoustic improvisational performance and hybrid instrument / interactive systems design, and is currently serving as artist-in-residence in Arts and Technology at Smith College. Recent and ongoing sound works include three meditations, for prepared piano and computer, the series sonic improvisations #N, and eighth nerve, an improvisational piece for prepared electric guitar and computer. Recent performances include off-ICMC in Barcelona, Visiones Sonoras in Mexico City, the SPARK festival in Minneapolis, the International Society for Improvised Music conference in Ann Arbor, and the Enaction in Arts conference in Grenoble.
@inproceedings{nime2008-music-Ciufo2008, author = {Ciufo, Thomas}, title = {Silent Movies: an Improvisational Sound / Image Performance}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Jon Drummond. 2008. Sonic Construction. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: Inspired by the swirls, vortices and lemniscate-like patterns created by moving water and other fluids, Sonic Construction uses the movement of coloured dyes in a semi-viscous liquid to generate and control sound. The work is performed by dropping different coloured dyes (red, green, yellow, blue) into a clear glass vessel filled with water, made slightly viscous through the addition of a sugar syrup. Through the use of video tracking, the speed, colour and spatial location of the different coloured drops of dye are analysed as they are dropped into the glass vessel and subsequently expand, swirl, coil and entwine in the water. The control data derived from the video tracking of the ink drops is used to define both the shape and the way in which individual grains of sound are combined using FOF (Fonction d'Onde Formantique, translated as Formant Wave-Form or Formant Wave Function) synthesis [1] [2], to create a rich and varied timbral sound environment. In developing Sonic Construction I sought to create a system that would provide a sense of connection with the interactive processes being employed and, at the same time, a system over which I had only limited direct control; ideally being influenced by the system’s responses as much as I was influencing the system. Timbres produced by the system include bass-rich pulse streams, vocal textures and a variety of bell-like sounds. The fluid movement of the coloured dye in the liquid is further used to spatialise the outputs of the FOF synthesis. The video captured of the dyes in the liquid, used for motion analysis and colour matching, is also projected back into the performance space, slightly processed using contrast, saturation and hue effects. About the performer: Jon Drummond is a Sydney-based composer and performer. His creative work spans the fields of instrumental music, electroacoustic, interactive, sound and new media arts. Jon’s electroacoustic and interactive work has been presented widely, including at the International Computer Music Conferences (Denmark 1994, Canada 1995, Greece 1997, China 1999, Singapore 2003), Electrofringe, the Totally Huge New Music Festival, the Darwin International Guitar Festival and the Adelaide Festival of Arts. Many of his acoustic and electronic compositions have been commissioned and performed by leading Australian performers and ensembles including austraLYSIS, The Song Company, Ros Dunlop and Kathleen Gallagher. Recently Jon has been exploring the use of environmental signals from the natural world as generative devices for creating electroacoustic sound - video tracking the fluid motions of water in "Sonic Construction" and the motion of air through the use of kites in "Sounding the Winds".
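To make the mapping concrete, the sketch below shows one way that tracked features of a dye drop (colour, speed, position) might be translated into control values for a FOF-style grain stream; the parameter names and ranges are assumptions rather than Drummond's implementation.

# Hypothetical mapping from video-tracked dye-drop features to FOF-style
# grain parameters, loosely following the description above.

def dye_to_fof(hue, speed, x, y):
    """hue 0..1, speed in pixels/frame, x/y normalised 0..1 in the frame."""
    fundamental_hz = 40.0 + hue * 400.0            # colour chooses the pulse rate
    formant_hz     = 200.0 + y * 2800.0            # vertical position sets the formant centre
    bandwidth_hz   = 20.0 + min(speed, 50.0) * 8.0 # faster drops -> broader, noisier grains
    pan            = x                             # horizontal position -> spatial position
    return {"fundamental_hz": fundamental_hz,
            "formant_hz": formant_hz,
            "bandwidth_hz": bandwidth_hz,
            "pan": pan}

print(dye_to_fof(hue=0.6, speed=12.0, x=0.25, y=0.8))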
@inproceedings{nime2008-music-Drummond2008, author = {Drummond, Jon}, title = {Sonic Construction}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Stuart Favilla, Joanne Cannon, and Tony Hicks. 2008. Heretic’s Brew. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: Bent Leather Band introduces their new extended-instrument project, Heretic's Brew. The aim of this project is to develop an extended line-up and build a larger ensemble. So far the project [quintet] has developed a number of new extended saxophone controllers and is currently working on trumpets and guitars. Their instruments are based on Gluion OSC interfaces: field-programmable gate array devices with multiple configurable inputs and outputs. For NIME08, the trio will demonstrate their instruments, language and techniques through ensemble improvisation. About the performers: Joanne Cannon, composer/improviser, is one of Australia’s leading bassoonists. Although she began her career as a professional orchestral musician, she now works as a composer and improviser, exploring extended techniques. Stuart Favilla has a background in composition and improvisation. Together they form the Bent Leather Band, a duo that has been developing experimental electronic instruments for over twenty years in Australia. Bent Leather Band blurs virtuosity and group improvisation across a visual spectacle of stunning original instruments. These were made in conjunction with Tasmanian leather artist Garry Greenwood. The instruments include fanciful dragon-headed Light-Harps, leather Serpents and Monsters that embody sensor interfaces, synthesis and signal processing technology. Practicable and intuitive instruments, they have been built with multi-parameter control in mind. Joint winners of the Karl Sczuka Preis, their work as Bent Leather has gained selection at Bourges and won the IAWM New Genre Prize. Inspired by the legacy of Percy Grainger’s Free Music, i.e. “music beyond the constraints of conventional pitch and rhythm” [Grainger, 1951], Bent Leather Band has strived to develop a new musical language that exploits the potential of synthesis/signal processing, defining new expressive boundaries and dimensions while also connecting with the heritage of Grainger’s musical discourse. Grainger conceived his music towards the end of the 19th century, and spent in excess of fifty years bringing his ideas to fruition through composition for theremin ensemble, the development of sixth-tone instruments [pianos and klaviers], the development of polyphonic reed instruments for portamento control and a series of paper-roll, score-driven electronic oscillator instruments. Tony Hicks enjoys a high-profile reputation as Australia’s most versatile woodwind artist. Equally adept on saxophones, flutes and clarinets, his abilities span a broad spectrum of music genres. A student of Dr. Peter Clinch, Tony also studied at the Eastman School of Music. He has performed throughout Australia, and across Europe, the United States, Japan and China, with a number of leading Australian ensembles including the Australian Art Orchestra, Elision, and the Peter Clinch Saxophone Quartet. He has performed saxophone concertos with the Melbourne Symphony Orchestra, and soloed for Stevie Wonder and his band. As a jazz artist he has performed and recorded with leading jazz figures Randy Brecker and Billy Cobham and notable Australian artists Paul Grabowsky, Joe Chindamo and David Jones, and has also led a number of important groups in the local Australian scene. An explorer of improvised music, he consistently collaborates with numerous artists both in Australia and overseas.
@inproceedings{nime2008-music-Favilla2008, author = {Favilla, Stuart and Cannon, Joanne and Hicks, Tony}, title = {Heretic's Brew}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Nicola Ferrari. 2008. The Bow is Bent and Drawn. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: Based on the installation "Mappe per Affetti Erranti", designed and developed by Antonio Camurri, Corrado Canepa, Nicola Ferrari and Gualtiero Volpe, with texts from Edmund Spenser’s The Faerie Queene and William Shakespeare’s King Lear, with the support of the EU ICT Project SAME. The Bow is a theatrical mise-en-scène of the installation Mappe per Affetti Erranti. During the Science Festival 2007, as a preparatory work for the EU ICT Project SAME on active listening (www.sameproject.org), the audience was invited to explore and experience a song by John Dowland (see the paper in these proceedings by Camurri et al.). The audience could walk inside the polyphonic texture, listen to the single parts, and change the expressive quality of the musical interpretation through their movement on the stage of Casa Paganini, analysed with EyesWeb XMI. Aesthetically, the most interesting result consists in the game of hiding and revealing a known piece. The idea can be matched with the classical theatrical topos of recognition. So, the musical potential of the ’interactive performance’ of pre-recorded music becomes a new dramaturgical structure. Roberto Tiranti and his madrigal group recorded, under the supervision of Marco Canepa, different anamorphic interpretations of a Bach chorale. Thanks to the interactive application developed with EyesWeb XMI, the group of dancers conducted by the choreographer Giovanni Di Cicco mixes and moulds the recorded musical material in real time. At the same time, the live sound of the vocal group explores the whole space of Casa Paganini, as a global (both real and imaginary) musical instrument. In a metamorphic game where, according to Corrado Canepa’s compositional lesson, electronic and acoustic technologies merge and interchange their specificity, this interactive score of losing and finding, multiplying and distilling the ancient Bachian palimpsest tries to tell the dramatic history of King Lear, the most tragic Western figure of the difficulty of reaching the affects one possesses without being able to know or express them. About the performers: Nicola Ferrari was born in 1973. He studied composition with Adriano Guarnieri and took his degree at the ’G. B. Martini’ Conservatory in Bologna. He took his Master's degree and PhD from the Faculty of Arts and Philosophy at the University of Genoa. Since 2005 he has been a member of the staff of the InfoMus Lab. For many years he directed the ’S. Anna’ polyphonic choir. He has written scores for theatrical performances. Vocalists - Roberto Tiranti (tenor and vocal conductor), Valeria Bruzzone (alto), Chiara Longobardi (soprano), Edoardo Valle (bass). Dancers - Giovanni Di Cicco (choreography), Luca Alberti, Filippo Bandiera, Nicola Marrapodi. Recording engineer and music consultant - Marco Canepa. Sound Engineers - Corrado Canepa (director), Chiara Erra (assistant). EyesWeb interactive systems design - Paolo Coletta, Barbara Mazzarino, Gualtiero Volpe
@inproceedings{nime2008-music-Ferrari2008, author = {Ferrari, Nicola}, title = {The Bow is Bent and Drawn}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Georg Essl, Ge Wang, and Henri Penttinen. 2008. MoPho – A Suite for a Mobile Phone Orchestra. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: The Mobile Phone Orchestra is a new repertoire-based ensemble using mobile phones as the primary musical instrument. The MoPhO Suite contains a selection of recent compositions that highlight different aspects of what it means to compose for and perform with such an instrument in an ensemble setting. Brief program note: The Mobile Phone Orchestra of CCRMA (MoPhO) presents an ensemble suite featuring music performed on mobile phones. Far beyond ring-tones, these interactive musical works take advantage of the unique technological capabilities of today’s hardware, transforming phone keypads, built-in accelerometers, and built-in microphones into powerful and yet mobile chamber meta-instruments. The suite consists of a selection of representative pieces: ***Drone In/Drone Out (Ge Wang): human players, mobile phones, FM timbres, accelerometers. ***TamaG (Georg Essl): TamaG is a piece that explores the boundary of projecting the humane onto mobile devices while at the same time displaying the fact that they are deeply mechanical and artificial. It explores the question of how much control we have in the interaction with these devices, or whether the device itself at times controls us. The piece works with the tension between these positions and crosses the desirable and the alarming, the human voice with mechanical noise. The alarming effect has a social quality and spreads between the performers. The sounding algorithm is the non-linear circle map, which is used in easier-to-control and hard-to-control regimes to evoke the effects of control and desirability on the one hand and the loss of control and mechanistic function on the other (a minimal sketch of this map follows these notes). ***The Phones and Fury (Jeff Cooper and Henri Penttinen): how much damage can a single player do with 10 mobile phones? Featuring loops, controllable playback speed, and solo instruments. ***Chatter (Ge Wang): the audience is placed in the middle of a web of conversations... About the performers: Ge Wang received his B.S. in Computer Science in 2000 from Duke University, his PhD (soon) in Computer Science (advisor Perry Cook) in 2008 from Princeton University, and is currently an assistant professor at Stanford University in the Center for Computer Research in Music and Acoustics (CCRMA). His research interests include interactive software systems (of all sizes) for computer music, programming languages, sound synthesis and analysis, music information retrieval, new performance ensembles (e.g., laptop orchestra) and paradigms (e.g., live coding), visualization, interfaces for human-computer interaction, interactive audio over networks, and methodologies for education at the intersection of computer science and music. Ge is the chief architect of the ChucK audio programming language and the Audicle environment. He was a founding developer and co-director of the Princeton Laptop Orchestra (PLOrk), the founder and director of the Stanford Laptop Orchestra (SLOrk), and a co-creator of the TAPESTREA sound design environment. Ge composes and performs via various electro-acoustic and computer-mediated means, including with PLOrk/SLOrk, with Perry as a live coding duo, and with Princeton graduate student and comrade Rebecca Fiebrink in a duo exploring new performance paradigms, cool audio software, and great food. Georg Essl is currently Senior Research Scientist at Deutsche Telekom Laboratories at TU-Berlin, Germany.
He works on mobile interaction, new interfaces for musical expression and sound synthesis algorithms that are abstract mathematical or physical models. After he received his Ph.D. in Computer Science at Princeton University under the supervision of Perry Cook, he served on the faculty of the University of Florida and worked at the MIT Media Lab Europe in Dublin before joining T-Labs. Henri Penttinen was born in Espoo, Finland, in 1975. He completed his M.Sc. and PhD (Dr. Tech.) degrees in Electrical Engineering at the Helsinki University of Technology (TKK) in 2002 and 2006, respectively. He conducted his studies and teaches about digital signal processors and audio processing at the Department of Signal Processing and Acoustics (until 2007 known as the Laboratory of Acoustics and Signal Processing) at TKK. Dr. Penttinen was a visiting scholar at the Center for Computer Research in Music and Acoustics (CCRMA), Stanford University, during 2007 and 2008. His main research interests are sound synthesis, signal processing algorithms, musical acoustics, and real-time audio applications in mobile environments. He is one of the co-founders and directors, with Georg Essl and Ge Wang, of the Mobile Phone Orchestra of CCRMA (MoPhO). He is also the co-inventor, with Jaakko Prättälä, of the electro-acoustic bottle (eBottle). His electro-acoustic pieces have been performed around Finland, in the USA, and in Cuba. Additional composer biography: Jeffrey Cooper is a musician / producer from Bryan, Texas. Having worked as a programmer and DJ for a number of years, he is currently finishing a Master's degree in Music, Science, and Technology at Stanford University / CCRMA. Co-composer of music for mobile phones with the honorable Henri Penttinen.
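TamaG's sounding algorithm, the non-linear circle map mentioned in the programme note above, is a standard iterated map; a minimal sketch of running it in an easier-to-control regime (small coupling K) versus a hard-to-control regime (large K) might look like the following, with the audio scaling an assumption rather than the piece's actual implementation.

import math

def circle_map(theta0=0.2, omega=0.33, k=0.8, n=8):
    """Iterate the standard circle map:
       theta_{n+1} = (theta_n + omega - (k / (2*pi)) * sin(2*pi*theta_n)) mod 1.
       Small k gives near-periodic, controllable behaviour; large k pushes
       the map into chaotic, hard-to-control regimes."""
    theta = theta0
    out = []
    for _ in range(n):
        theta = (theta + omega - (k / (2 * math.pi)) * math.sin(2 * math.pi * theta)) % 1.0
        out.append(2.0 * theta - 1.0)   # recentre to -1..1 as a crude audio sample
    return out

print(circle_map(k=0.5))   # tame, "desirable" regime
print(circle_map(k=3.0))   # chaotic, "mechanical" regime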
@inproceedings{nime2008-music-GeWang2008, author = {Essl, Georg and Wang, Ge and Penttinen, Henri}, title = {MoPho – A Suite for a Mobile Phone Orchestra}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Roberto Girolin. 2008. Lo specchio confuso dall’ombra. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: Lo specchio confuso dall’ombra can be translated as “The mirror confused by its shadow”; it is something between a distributed installation and a concert, in which opposing groups of performers in two remote places play solo or interact. The audience (two people at a time, one for each installation) activates video and sound transformations, depending on the space they occupy and their gesture. The two installations are in the Foyer and in the Auditorium, respectively, so the two persons from the audience cannot see or talk to each other. Multimodal data and expressive gesture cues are extracted in real-time by an EyesWeb patch, interacting and playing with the electronic performer. The interaction occurs both between the electronic performer and the two places where the audience has access, and between the two remote installations. There are two different levels of intervention in the audio and video transformation: autonomous, depending on the single person, and conditioned, depending on the behaviour and the actions occurring in the other, separate installation. Further, the entrance of the concert hall has microphones, which capture words, sentences, coughs, laughs or other noise, which are transformed in real-time and thus enter into the piece. Lo specchio confuso dall’ombra does not require the audience to remain seated or to follow a specific pattern of behaviour. Its duration is indefinite: it changes every time it is performed. About the performers: Roberto Girolin (1975) was born in Pordenone, Italy, and after studying classical guitar he began to study piano and composition at the "J. Tomadini" Conservatory in Udine. He studied vocal and instrumental counterpoint, graduating in choral music and conducting at the same Conservatory. He has conducted many choirs and orchestras, exploring different kinds of repertoires from Gregorian chant to contemporary music. He deepened the study of contemporary music at the University of Udine with Dr. A. Orcalli and then with Dr. N. Venzina at the "B. Maderna" Archive in Bologna (Italy). He has followed several masterclasses and seminars: choral music, chamber music, composition (Salvatore Sciarrino, Fabio Nieder, Mauro Bonifacio), electronic music (Lelio Camilleri, Agostino Di Scipio), a Sound Design course with Trevor Wishart, a course on Audio Digital Signal Processing for Musical Applications (lab workshop, lessons and applications) with Giuseppe Di Giugno, and live electronics in Luigi Nono’s works with Alvise Vidolin and André Richard (Experimental Studio Freiburg für Akustische Kunst). He graduated with full marks in Electronic Music and Multimedia at the Musical Academy of Pescara (Italy) and in 2006 he also got his degree at the Conservatory of Venice under the direction of Alvise Vidolin with full marks (cum laude). He is actively involved in performing and investigating the compositional and performance potential offered by electronic & multimedia music systems. His music is performed in Italy and abroad. He recently won the “Call 2007” (Italian CEMAT Competition) and a Mention at the 34th "Concours Internationaux de Musique et d’Art Sonore Electroacoustiques de Bourges", France. Paolo Coletta, Simone Ghisio and Gualtiero Volpe - EyesWeb interactive systems design
@inproceedings{nime2008-music-Girolin2008, author = {Girolin, Roberto}, title = {Lo specchio confuso dall'ombra}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Keith Hamel, François Houle, and Aleksandra Dulic. 2008. Intersecting Lines. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: Intersecting Lines is a collaboration between clarinetist François Houle, interactive video artist Aleksandra Dulic and computer music composer Keith Hamel. The work grew out of Dulic’s research in visual music and involves mapping a live clarinet improvisation onto both the visual and audio realms. In this work an intelligent system for visualization and signification is used to develop and expand the musical material played by the clarinet. This system monitors and interprets various nuances of the musical performance. The clarinetist’s improvisations, musical intentions, meanings and feelings are enhanced and extended, both visually and aurally, by the computer system, so that the various textures and gestures played by the performer have corresponding visuals and computer-generated sounds. The melodic line, as played by the clarinet, is used as the main compositional strategy for visualization. Since the control input is based on a classical instrument, the strategy is based on calligraphic line drawing using artistic rendering: the computer-generated line is drawn in 3D space and rendered using expressive painterly and ink drawing styles. The appearance of animated lines and textures portrays a new artistic expression that transforms a musical gesture onto a visual plane. Kenneth Newby made contributions to the development of the animation software. This project was made possible with the generous support of the Social Sciences and Humanities Research Council of Canada. About the performers: François Houle has established himself as one of Canada’s finest musicians. His performances and recordings transcend the stylistic borders associated with his instrument in all of the diverse musical spheres he embraces: classical, jazz, new music, improvised music, and world music. As an improviser, he has developed a unique language, virtuosic and rich with sonic embellishments and technical extensions. As a soloist and chamber musician, he has actively expanded the clarinet’s repertoire by commissioning some of today’s leading Canadian and international composers and premiering over one hundred new works. An alumnus of McGill University and Yale University, François has been an artist-in-residence at the Banff Centre for the Arts and the Civitella Ranieri Foundation in Umbria, Italy. Now based in Vancouver, François is a leader in the city’s music community and is considered by many to be Canada’s leading exponent of the clarinet. Keith Hamel is a Professor in the School of Music, an Associate Researcher at the Institute for Computing, Information and Cognitive Systems (ICICS), a Researcher at the Media and Graphics Interdisciplinary Centre (MAGIC) and Director of the Computer Music Studio at the University of British Columbia. Keith Hamel has written both acoustic and electroacoustic music and his works have been performed by many of the finest soloists and ensembles both in Canada and abroad. Many of his recent compositions focus on interaction between live performers and computer-controlled electronics. Aleksandra Dulic is a media artist, theorist and experimental filmmaker working at the intersections of multimedia and live performance, with research foci in computational poetics, interactive animation and cross-cultural media performance. She has received a number of awards for her short animated films.
She is active as a new media artist, curator, writer and educator, teaching courses, presenting art projects and publishing papers across North America, Australia, Europe and Asia. She received her Ph.D. from the School of Interactive Art and Technology, Simon Fraser University, in 2006. She is currently a postdoctoral research fellow at the Media and Graphics Interdisciplinary Centre, University of British Columbia, funded by the Social Sciences and Humanities Research Council of Canada (SSHRC).
@inproceedings{nime2008-music-Hamel2008, author = {Hamel, Keith and Houle, François and Dulic, Aleksandra}, title = {Intersecting Lines}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Giorgio Klauer. 2008. Tre Aspetti del Tempo per Iperviolino e Computer. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: By putting a distance sensor under the scroll of the instrument and an inclination sensor on the wrist, the displacements of the performer's limbs can be detected. These displacements, drawn onto a Cartesian plane, give the coordinates of a track in an ideal performing space, whose third dimension is built up and formed by the passing of time. The computer makes it possible to bind the sounding path proposed by the performer to this track, and hence to rehear it. In the latter case too, the coordinates used to access it are given by the current gestures, so the dimension of time becomes bundled, somewhat like a parchment palimpsest: the sounding form returned by the computer grows increasingly dense and inexplicable, and needs an electroacoustic exegesis to unleash it, at least in shreds. The procedures of musical production are here a metaphor for knowledge; so are the compositional methods at the root of the score, which, in providing the prescriptions of the musical path, also portray a mental track. About the performer: Giorgio Klauer studied electronic music, instrumental composition, flute and musicology in Trieste, where he was born in 1976, in Cremona and in Liège. He is a professor at the Conservatory of Como, school of music and sound technologies.
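One way to picture the "palimpsest" behaviour described above is a spatial buffer in which each gesture is written into the cell it visits, and revisiting a cell returns every layer already written there; the sketch below is a loose illustration under that assumption, not Klauer's actual system.

from collections import defaultdict

class GesturePalimpsest:
    """Stores sound events at quantised (x, y) gesture coordinates; reading
    the same coordinates later returns every layer accumulated there."""
    def __init__(self, resolution=16):
        self.resolution = resolution
        self.layers = defaultdict(list)

    def _cell(self, x, y):                 # x, y normalised 0..1
        q = self.resolution - 1
        return (int(x * q), int(y * q))

    def write(self, x, y, event):
        self.layers[self._cell(x, y)].append(event)

    def read(self, x, y):
        return list(self.layers[self._cell(x, y)])

p = GesturePalimpsest()
p.write(0.10, 0.90, "harmonic A")        # first pass over a region
p.write(0.11, 0.89, "sul ponticello")    # later pass lands in the same cell
print(p.read(0.10, 0.90))                # ['harmonic A', 'sul ponticello']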
@inproceedings{nime2008-music-Klauer2008, author = {Klauer, Giorgio}, title = {Tre Aspetti del Tempo per Iperviolino e Computer}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Martin Messier and Jacques Poulin-Denis. 2008. The Pencil Project. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: The Pencil Project is a performance piece created by sound artists Martin Messier and Jacques Poulin-Denis. Their intention was to craft a live electronic music piece inspired by the physicality of writing and the imagery it articulates. The performers translate scribbling, scratching, dotting and drawing with pencils into music. The computers are hidden and untouched throughout the piece, allowing object manipulation and the creation of sound to be the performers’ main focus. The Pencil Project is about musicianship. Liberated from the computer screen and equipped with hands-on objects, the performers explore a new form of expressivity. Through an authentic and stimulating performance, the musicians bring computer music intimately close to playing an actual musical instrument. About the performers: Martin Messier: Holding a diploma in drums for jazz interpretation, Martin Messier has completed a bachelor’s degree in electroacoustic composition at the University of Montreal and De Montfort University in England. Recently, Martin founded a solo project called « et si l’aurore disait oui... », through which he develops live electroacoustic performance borrowing stylistic elements from intelligent dance music, acousmatic music and folk. Based on strong aptitudes for rhythm, Martin’s aesthetic can be defined as a complex, left-field and happily strange sound amalgam, constantly playing with construction and deconstruction. Jacques Poulin-Denis is active in projects that intersect theater, dance and music. He completed his undergraduate studies in electroacoustic composition at the University of Montreal and De Montfort University in England. Most of his music was composed for theater and dance. Jacques explores innovative ways of presenting electro-acoustic music. Jacques’ musical style is evocative and filled with imagery. Combining traditional and electronic instruments with anecdotal sound sources of everyday life, he creates vibrant music that is fierce and poetic.
@inproceedings{nime2008-music-Messier2008, author = {Messier, Martin and Poulin-Denis, Jacques}, title = {The Pencil Project}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Chikashi Miyama. 2008. Keo Improvisation for sensor instrument Qgo. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: "Keo" is a performance for voice improvisation, the Qgo sensor instrument, and live electronics. The author attempts to realize three concepts in the work. The first is "dual-layered control," in which the performer improvises phrases by singing and providing sound materials for a computer; simultaneously, he sends commands to the computer to process the vocals using a pair of sensor devices worn on both hands. The second is the close connection between the visuality of the performance and the musical gestures. In most parts of the performance, the movement of the sensor instrument and the musical parameters are clearly connected. If the performer moves his hand even slightly, particular aspects of the sound are influenced in an obvious manner. The third is the strong connection between music and theatricality. In several parts of this work, the body motions of the performer not only control the sensor device, but also carry theatrical meaning. About the performer: Chikashi Miyama received his BA (2002) and MA (2004) from the Sonology Department, Kunitachi College of Music, Tokyo, Japan and a Nachdiplom (2007) from the Elektronisches Studio, Musik-Akademie der Stadt Basel, Basel, Switzerland. He is currently attending the State University of New York at Buffalo for his Ph.D. He has studied under T. Rai, C. Lippe, E. Ona, and G. F. Haas. His works, especially his interactive multimedia works, have been performed at international festivals such as June in Buffalo 2001 (New York, USA), Mix ’02 (Aarhus, Denmark), Musica Viva ’03 (Coimbra, Portugal), the Realtime/non-realtime electronic music festival (Basel, Switzerland), and Next Generation ’05 (Karlsruhe, Germany), as well as in various cities in Japan. His papers about his works and the realtime visual processing software "DIPS" have also been accepted by ICMC and presented at several SIGMUS conferences. Since 2005, he has been performing as a laptop musician, employing his original sensor devices and involving himself in several media-art activities, such as Dorkbot, Shift-Festival, SPARK, and SGMK workshops. His compositions have received an honorable mention in the Residence Prize section of the 30th International Electroacoustic Music Competition Bourges and have been accepted by the International Computer Music Conference in 2004, 2005, 2006 and 2007. Several of his works have been published, including on the Computer Music Journal Vol. 28 DVD by MIT Press and the ICMC 2005 official CD.
@inproceedings{nime2008-music-Miyama2008, author = {Miyama, Chikashi}, title = {Keo Improvisation for sensor instrument Qgo}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Ernesto Romero and Esthel Vogrig. 2008. Vistas. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDF Program notes: VISTAS (2005) - Choreography with video, one musician playing live electronics and two dancers with metainstruments interacting with the music. Divided into three scenes, the work is conceptually based on the “self-other” cognitive phenomenon, inspired by Edgar Morin’s idea of the evolution of society through interdisciplinary interaction. The interdisciplinary character of the piece is carefully constructed using two metainstruments that link the formal elements in a structural way. These metainstruments are two wireless microphones plugged into two stethoscopes attached to the dancers' hands. The movements of the dancers make the microphones generate an amplitude signal that is transmitted to the computer and mapped onto different musical elements. Some live voice participations from the dancers add dramatic accents to the piece. Vistas is an integral piece in which the music supports the choreography and the choreography is, in turn, influenced by the music. The video supports the scene, creating an abstract space that changes and evolves according to the performance. The musical aesthetic has noise elements and voice-sample manipulation, playing with texture and density contrast in a very dynamic way. The language of the choreography comes from an exploration of the planes of three-dimensional space, first separately and later united. The language is also influenced by the need to make the best possible use of the metainstrument. About the performers: Los Platelmintos are a group of artists, living in Mexico City, who work under the premise of interdiscipline and experimentation. Dance, music and electronic media are fundamental elements in their work. Ernesto Romero: music composition and electronic media. Studied composition, mathematics and choir conducting in México. Chief of the Audio Department at the National Center for the Arts in México, where he researches and develops technology applied to the arts. Esthel Vogrig: choreographer and dancer. Studied contemporary dance and choreography in México, Vienna and the United States. Director of the Los Platelmintos company. Recipient of the "Grant for Investigation and Production of Art Works and New Media” from the National Council of the Arts and the Multimedia Center in Mexico; this grant was used to produce the piece Vistas. Karina Sánchez: dancer. Studied contemporary dance and choreography in Chile, Spain and México.
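A minimal sketch of the kind of amplitude-to-parameter mapping described above might look like the following; the block size, smoothing and parameter names are assumptions, not Romero's actual patch.

import math

class EnvelopeFollower:
    """Tracks the RMS amplitude of the stethoscope-microphone signal and
    smooths it so it can be mapped onto musical parameters."""
    def __init__(self, smoothing=0.9):
        self.smoothing = smoothing
        self.level = 0.0

    def process(self, block):
        rms = math.sqrt(sum(s * s for s in block) / len(block))
        self.level = self.smoothing * self.level + (1 - self.smoothing) * rms
        return self.level

def map_to_music(level):
    # Hypothetical mapping: stronger movement -> denser noise texture, louder voice samples.
    return {"noise_density": min(1.0, level * 4.0),
            "voice_sample_gain": level}

follower = EnvelopeFollower()
block = [0.3 * math.sin(2 * math.pi * 110 * n / 44100) for n in range(512)]
print(map_to_music(follower.process(block)))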
@inproceedings{nime2008-music-Romero2008, author = {Romero, Ernesto and Vogrig, Esthel}, title = {Vistas}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Alessandro Sartini. 2008. Aurora Polare. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDFProgram notes: Aurora Polare (Polar Dawn) is a short piece for cymbals, tam-tam, vibraphone, live electronics and the EyesWeb system. The piece was inspired by the smooth movement of waves, the patterns drawn by polar dawns and the cold of polar seas, which is why only metallophones are used. The first challenge was to let the percussionists process the sound they produce while playing their instruments, and to devise a simple new way of notating every movement. Therefore, beneath the traditional notation, two extra lines follow the music and specify the direction of movement: up-down and left-right/near-far. A line approaching the top or bottom of its axis indicates the path to track. All of these movements interact with EyesWeb and Max/MSP through two 30 fps accelerometer bracelets worn by the performers. Vertical movement controls the volume of the processed sound, while horizontal movement drives a different Max/MSP patch for each instrument: a tam-tam sample-speed controller (which makes the instrument play without being touched), a harmonizer that makes the cymbals sing like a theremin, but with their own processed sound, and the rate of a delay. In the control room, a MIDI controller and a computer manage additional live effects and parameters, such as granular synthesis, reverb and multi-slider filters. Thanks to Martino Sarolli for helping me with Max/MSP, and to Matteo Rabolini and Matteo Bonanni for playing my composition. About the performer: Alessandro Sartini: born in Genoa in 1982, he studied piano with Canzio Bucciarelli and is in his final year of composition studies at the Conservatory of Genoa with Riccardo Dapelo, who introduced him to live-electronics treatments. His first public performance was at the Auditorium Montale of the Carlo Felice Theatre in Genoa, during the 1995 concert commemorating the 50th anniversary of Béla Bartók's death. From that year on he established numerous collaborations with solo musicians, who appreciated his accompaniment, leading him to work in partnership with many professional soloists. In 1999 he joined the composition class of Luigi Giachino at the Conservatory of Genoa, who introduced him to film music; this interest led him to win third prize at the Lavagnino International Film Music Festival in Gavi in 2006 and first prize at the "Concorso Internazionale di Composizione di Alice Belcolle" in 2007. With Valentina Abrami, he founded the "Associazione Musica in Movimento", which operates at the International School in Genoa.
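A sketch, under stated assumptions, of the bracelet mapping described above: the vertical accelerometer axis controls the processed-sound volume, while the horizontal axis drives one instrument-specific parameter. The tam-tam and cymbal targets follow the note; assigning the delay rate to the vibraphone, and all names, ranges and scalings, are assumptions for illustration only.

```python
# Hypothetical per-instrument targets for the horizontal axis.
HORIZONTAL_TARGETS = {
    "tam-tam":    ("sample_speed", 0.25, 4.0),           # playback-rate multiplier
    "cymbals":    ("harmonizer_semitones", -12.0, 12.0),  # theremin-like pitch shift
    "vibraphone": ("delay_rate_hz", 0.5, 8.0),            # assumed owner of the delay
}

def map_bracelet(instrument: str, vertical: float, horizontal: float) -> dict:
    """Map one 30 fps accelerometer frame (axes normalised to 0..1) to controls."""
    name, lo, hi = HORIZONTAL_TARGETS[instrument]
    clamp = lambda x: max(0.0, min(1.0, x))
    return {
        "volume": clamp(vertical),                       # vertical axis -> level
        name: lo + (hi - lo) * clamp(horizontal),        # horizontal axis -> patch parameter
    }

# Example: a mid-height, slightly rightward gesture on the tam-tam bracelet.
print(map_bracelet("tam-tam", vertical=0.5, horizontal=0.6))
```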
@inproceedings{nime2008-music-Sartini2008, author = {Sartini, Alessandro}, title = {Aurora Polare}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }
-
Alison Rootberg and Margaret Schedel. 2008. NIME Performance - The Color of Waiting. Music Proceedings of the International Conference on New Interfaces for Musical Expression, Casa Paganini.
Download PDFProgram notes: Developed at STEIM in Amsterdam, The Color of Waiting uses animation, movement and video to portray themes of expectation. This collaboration (between animator Nick Fox-Gieg, choreographer/dancer Alison Rootberg, composer/programmer Margaret Schedel, and set designer Abra Brayman) deals with the anticipation of events by examining the way time unfolds. The performers shift between frustration and acceptance as they portray the emotions evoked when waiting for something or someone. The Color of Waiting is an experience and a mood, an abstraction depicting human interaction. About the performers: Alison Rootberg and Margaret Schedel founded The Kinesthetech Sense in 2006 with the intent of collaborating with visual artists, dancers, and musicians to create ferociously interactive experiences for audiences throughout the world. Rootberg, Vice President of Programming for the Dance Resource Center, focuses on combining dance with video, while Schedel, an assistant professor of music at Stony Brook University, combines audio with interactive technologies. Oskar Fischinger once said that "everything in the world has its own spirit which can be released by setting it in motion." Together Rootberg and Schedel create systems that are set in motion by artistic input, facilitating interplay between computers and humans. Kinesthetech Sense has presented work throughout the US, Canada, Denmark, Germany, Italy, and Mexico. For more info, please go to: www.ksense.org
@inproceedings{nime2008-music-Schedel2008, author = {Rootberg, Alison and Schedel, Margaret}, title = {NIME Performance - The Color of Waiting}, booktitle = {Music Proceedings of the International Conference on New Interfaces for Musical Expression}, editor = {Doati, Roberto}, year = {2008}, month = jun, publisher = {Casa Paganini}, address = {Genova, Italy} }