Nicolas Collins Hardware Hacking (Tutorial)
A hands-on tutorial (what I call a "workshop") in "handmade electronic music", tailored for the NIME audience. Since 2004 I've presented dozens of workshops in hardware hacking all over the world. Assuming no technical background whatsoever, these workshops guide the participants through a series of sound-producing electronic construction projects, from making simple contact microphones, through "bending" toys, to making oscillators and other circuits from scratch. For NIME 2010 I present a one-day tutorial that emphasizes performance interfaces for direct control of electronic sounds. The projects would include:
Please note that participation in this tutorial requires a cash materials fee of $10 AUD on the day.
Nicholas J. Bryan, Jorge Herrera, Jieun Oh, Luke Dahl and Ge Wang iPhone Instrument Design
This tutorial presents an overview of iPhone software development for music performance and instrument applications. Background discussion and historical context will jump-start the tutorial in a workshop-like presentation, followed by a general outline of design approaches to mobile music. Specific topics include real-time audio I/O, accelerometer, compass, GPS, multi-touch, 3D graphics, OSC networking, and simple GUI interaction. A particular emphasis will be placed on instrument design for performance using the recently released Stanford Mobile Phone Orchestra (MoPhO) iPhone API. The MoPhO API provides a unified method of accessing the iPhone's numerous onboard sensors, allowing for a consolidated and straightforward introduction to mobile music development.
The tutorial will target computer musicians, programmers, and composers interested in mobile music development for the iPhone. While no prior iPhone development or Objective-C experience is necessary, beginner-to-intermediate skills in C or C++ (or a similar language) are required. Participants are required to have an Apple computer running OS X Leopard or Snow Leopard and an up-to-date version of Xcode. Participants are encouraged to bring their own iPhone or iPod Touch; however, a limited number of iPhone 3GS and iPod Touch devices will be available on loan for the duration of the tutorial.
Sidney Fels and Michael Lyons NIME Primer: An Overview of the First Ten Years
Advances in digital audio technologies have led to a situation where computers play a role in most music production and performance. Digital technologies offer unprecedented opportunities for the creation and manipulation of sound; however, the flexibility of these new technologies implies a confusing array of choices for musical composers and performers. Some artists have faced this challenge by using computers directly to create music, leading to an explosion of new musical forms. However, most would agree that the computer is not a musical instrument in the same sense as traditional instruments, and it is natural to ask 'how to play the computer?' using interface technology appropriate for human brains and bodies. A decade ago we organized the first workshop on New Interfaces for Musical Expression (NIME) with the aim of answering this question by exploring connections with the better-established field of human-computer interaction. This course summarizes some of the major lessons that have been learned at NIME. We begin with an overview of the theory and practice of new musical interface design, asking what makes a good musical interface and whether there are any useful design principles or guidelines available. We also discuss topics such as the mapping from human action to musical output, and control intimacy. Practical information about the tools for creating musical interfaces will be given, including an overview of sensors and microcontrollers, audio synthesis techniques, and communication protocols such as Open Sound Control (and MIDI). The remainder of the course will consist of several specific case studies representative of the broad themes of the NIME conference, including augmented and sensor-based instruments, mobile and networked music, and NIME pedagogy.
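The course mentions Open Sound Control as a communication protocol between interface and synthesis engine. As a minimal sketch of what an OSC message looks like on the wire, the Python below hand-encodes a message mapping a normalised sensor value to a pitch; the /pitch address and the two-octave mapping are invented for illustration, not anything specified by the course.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC 1.0 requires."""
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    packet = osc_pad(address.encode("ascii"))          # address pattern
    packet += osc_pad(("," + "f" * len(args)).encode("ascii"))  # type tags
    for value in args:
        packet += struct.pack(">f", value)             # big-endian float32
    return packet

# Map a sensor reading in 0..1 to a pitch in Hz: 220 Hz at 0, 880 Hz at 1.
sensor = 0.5
pitch = 220.0 * 2.0 ** (sensor * 2.0)
packet = osc_message("/pitch", pitch)
```

The resulting bytes could be sent over UDP to any OSC-aware synthesis environment; in practice one would use an existing OSC library rather than encoding by hand.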
Andrew Sorensen Impromptu Live Coding Tutorial
A half-day workshop exploring the Impromptu audiovisual development environment. Audio and visual aspects of the environment will be covered, and attendees will be introduced to some of the features that make Impromptu suitable for live coding performances. The tutorial is open to attendees with some background in programming, and will be of broad interest to anyone interested in audiovisual programming on the OS X platform.
Dick Rijken, Takuro Lippit, Kristina Andersen Instrument Design, the next generation: musical concepts
The topic of this tutorial is the design of instruments for live performance, with a special emphasis on the different levels that need to be designed: body behaviour, instrument hardware, sensor technology, data mapping, parameter control, conceptual structure, and musical context.
This tutorial focuses on the core subject of NIME and will attempt to take this discussion to a higher level by emphasizing an integrated view of musicality and the conceptual aspects of instrument design. Art comes from concepts and musical structure, not from sensors and parameters. This will be the focus of the workshop: conceptual structures like mental models, and the musical manipulation of complex structures (such as in playing with samples), will be presented as the next step in instrument design, where even the instrument itself can become a work of art.
The intended audience ranges from mechanical engineers to musicians to hardware and interaction designers. Instrument design is a highly multidisciplinary activity, and the workshop will focus on how concepts and methods from these disciplines can work together to create new instruments. No single person will have knowledge of all the disciplines; participants should be familiar with at least one of the disciplines involved, so that they can relate to the design process and define their role in it.
Nicolas d'Alessandro and Sidney Fels Hands on Speech and Singing Synthesis
The goals of voice synthesis have evolved significantly over the last 50 years, tackling more and more complex challenges. The first challenge was to produce intelligible speech: intelligible means that the message (words, sentences) can be understood by others. For this purpose, voice articulation is defined through rules and trajectories over the parameters of production models. Later came the challenge of producing voices with a more natural timbre. Here natural means close enough to a real human voice to be mistaken for one, and thus comfortable for the listener. This challenge has been addressed over the last 15 years by using prerecorded voice material (databases) and reorganizing its contents (by concatenating subdivided units) to match the requested sentence.
This tutorial will be useful for anybody interested in producing synthetic voice in a real-time application, a digital instrument, or an interactive artistic project. We require some basic understanding of how voice is produced, no more than the basic source/filter representation, as more complex aspects of voice synthesis (such as spectral processing, articulatory models, and unit management) will be explained and illustrated with examples. Participants are also required to be able to understand and build Max/MSP or Pd patches, since the voice synthesis techniques studied will be described through simple patch examples.
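For readers unfamiliar with the source/filter representation the tutorial assumes, here is a minimal sketch of that model in plain Python rather than a Max/MSP or Pd patch: a periodic impulse train stands in for the glottal source, and cascaded two-pole resonators stand in for vocal-tract formants. The specific formant frequencies are illustrative textbook approximations, not values from the tutorial.

```python
import math

SR = 16000  # sample rate in Hz

def impulse_train(f0, dur, sr=SR):
    """Glottal 'source': a periodic impulse train at fundamental f0 (Hz)."""
    n = int(dur * sr)
    period = max(1, int(sr / f0))
    return [1.0 if i % period == 0 else 0.0 for i in range(n)]

def resonator(x, freq, bw, sr=SR):
    """Vocal-tract 'filter': one two-pole resonance at freq Hz, bandwidth bw Hz."""
    r = math.exp(-math.pi * bw / sr)          # pole radius from bandwidth
    c1 = 2.0 * r * math.cos(2.0 * math.pi * freq / sr)
    c2 = -r * r
    gain = 1.0 - r                            # crude amplitude normalisation
    y1 = y2 = 0.0
    out = []
    for s in x:
        y = gain * s + c1 * y1 + c2 * y2
        out.append(y)
        y2, y1 = y1, y
    return out

# A crude /a/-like vowel: a 110 Hz source through two formant resonators.
source = impulse_train(110.0, 0.5)
vowel = resonator(resonator(source, 700.0, 130.0), 1220.0, 70.0)
```

Writing the samples to a file and listening reveals a buzzy but recognisably vowel-like tone; the same structure maps directly onto a few objects in a Pd or Max/MSP patch.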
Ross Bencina AudioMulch Tutorial @ NIME 2010
Ross Bencina, creator of AudioMulch (www.audiomulch.com), will present a tutorial on the AudioMulch interactive music software. The tutorial will provide an introduction to and overview of the software and cover a range of specific techniques for using it, with an emphasis on live music performance.
The tutorial is aimed at practicing musicians and composers with a basic knowledge of digital sound processing/synthesis techniques and software. AudioMulch offers an easy-to-use graphical user interface, and no programming knowledge is required. The software runs on Windows PCs and Macintosh computers; it is not available for Linux.
Axel Mulder I-CubeX
In the workshop, participants will learn how to apply sensor technology (with special attention to capturing human movements) to develop their own concepts of electronic media control (with special attention to controlling sound and music), without needing to become hardware engineers. Starting with a discussion and analysis of what the participants would like to achieve, various relevant sensor technologies will be presented together with examples of I-CubeX sensor products and/or similar electronic art projects. Where applicable, demo applications of I-CubeX products will be presented in some detail to suggest to participants how to proceed with their vision of a controller, interactive installation, etc. Each participant will then be set up with the I-CubeX equipment that best matches their needs, and we'll proceed with a crash course in getting all setups operational.
Participants will need to bring their own computer and software such as Max/MSP, Live, Max4Live, or Flash. Participants need to be familiar with the software of their choice already, but no knowledge of sensors or sensor interfacing is assumed, and it will not be necessary to hack hardware or solder electronic components. Participants will be offered the I-CubeX equipment for purchase at a significant discount.
This is an introductory course intended to reach not only the NIME community but also multidisciplinary performers such as installation artists, dancers, and others.