This paper documents the first developmental phase of an interface that enables the performance of live music using gestures and body movements. The work presented focuses on the first step of this project: the composition and performance of live music using hand gestures captured with a single data glove. The paper provides a background to the field, the aim of the project, and a technical description of the work completed so far. This includes the development of a robust posture vocabulary, an artificial neural network-based posture identification process, and a state-based system that maps identified postures onto a set of performance processes. The paper closes with qualitative usage observations and a projection of future plans.