In this paper, we introduce and analyze four gesture-controlled musical instruments. We briefly describe the test platform designed to allow rapid experimentation with new interfaces and control mappings. We recount our design experiences and discuss the effects of system characteristics such as latency, resolution, and the lack of tactile feedback. The instruments use virtual reality hardware and computer vision for user input, and provide visual feedback through three-dimensional stereo displays as well as simple desktop displays. The instrument sounds are synthesized in real time using physical sound modeling.