This paper presents a prototype tool for sound selection driven by users' gestures. Sound selection by gesture is a particular case of "query by content" in multimedia databases. Gesture-to-sound matching is based on computing the similarity between the temporal evolution of gesture and sound parameters. The tool implements three algorithms for matching a gesture query to a sound target. The system leads to several applications in sound design, virtual instrument design, and interactive installations.
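The abstract does not name the three matching algorithms, but a standard way to compare the temporal evolution of a gesture curve against a sound descriptor curve is dynamic time warping (DTW), which tolerates differences in timing between query and target. The sketch below is illustrative only; `dtw_distance`, `rank_sounds`, and the descriptor curves are hypothetical, not the paper's actual method.

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D parameter curves."""
    n, m = len(a), len(b)
    inf = float("inf")
    # cost[i][j] = minimal cumulative cost of aligning a[:i] with b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])  # local mismatch between samples
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def rank_sounds(query, sound_curves):
    """Rank sounds by similarity of their descriptor curve to the gesture query."""
    return sorted(sound_curves, key=lambda name: dtw_distance(query, sound_curves[name]))
```

A query-by-gesture front end could then return the top-ranked sound names for playback or further refinement.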