This paper describes the theoretical underpinnings, design, and development of a hyper-instrumental performance system driven by gestural data obtained from an electric guitar. The system combines a multichannel audio feed from the guitar (parsed for pitch, spectral content, and note inter-onset timing to provide abstractions of sounded performance gestures) with motion tracking of the performer’s larger-scale bodily movements using a Microsoft Xbox Kinect sensor. These gestural materials provide the basis for structured relational mappings, informed by the embodied image-schema structures of Lakoff and Johnson. These theoretical perspectives are refined via the larger-scale ecological-embodied structural relationships in electroacoustic music outlined in Smalley’s theory of spectromorphology, and extended with an active-agential response structure: Reynolds’ boids flocking algorithm is used to control the spatialization of outputs and other textural processes. The paper aims to advance a broadly applicable ‘performance gesture ecology’, providing a shared spatial-relational mapping (a ‘basic gestural space’) that allows for creative, yet coherent, mappings from performance gestures to the control of textural and spatial structures.
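To make the flocking component concrete, the sketch below shows a minimal 2D implementation of Reynolds’ three classic steering rules (separation, alignment, cohesion), with boid positions mapped to azimuth angles as one plausible spatialization control. All names, parameters, and the azimuth mapping are illustrative assumptions, not the system described in the paper.

```python
# Minimal 2D boids sketch (after Reynolds' separation/alignment/cohesion
# rules), with positions mapped to panning azimuths. Illustrative only:
# parameters and the spatialization mapping are assumptions, not the
# paper's implementation.
import math
import random

class Boid:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx = random.uniform(-1.0, 1.0)
        self.vy = random.uniform(-1.0, 1.0)

def step(boids, sep=0.05, ali=0.05, coh=0.01, radius=2.0, max_speed=1.5):
    """Advance the flock one step (asynchronous update, fine for a sketch)."""
    for b in boids:
        neighbours = [o for o in boids
                      if o is not b and math.hypot(o.x - b.x, o.y - b.y) < radius]
        if neighbours:
            n = len(neighbours)
            # Cohesion: steer toward the local centre of mass.
            cx = sum(o.x for o in neighbours) / n
            cy = sum(o.y for o in neighbours) / n
            b.vx += (cx - b.x) * coh
            b.vy += (cy - b.y) * coh
            # Alignment: match the neighbours' average velocity.
            b.vx += (sum(o.vx for o in neighbours) / n - b.vx) * ali
            b.vy += (sum(o.vy for o in neighbours) / n - b.vy) * ali
            # Separation: steer away from boids that are too close.
            for o in neighbours:
                d = math.hypot(o.x - b.x, o.y - b.y)
                if 0 < d < radius * 0.5:
                    b.vx += (b.x - o.x) / d * sep
                    b.vy += (b.y - o.y) / d * sep
        # Clamp speed, then integrate position.
        speed = math.hypot(b.vx, b.vy)
        if speed > max_speed:
            b.vx *= max_speed / speed
            b.vy *= max_speed / speed
        b.x += b.vx
        b.y += b.vy

def azimuths(boids):
    """Map each boid's position to an azimuth in degrees (-180..180),
    e.g. for panning one voice or grain stream per boid."""
    return [math.degrees(math.atan2(b.y, b.x)) for b in boids]
```

In a system like the one described, each boid might drive one spatialized voice or texture stream, so the flock’s emergent motion (rather than a direct one-to-one mapping) shapes the spatial behaviour of the output.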