In this paper, we describe an adaptive approach to gesture mapping for musical instrument design. A neural network is chosen for this task, and all required interfaces and abstractions are developed and demonstrated in the Pure Data environment. We focus on the representation and implementation of neural networks in a real-time musical environment. The adaptive mapping is evaluated in both static and dynamic situations using a network of sensors sampled in real time at 200 Hz. Finally, we offer some remarks on the network design and directions for future work.
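To make the idea concrete, the following is a minimal illustrative sketch (not the paper's actual Pure Data implementation) of a neural-network gesture mapper: a small feedforward network, trained by gradient descent, that maps normalized sensor readings to synthesis parameters. The layer sizes, activation choice, and training loop are assumptions made for illustration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GestureMapper:
    """Hypothetical feedforward mapper: sensor frame -> synthesis parameters."""

    def __init__(self, n_sensors=4, n_hidden=8, n_params=3, seed=0):
        rng = np.random.default_rng(seed)
        # Small random initial weights for a single hidden layer
        self.W1 = rng.normal(0.0, 0.5, (n_sensors, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_params))
        self.b2 = np.zeros(n_params)

    def forward(self, x):
        # x: one frame of normalized sensor values in [0, 1]
        self.h = sigmoid(x @ self.W1 + self.b1)
        # Sigmoid output keeps synthesis parameters in [0, 1]
        return sigmoid(self.h @ self.W2 + self.b2)

    def train_step(self, x, target, lr=0.5):
        # One step of gradient descent on squared error for a
        # (gesture frame, desired parameters) training pair
        y = self.forward(x)
        d_out = (y - target) * y * (1.0 - y)
        d_hid = (d_out @ self.W2.T) * self.h * (1.0 - self.h)
        self.W2 -= lr * np.outer(self.h, d_out)
        self.b2 -= lr * d_out
        self.W1 -= lr * np.outer(x, d_hid)
        self.b1 -= lr * d_hid
        return float(np.mean((y - target) ** 2))
```

In a real-time setting such as the one described above, `forward` would be called once per incoming sensor frame (here, 200 times per second), while `train_step` would run during an offline or interactive training phase on user-provided gesture examples.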