This paper describes a package of modular tools, developed for use with virtual reality peripherals, that enables music composition, performance and listening in ‘real-time’ across networks within a spectralist paradigm. The central tool is SpectraScore, a Max/MSP abstraction that analyses audio signals and ranks the resultant partials according to their harmonic pitch class profiles. These data trigger the generation of objects in a virtual world based on the ‘topography’ of the source sound; network clients experience this world via Google Cardboard headsets, using their movements to trigger audio in various microtonal tunings and, incidentally, to generate scores. These scores are transmitted to performers, who improvise music from the notation using Leap Motion theremins, also in VR space. Finally, the performance is broadcast as a web audio stream, which the composer-audience can hear in the initial virtual world. Neither the ‘real-time composers’ nor the performers require prior knowledge of complex computer systems; they interact either through head-position tracking or with an Oculus Rift DK2 and a Leap Motion camera.
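The core analysis step described above can be illustrated in outline. The sketch below is not the SpectraScore Max/MSP implementation; it is a minimal, hypothetical Python analogue, assuming partials are already available as (frequency, amplitude) pairs, showing one common way to fold partials into a 12-bin harmonic pitch class profile and rank them by strength.

```python
import math

A4 = 440.0  # reference tuning for pitch-class folding (an assumption)

def pitch_class(freq):
    """Map a frequency in Hz to a 0-11 pitch class via its nearest MIDI note."""
    midi = 69 + 12 * math.log2(freq / A4)
    return int(round(midi)) % 12

def rank_partials(partials):
    """Rank (freq, amp) partials by amplitude, strongest first,
    attaching each partial's pitch class."""
    ranked = sorted(partials, key=lambda p: p[1], reverse=True)
    return [(f, a, pitch_class(f)) for f, a in ranked]

def hpcp(partials):
    """Accumulate partial amplitudes into a 12-bin harmonic pitch
    class profile, normalised to the strongest bin."""
    bins = [0.0] * 12
    for f, a in partials:
        bins[pitch_class(f)] += a
    peak = max(bins) or 1.0
    return [b / peak for b in bins]
```

In such a scheme, the normalised profile (or the ordering returned by `rank_partials`) would be the data used to drive object generation in the virtual world; the actual mapping used by SpectraScore is not specified here.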