In this paper, we describe the development of multi-platform tools for audiovisual and kinetic installations. These tools connect three development environments (Python, SuperCollider and Processing) in order to drive kinetic art installations and to combine them with real-time digital synthesis of sound and image. By linking the three platforms via the OSC protocol, we enable real-time control of an analog physical medium (a device that draws figures in sand), of sound synthesis and of image synthesis. We developed algorithms for drawing figures and for synthesizing images and sound on all three platforms, and experimented with various mechanisms for coordinating synthesis and rendering across the different media. Several problems were addressed: How can timing be coordinated between the platforms? Which configuration should be used: client-server (and if so, which platform acts as client and which as server?), equal partners, or a mixed configuration? A library was developed in SuperCollider that packages algorithms into modules, generates GUIs automatically from specifications, and saves configurations of modules into session files as scripts in SuperCollider code. Applying this library as a framework both for driving graphic synthesis in Processing and for receiving control data from it resulted in an environment for experimentation that is also being used successfully in teaching interactive audiovisual media.
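As a minimal illustration of the kind of inter-platform link described above, the sketch below hand-encodes a single OSC message using only Python's standard library and sends it over UDP. The address pattern `/freq` and its argument are purely illustrative (they are not taken from the paper); port 57120 is sclang's default receive port. A real installation would normally use an OSC library such as python-osc rather than encoding messages by hand.

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per the OSC 1.0 spec."""
    data += b"\x00"
    return data + b"\x00" * ((4 - len(data) % 4) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode an OSC message: address pattern, type-tag string, then arguments."""
    typetags = ","
    payload = b""
    for arg in args:
        if isinstance(arg, float):
            typetags += "f"
            payload += struct.pack(">f", arg)   # 32-bit big-endian float
        elif isinstance(arg, int):
            typetags += "i"
            payload += struct.pack(">i", arg)   # 32-bit big-endian int
        elif isinstance(arg, str):
            typetags += "s"
            payload += osc_pad(arg.encode())
        else:
            raise TypeError(f"unsupported OSC argument: {arg!r}")
    return osc_pad(address.encode()) + osc_pad(typetags.encode()) + payload

if __name__ == "__main__":
    # Hypothetical control message: send a frequency value to a SuperCollider
    # patch listening on sclang's default UDP port on the same machine.
    msg = osc_message("/freq", 440.0)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(msg, ("127.0.0.1", 57120))
    sock.close()
```

The compactness of this wire format is what makes it practical to exchange control data among Python, SuperCollider and Processing in real time: each message is a few dozen bytes, so timing-critical parameters can be streamed at audio-control rates.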