The use of physiological signals in Human-Computer Interaction (HCI) is becoming popular and widespread, mostly due to sensor miniaturization and advances in real-time processing. However, most studies of physiology-based interaction focus on single-user paradigms, and its use in collaborative scenarios is still in its infancy. In this paper we explore how the interactive sonification of brain and heart signals, and their representation through physical objects (physiopucks) on a tabletop interface, may enhance the motivational and control aspects of musical collaboration. We present a multimodal system based on an electrophysiology sensor system and the Reactable, a musical tabletop interface. Performance and motivation variables were assessed in an experiment involving a test "Physio" group (N=22) and a control "Placebo" group (N=10). Pairs of participants used two methods for sound creation: implicit interaction through physiological signals, and explicit interaction by means of gestural manipulation. The results showed that pairs in the Physio group reported less difficulty, higher confidence, and more symmetric control than the Placebo group, for which no real-time sonification was provided: its participants unknowingly used pre-recorded physiological signals. These results support the feasibility of introducing physiology-based interaction into multimodal interfaces for collaborative music generation.