This paper presents studies conducted with musicians that aim to understand modes of human-robot interaction situated between automation and human augmentation. The robotic guitar system used in the study consists of various sound-generating mechanisms, driven either by software or directly by a musician. The control mechanism allows the musician to exercise varying degrees of agency over the overall musical direction. We present interviews and discussions from open-ended experiments conducted with music students and musicians. The outcomes of this research include new modes of playing the guitar enabled by the robotic capabilities, and an understanding of how automation can be integrated into instrument-playing processes. The results offer insights into how a human-machine hybrid system can increase the efficacy of training or exploration without compromising human engagement with a task.