BRAAHMS: A Novel Adaptive Musical Interface Based on Users’ Cognitive State

Beste Filiz Yuksel, Daniel Afergan, Evan Peck, Garth Griffin, Lane Harrison, Nick Chen, Remco Chang, Robert Jacob

Proceedings of the International Conference on New Interfaces for Musical Expression

We present a novel brain-computer interface (BCI) integrated with a musical instrument that adapts implicitly (with no extra effort from the user) to users' changing cognitive state during musical improvisation. Most previous musical BCI systems either map brainwaves directly to audio signals or use explicit brain signals to control some aspect of the music. Such systems do not take advantage of higher-level, semantically meaningful brain data, which could be used in adaptive systems without detracting from the user's attention. We present a new type of real-time BCI that assists users in musical improvisation by implicitly adapting to their measured cognitive workload. Our system advances the state of the art in this area in three ways: 1) We demonstrate that cognitive workload can be classified in real time while users play the piano, using functional near-infrared spectroscopy (fNIRS). 2) We build a real-time, implicit system that uses this brain signal to adapt musically to what users are playing. 3) We demonstrate that users prefer this novel musical instrument over other conditions and report feeling more creative.