We are very excited to meet other EEG hackers at BrainHack, Paris, February 24-26. Here are the outlines of our 5 minute scientific and artistic pitch. Further down you see us connect pedals, several synthesizers and sequencers, all controlled by a dancer.

EEGsynth: scientific and development pitch

Stephen Whitmarsh & Robert Oostenveld (presenters), Per Huttner & Jean-Louis Huhta

We are very happy to present to you the EEGsynth – a hardware module and codebase for real-time sonification of EEG, ECG and EMG for artistic and educational performances. The project is organized specifically for distributed open-source development by neuroscientists and programmers, and is developed together with artists to function ‘out of the box’ in most electronic music situations:

  • The EEGsynth is a Eurorack module that can be readily patched with analogue synthesizer modules using both analogue CV/Gate signals and MIDI.
  • User-definable parameters can be controlled remotely using an HTML-based interface.
  • Analysis and output parameters can be controlled in real-time (live) using MIDI controllers.
  • While primarily developed for OpenBCI, it can be readily paired with commercially available EEG devices such as the Emotiv and Neurosky.

In our presentation we will explain how we optimized the hardware and code for distributed parallel development of all aspects of its functionality, inspired by our experience in developing open-source real-time analysis (the FieldTrip toolbox) and by the modularity of live analogue sound synthesis:

  • The code is developed using Python.
  • Data acquisition and availability to the code-base are implemented using the FieldTrip realtime buffer, which provides a flexible and transparent TCP/IP protocol.
  • The code is separated into independent modules according to functionality, such as I/O, MIDI control, frequency analysis, heartbeat detection & visualization.
  • Modules run in parallel, communicating in approximate real-time using a Redis database.
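To make the module pattern above concrete, here is a minimal, hypothetical sketch of how one module could publish a control value and another could read it and scale it for MIDI output. A plain Python dict stands in for the Redis database so the example runs without a server; the real modules would exchange the same kind of keys through redis-py, and the key name and scaling are illustrative assumptions, not the actual EEGsynth API.

```python
# Sketch of the EEGsynth module pattern: independent modules share
# control values through a key-value store. A dict stands in for
# Redis here (hypothetical key names, for illustration only).

store = {}  # stand-in for the Redis database

def analysis_module(alpha_power):
    # e.g. a frequency-analysis module publishes normalized alpha power (0..1)
    store["eeg.channel1.alpha"] = alpha_power

def output_module():
    # e.g. a MIDI output module reads the latest value, clamps it to 0..1,
    # and scales it to a 7-bit MIDI control-change value (0..127)
    value = float(store.get("eeg.channel1.alpha", 0.0))
    return int(round(min(max(value, 0.0), 1.0) * 127))

analysis_module(0.42)
print(output_module())  # 53
```

Because the modules only agree on key names, each one can be developed, started and stopped independently, which is what makes the distributed parallel development described above practical.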

EEGsynth: artistic pitch

Per Huttner (presenter), Jean-Louis Huhta, Robert Oostenveld & Stephen Whitmarsh

The EEGsynth has been developed by neuroscientists, artists and musicians. It takes naturally occurring electrical signals from brain and muscles and transforms these into sounds. The project was set up with two goals in mind:

  • To create a tool that can be used by neuroscientists, artists and musicians to enable new research, performances, collaborations and dialogues.
  • To create a platform for meetings between different disciplines to deepen the everyday practice of neuroscientists, artists, musicians and pedagogues through formal and informal dialogues.

Both aspects of the project have functioned well: we have set up workshops, exchanges and performances, while the work has offered us personal and professional development. We are now entering the next phase, in which we want to introduce the EEGsynth into new contexts and groups. We will briefly present how we have used the EEGsynth to date to explore art, music and dance. We hope that at BrainHack we can share topics of mutual inspiration, knowledge and experience in a continuous exchange between art and science.