
Experiments in neurodata-driven immersive visual art using the X.on EEG headset

by Balázs Knakker, Grastyán Translational Research Centre, University of Pécs
& Andrea Sztojánovits, Intermedia Department, Hungarian University of Fine Arts

Many visual artists draw inspiration from psychology and neuroscience, and quite a few use existing or newly acquired data in their artistic process. Interdisciplinary collectives that bring together artists, scientists, and technologists offer an ideal environment for those who strive to create artwork that is authentic from both an artistic and a scientific perspective. One such interdisciplinary effort is the ‘12 Hertz’ [1] initiative, launched in early 2024 in Budapest, Hungary within the Lighthouse collective of light and new media artists, curated by artists Andrea Sztojánovits and Gábor Kitzinger and curator Viola Lukács. The initiator of the 12 Hertz exhibition, audiovisual artist Andrea Sztojánovits, senior lecturer at the Hungarian University of Fine Arts, was commissioned by the Light Art Museum Budapest to curate a one-week masterclass in the summer of 2024. Building on her research on the connection between art and neuroscience, she developed the concept of the ‘IMMERSIVE SPACE | IMMERSIVE MIND’ masterclass, centered on the use of electroencephalographic (EEG) data as the basis of a visual installation set in the museum’s unique, zeppelin-shaped immersive projection environment. She in turn brought EEG researcher Balázs Knakker from the University of Pécs on board to work on the neuroscientific and neurotechnological part of the course.

The masterclass was held between 8 and 14 July 2024 in Budapest with 13 participants, kicking off with an introductory overview of neuro-art, data art and immersive space theories on a conference day open to the public, with speakers invited from several parts of the world. Besides the NEURO lab section led by Andrea and Balázs, the curriculum involved immersive space engineering and concepts in the SPACE lab with Viktor Vicsek (co-founder of Limelight) and Takeshi Yamada (from teamLab), and data visualization in the DATA lab with Gábor Kitzinger and Machiel Veltkamp from the University of Utrecht. After an intense week, the participants had familiarized themselves with the basics of recording and processing EEG, with some forays into the neuroscientific background and interpretation, and had created an immersive artwork based on their own EEG data recorded during the masterclass.


A screenshot from participant Réka Harsányi’s data visualization project in TouchDesigner, based on EEG data recorded with the X.on.

For the NEURO lab section of the course we – Andrea Sztojánovits and Balázs Knakker – needed a lightweight, easy-to-use technological stack that still had the features required for neuroscientific research. Many tools used in neuro-art applications satisfy the first, usability requirement, but fall short of the equally important scientific requirement of transparency. In particular, many EEG headsets provide only pre-processed and derivative signals, so the whole processing chain is hard to adjust or see through. Also, many EEG headsets record ongoing EEG but are not suited to recording event-related potentials (ERPs), which underpin the bulk of EEG research in neuroscience; since the masterclass focused on visual immersive spaces, we wanted to demonstrate the recording and processing of visual event-related potentials (VEPs). While we could have arranged to visit an EEG lab in one of Budapest’s several neuroscience research facilities, we also wanted an adaptable toolset that could become part of various artistic scenarios. In summary, we looked for an EEG device and processing stack with good usability that is transparent, suitable for ERP recording, and adaptable to artistic applications.

Balázs has been using Brain Products hardware for many years in his academic research and has been in contact with Hungarian vendor Tamás Stolmár, who, besides providing valuable advice for our project, drew our attention to the newly released X.on EEG headset. After testing it alongside a few other devices, we decided to use the X.on during the course. We were particularly convinced by its usability and good product design, both in hardware and software: on the course we could literally start recording EEG after just a few minutes of preparation. The ease of use partly stems from the choice of electrode positions, and we accepted the compromise of having no electrodes at the O1 and O2 positions of the 10/20 system, which are closest to the visual cortex – we could still record and demonstrate resting visual cortical alpha activity from the available P3 and P4 electrodes, probably the most important phenomenon to show to someone who is getting acquainted with EEG. We were particularly satisfied with how the electrodes can be adjusted to fit each participant’s unique head geometry even though the device is a one-size-fits-all headset. From the wearer’s perspective, the headset was very comfortable, almost unnoticeable after a few minutes, which could be very beneficial in an immersive artistic setting.
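For readers who want to try something similar, the following is a minimal sketch of how resting alpha power (8–12 Hz) could be quantified from the P3 and P4 channels in Python with a Welch power spectrum; the sampling rate and the synthetic stand-in data are our assumptions for illustration, not X.on specifics.

```python
# Minimal sketch: quantifying posterior alpha (8-12 Hz) from two posterior
# channels (e.g. P3/P4) with a Welch power spectrum.
# The sampling rate and the random stand-in data are assumptions.
import numpy as np
from scipy.signal import welch

fs = 250.0                                # assumed sampling rate in Hz
eeg = np.random.randn(2, int(fs) * 180)   # stand-in for 3 min of P3/P4 data

# Power spectral density per channel, 2-second Welch segments
freqs, psd = welch(eeg, fs=fs, nperseg=int(fs) * 2)

# Average PSD within the alpha band
alpha_band = (freqs >= 8) & (freqs <= 12)
alpha_power = psd[:, alpha_band].mean(axis=1)
print('Mean alpha power (P3, P4):', alpha_power)
```

In an eyes-closed recording, the alpha power computed this way would be visibly higher than in the eyes-open condition, which is exactly the demonstration described above.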


The X.on set up for recording on the head of course participant and teacher Gianluca del Gobbo. The wire connected to the AUX input at the back of the headset carries the photo signal from the stimulus display, picked up by a Photo Sensor and routed through the StimTrak (on the table on the left side of the picture; photo by Andrea Sztojánovits).


Instructors demonstrating the use of the X.on app to the participants (photo by Eszter Gyöngyösi).

So, it was clear that we could use the device to demonstrate and record resting EEG, but while preparing the course we were initially unsure about our equally important requirement of recording VEPs. We shared our concerns with Brain Products, and their support and development teams told us they had been working on a related feature and supplied us with a ‘bridge’ cable that connects the auxiliary input of the X.on to Brain Products’ existing solution for event synchronization, the StimTrak. They also tested this bridge cable solution themselves and sent us the results. Andrea created simple visual pattern stimuli and added a small square to the bottom right corner of the composition, where we attached the Photo Sensor, whose signal was sent to the AUX input through the StimTrak – just like in research labs. [2]
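To make the photo-sensor approach concrete, here is an illustrative sketch of how stimulus onsets could be recovered from a photo signal recorded on an auxiliary channel by simple threshold crossing; the threshold, sampling rate and synthetic square-wave signal are assumptions, and real recordings may need additional debouncing.

```python
# Illustrative sketch: detecting stimulus onsets from a photo-sensor signal
# recorded on an auxiliary channel, via upward threshold crossings.
import numpy as np

def detect_onsets(aux, fs, threshold):
    """Return onset times (s) where the photo signal crosses the threshold upward."""
    above = aux > threshold
    rising = np.flatnonzero(~above[:-1] & above[1:]) + 1  # indices of rising edges
    return rising / fs

# Example with a synthetic photo signal: brief flashes, roughly once per second
fs = 250.0
t = np.arange(int(fs * 5)) / fs
aux = (np.sin(2 * np.pi * 1.0 * t) > 0.95).astype(float)
print(detect_onsets(aux, fs, threshold=0.5))
```

The onset times obtained this way can then be used to segment the EEG into stimulus-locked epochs for ERP averaging.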

Thanks to the ease of use and the newly available features of the X.on, we could make altogether five three-minute recordings with each of the 13 participants: eyes-open and eyes-closed resting EEG, a simple, short visual oddball paradigm to demonstrate the P300 ERP component, and two recordings with repetitive visual stimulation to demonstrate evoked and steady-state VEPs. The participants came in groups and took turns being EEG recording subjects, while the others practiced the recording set-up and observed the live EEG signal. The X.on recording software ran smoothly even on Balázs’s four-year-old mid-tier Android phone, as well as on every other Android phone we tried. During recording we used a display filter, so the 50 Hz noise that is characteristic of high-impedance recordings in unshielded rooms did not interfere with viewing the data live, while the recorded data remained raw. We could choose the exact online filtering settings just as we would in a lab-based EEG setup. We eventually did not record directly on the phone but streamed the EEG to a laptop via the LSL [3] (Lab Streaming Layer) protocol, where Brain Products’ open-source LSL Viewer and LabRecorder received the data for viewing and saving, respectively.
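As a minimal illustration of the streaming setup, the following Python sketch receives an EEG stream over LSL using pylsl, much as LSL Viewer and LabRecorder do; the stream type and timeout are assumptions about the local network setup.

```python
# Minimal sketch: receiving an EEG stream over LSL with pylsl.
# Assumes the headset is streaming on the local network as a stream
# of type 'EEG'; names and timeouts may differ per setup.
from pylsl import StreamInlet, resolve_byprop

streams = resolve_byprop('type', 'EEG', timeout=10.0)
if not streams:
    raise RuntimeError('No LSL stream of type EEG found')

inlet = StreamInlet(streams[0])
info = inlet.info()
print(f'Connected: {info.name()}, '
      f'{info.channel_count()} channels @ {info.nominal_srate()} Hz')

while True:
    # One multichannel EEG sample per iteration, with its LSL timestamp
    sample, timestamp = inlet.pull_sample()
    print(timestamp, sample)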

After everyone’s EEG had been recorded, we went on to preprocess the data. We used the precompiled version of EEGLAB (it does not require a MATLAB license, so it is completely free). We applied an offline notch filter to remove the line noise, and the resulting signal quality was comparable to that of research EEG systems. We were also pleasantly surprised by how effective ICA (independent component analysis) was at removing eye blinks from just 7 channels of data, and it ran very quickly thanks to the short recordings. So, while it is considered an advanced method, ICA turned out to be the most useful preprocessing tool even in this specific setting of artists who were new to EEG.
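We did this preprocessing in EEGLAB, but for readers who prefer Python, a rough equivalent of the notch-filter-plus-ICA pipeline can be sketched in MNE-Python; the file name is a placeholder, and the exact steps (such as the 1 Hz high-pass before ICA) reflect common practice rather than the precise EEGLAB settings we used.

```python
# Rough Python equivalent of the EEGLAB steps described above, using MNE-Python.
# The file name is a placeholder; the course data was actually processed in EEGLAB.
import mne
from mne.preprocessing import ICA

raw = mne.io.read_raw_fif('xon_recording_raw.fif', preload=True)

# Offline notch filter at 50 Hz to remove line noise
raw.notch_filter(freqs=50.0)

# High-pass at 1 Hz: common practice to stabilize ICA decomposition
raw.filter(l_freq=1.0, h_freq=None)

# ICA on the 7 EEG channels; even with few channels it isolates blinks well
ica = ICA(n_components=7, random_state=0)
ica.fit(raw)

# Inspect component time courses, mark blink components, then remove them
ica.plot_sources(raw)
# ica.exclude = [0]          # e.g. after visual inspection
raw_clean = ica.apply(raw.copy())
```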


Demonstration of EEG signal inspection and artifact rejection in EEGLAB (photo by Machiel Veltkamp)

As described above, we first recorded, then preprocessed and used the data – we deliberately resisted the temptation to use the X.on to drive live visuals during the course, since that would have shifted the focus away from its purpose: getting to know and carefully processing the signals. During the preparation of the course we nevertheless tested the scenario of generating live visuals with the X.on. We found an easy-to-use software package that supports both OSC [4] and LSL: NeuroPype. We could easily build an LSL-to-OSC bridge pipeline in NeuroPype, through which we streamed EEG live from the X.on device to the generative visual software TouchDesigner (a sketch of such a bridge is shown below). It was just a proof of concept, but after some testing and validation a pipeline like this would most probably work in production.
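For illustration, a bridge like the one we built in NeuroPype can also be sketched directly in Python with pylsl and python-osc; the OSC address ‘/eeg’ and the port number are arbitrary choices that simply have to match the OSC In operator on the TouchDesigner side.

```python
# Hedged sketch of an LSL-to-OSC bridge, similar in spirit to the NeuroPype
# pipeline described above, using pylsl and python-osc.
from pylsl import StreamInlet, resolve_byprop
from pythonosc.udp_client import SimpleUDPClient

# Connect to the first EEG-type LSL stream on the network
streams = resolve_byprop('type', 'EEG', timeout=10.0)
inlet = StreamInlet(streams[0])

# TouchDesigner listening with an OSC In CHOP/DAT on this host and port
client = SimpleUDPClient('127.0.0.1', 7000)

while True:
    sample, _ = inlet.pull_sample()
    client.send_message('/eeg', sample)  # one OSC message per EEG sample
```

In practice one would likely downsample or band-filter the stream before sending it on, since generative visuals rarely need the full EEG sampling rate.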

In summary, the X.on proved to be an invaluable tool in the NEURO lab of the masterclass, meeting our requirements of ease of use, transparency, interoperability and adaptability. Besides artists who want to use EEG as an artistic tool in a way that is aligned with the philosophy we tried to convey during the course, we would also recommend it to anyone who wants to give their students a jump-start introduction to research-grade EEG systems.

Footnotes

[1] “The ‘12 Hertz’ process exhibition was inspired by Brion Gysin’s ‘Dream Machine’ concept (1958), which is a closed eye installation and stimulates the brain with flickering light.” https://lighthouse.art/12-hertz/

[2] Supplementary markers could also have been sent over Wi-Fi via LSL (see footnote 3); we tested this but did not use it during the course. Note that LSL markers can also achieve fair timing accuracy.

[3] LSL (Lab Streaming Layer) is an open-source networked middleware ecosystem to stream, receive, synchronize, and record neural, physiological, and behavioral data streams acquired from diverse sensor hardware. See https://labstreaminglayer.org/ for more.

[4] Open Sound Control, an industry-standard messaging protocol in the creative field
