Decoding spatial attention with EEG and virtual acoustic space

Dong, Yue; Raif, Kaan E.; Determan, Sarah C.; Gai, Yan
Physiological Reports

Decoding spatial attention from brain signals has wide applications in brain–computer interfaces (BCIs). Previous BCI systems mostly relied on visual patterns or auditory stimulation (e.g., loudspeakers) to evoke synchronous brain signals, and such stimulation protocols have difficulty covering a large range of spatial locations. The present study explored the possibility of using virtual acoustic space and a visual-auditory matching paradigm to overcome this issue; the technique has the flexibility to generate sound stimulation from virtually any spatial location. Brain signals of eight human subjects were obtained with 32-channel electroencephalography (EEG). Two amplitude-modulated noises or speech sentences carrying distinct spatial information were presented concurrently. Each sound source was tagged with a unique modulation phase, so that the phase of the recorded EEG signals indicated which sound was being attended to. Each phase-tagged sound was further filtered with head-related transfer functions to create the sense of virtual space. Subjects were required to attend to the sound source that best matched the location of a visual target. For all subjects, the phase of a single sound could be accurately recovered over the majority of electrodes from EEG responses of 90 s or less. Fewer electrodes provided significant decoding of auditory attention, and these may require longer EEG responses. The reliability and efficiency of single-electrode decoding varied across subjects. Overall, the virtual acoustic space protocol has the potential to be used in practical BCI systems.
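The phase-tagging scheme summarized above can be sketched in a few lines: each concurrent stream is amplitude-modulated at a common rate but with its own starting phase, and the tag is later recovered from an envelope-following signal (here the clean modulation envelope stands in for the EEG response) via the Fourier coefficient at the modulation frequency. This is a minimal illustration, not the study's pipeline: all numerical values (sample rate, modulation rate, duration) are hypothetical placeholders, and the HRTF spatialization step is noted only as a comment.

```python
import numpy as np

# Hypothetical parameters (placeholders, not the study's actual values)
FS = 1000            # sample rate of the envelope signal (Hz)
MOD_FREQ = 4.0       # shared amplitude-modulation rate (Hz)
DUR = 2.0            # duration (s); an integer number of modulation cycles

def phase_tagged_stream(tag_phase, fs=FS, dur=DUR, mod_freq=MOD_FREQ, seed=0):
    """White-noise carrier amplitude-modulated by a sinusoidal envelope
    whose starting phase serves as the stream's tag.  In the experiment,
    each stream would additionally be filtered with a head-related
    transfer function (HRTF) to place it in virtual acoustic space."""
    rng = np.random.default_rng(seed)
    t = np.arange(int(fs * dur)) / fs
    envelope = 0.5 * (1.0 + np.sin(2.0 * np.pi * mod_freq * t + tag_phase))
    return envelope * rng.standard_normal(t.size), envelope

def recover_tag_phase(signal, fs=FS, mod_freq=MOD_FREQ):
    """Estimate the modulation phase of an envelope-following signal
    (e.g., an EEG response) from its Fourier coefficient at mod_freq."""
    t = np.arange(signal.size) / fs
    coeff = np.sum(signal * np.exp(-2j * np.pi * mod_freq * t))
    # envelope ~ sin(wt + p) = cos(wt + p - pi/2); the coefficient's
    # angle estimates the cosine phase, so add pi/2 back to get the tag
    return (np.angle(coeff) + np.pi / 2.0) % (2.0 * np.pi)

if __name__ == "__main__":
    # Two concurrent streams tagged with opposite modulation phases
    for tag in (0.0, np.pi):
        _, env = phase_tagged_stream(tag)
        print(f"tag={tag:.3f}  recovered={recover_tag_phase(env):.3f}")
```

In the actual paradigm the recovered phase would come from the attended listener's EEG rather than the stimulus envelope, and the stream whose tag phase is closest (in the circular sense) to the estimate would be declared the attended source.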
