TY - JOUR
T1 - Talking with hands and feet
T2 - Selective somatosensory attention and fMRI enable robust and convenient brain-based communication
AU - Van de Wauw, Cynthia
AU - Riecke, Lars
AU - Goebel, Rainer
AU - Kaas, Amanda
AU - Sorger, Bettina
N1 - Copyright © 2023. Published by Elsevier Inc.
PY - 2023/5/23
Y1 - 2023/5/23
AB - In brain-based communication, voluntarily modulated brain signals (instead of motor output) are utilized to interact with the outside world. The possibility to circumvent the motor system constitutes an important alternative option for severely paralyzed patients. Most communication brain-computer interface (BCI) paradigms require intact visual capabilities and impose a high cognitive load, but some patients do not meet these requirements. In these situations, a better-suited, less cognitively demanding information-encoding approach may exploit auditorily cued selective somatosensory attention to vibrotactile stimulation. Here, we propose, validate, and optimize a novel communication-BCI paradigm using differential fMRI activation patterns evoked by selective somatosensory attention to tactile stimulation of the right hand or left foot. Using cytoarchitectonic probability maps and multi-voxel pattern analysis (MVPA), we show that the locus of selective somatosensory attention can be decoded from internally generated fMRI-signal patterns in (especially primary) somatosensory cortex with high accuracy and reliability, with the highest classification accuracy (85.93%) achieved when using Brodmann area 2 (SI-BA2) at a probability level of 0.2. Based on this outcome, we developed and validated a novel somatosensory attention-based yes/no communication procedure and demonstrated its high effectiveness even when using only a limited amount of (MVPA) training data. For the BCI user, the paradigm is straightforward, eye-independent, and requires only limited cognitive functioning. In addition, it is BCI-operator friendly given its objective and expertise-independent procedure. For these reasons, our novel communication paradigm has high potential for clinical applications.
U2 - 10.1016/j.neuroimage.2023.120172
DO - 10.1016/j.neuroimage.2023.120172
M3 - Article
C2 - 37230207
SN - 1053-8119
VL - 276
SP - 120172
JO - NeuroImage
JF - NeuroImage
ER -