Enabling Adaptive Interaction in the Metaverse Using a Hybrid EEG-Based Brain–Computer Interface
This paper presents a hybrid control model for brain–computer interfaces (BCIs) in Metaverse environments, with the goal of extending such interfaces beyond traditional motor imagery (MI) or P300-based designs. The hybrid model uses P300 responses for interaction with virtual devices and MI for navigation and movement in the Metaverse; each EEG modality is dedicated to a particular control state, and the system switches states sequentially based on the interaction context. In the experiment, imagined movements of the left and right hands drive rotational navigation, while discrete device actions are triggered by P300 responses under a five-stimulus oddball paradigm. The performance evaluation shows that combining MI and P300 in a single BCI achieves accuracy comparable to single-mode BCIs while offering greater interaction capability and adaptability, demonstrating the effectiveness of hybrid control for dynamic and flexible Metaverse interaction.
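To make the control flow concrete, the sketch below illustrates one plausible realization of the sequential, context-driven state switching described above. The class and decoder names are hypothetical (the abstract does not specify an implementation); the MI and P300 decoders are treated as black boxes that return a classified hand and an attended oddball stimulus, respectively.

```python
from enum import Enum, auto

class ControlState(Enum):
    NAVIGATION = auto()          # MI modality active: left/right hand imagery
    DEVICE_INTERACTION = auto()  # P300 modality active: five-stimulus oddball

class HybridBCIController:
    """Illustrative sequential state switcher: one EEG modality per control state.

    Hypothetical sketch; decoder callables are assumptions, not the paper's API.
    """

    def __init__(self, mi_decoder, p300_decoder):
        self.state = ControlState.NAVIGATION
        self.mi_decoder = mi_decoder      # callable: EEG epoch -> 'left' | 'right'
        self.p300_decoder = p300_decoder  # callable: EEG epoch -> stimulus index 0..4

    def on_context_event(self, near_device: bool) -> None:
        # Context-driven, sequential state change: entering a virtual device's
        # interaction zone activates P300 control; leaving it restores MI navigation.
        self.state = (ControlState.DEVICE_INTERACTION if near_device
                      else ControlState.NAVIGATION)

    def step(self, eeg_epoch):
        if self.state is ControlState.NAVIGATION:
            hand = self.mi_decoder(eeg_epoch)      # imagined left/right hand movement
            return ('rotate', -1 if hand == 'left' else +1)
        else:
            target = self.p300_decoder(eeg_epoch)  # attended one of five stimuli
            return ('device_action', target)
```

In this arrangement only one decoder is active at any time, mirroring the design in which each EEG modality is dedicated to a single control state rather than both running in parallel.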