SASLO: A Scene-Aware Spatial Layout Optimization System for AR-SSVEP

arXiv:2604.06190v1 Announce Type: new
Abstract: Steady-state visual evoked potential (SSVEP) is widely used in brain-computer interfaces (BCIs) due to its reliability. With the integration of augmented reality (AR), AR-SSVEP enables more intuitive interaction by embedding visual stimuli into real-world environments. However, unlike conventional computer screen-based SSVEP (CS-SSVEP) systems with stable visual conditions, AR-SSVEP performance is influenced by real-world scene factors, such as luminance and color, which degrade stimulus perception and weaken SSVEP elicitation. Nevertheless, existing studies primarily focus on offline analyses of SSVEP-related factors in indoor settings, while online adaptive optimization for outdoor AR-SSVEP remains limited.
Therefore, a scene-aware spatial layout optimization (SASLO) system for AR-SSVEP is proposed, which jointly considers scene luminance and inter-stimulus distance (ISD) for adaptive stimulus layout optimization. Scene luminance is estimated using an RGB-CIE based method, and the extracted context is incorporated into a linear contextual bandit (LCB) model to recommend optimized spatial layouts. Two pilot single-factor experiments are conducted to characterize the effects of luminance and ISD on SSVEP performance and to construct reliable rewards for model training. An outdoor online experiment with ten subjects further validates the proposed joint optimization method, achieving an average accuracy of 0.89 and an information transfer rate of 35.74 bits/min with a 3 s input window, and consistently outperforming two baseline methods. Overall, the proposed SASLO system is shown to improve the robustness of AR-SSVEP in real-world outdoor environments.
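The abstract describes a pipeline of luminance estimation followed by contextual-bandit layout selection. Since the paper's implementation is not published, the sketch below is illustrative only: it assumes BT.709/sRGB luminance weights for the "RGB-CIE based method", a LinUCB-style linear contextual bandit for the LCB model, and made-up names (`scene_luminance`, `LinUCB`, `alpha`) and context encoding that are not from the paper.

```python
import numpy as np

def scene_luminance(rgb_pixels):
    """Estimate mean relative luminance (CIE Y) from linear-RGB pixels.

    Uses sRGB/BT.709 luminance weights as a stand-in for the paper's
    unspecified RGB-CIE conversion.
    """
    rgb = np.asarray(rgb_pixels, dtype=float)
    return float(np.mean(rgb @ np.array([0.2126, 0.7152, 0.0722])))

class LinUCB:
    """Minimal LinUCB contextual bandit over candidate stimulus layouts.

    Each arm is one candidate spatial layout (e.g. an ISD setting);
    the context vector encodes the estimated scene luminance.
    """
    def __init__(self, n_arms, dim, alpha=1.0):
        self.alpha = alpha                               # exploration weight
        self.A = [np.eye(dim) for _ in range(n_arms)]    # per-arm Gram matrix
        self.b = [np.zeros(dim) for _ in range(n_arms)]  # per-arm reward sums

    def select(self, x):
        """Return the arm with the highest upper confidence bound."""
        x = np.asarray(x, dtype=float)
        scores = []
        for A, b in zip(self.A, self.b):
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b                            # ridge estimate
            scores.append(theta @ x + self.alpha * np.sqrt(x @ A_inv @ x))
        return int(np.argmax(scores))

    def update(self, arm, x, reward):
        """Fold an observed reward (e.g. decoding accuracy) into one arm."""
        x = np.asarray(x, dtype=float)
        self.A[arm] += np.outer(x, x)
        self.b[arm] += reward * x

# Illustrative loop: estimate luminance, pick a layout, reward it.
lum = scene_luminance([[0.8, 0.8, 0.8], [0.6, 0.7, 0.5]])
bandit = LinUCB(n_arms=3, dim=2)
context = [1.0, lum]          # bias term + luminance feature (assumed encoding)
layout = bandit.select(context)
bandit.update(layout, context, reward=0.9)
```

In a real system the reward would come from online SSVEP decoding performance under the chosen layout, as the pilot experiments in the paper are used to construct; here it is a placeholder constant.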
