This repository contains the scripts used in the study: “EEG-based Brain-Computer Interface Enables Real-time Robotic Hand Control at Individual Finger Level”.
Ding, Y., Udompanyawit, C., Zhang, Y., & He, B. (2025). EEG-based brain-computer interface enables real-time robotic hand control at individual finger level. Nature Communications, 16(1), 5401. https://doi.org/10.1038/s41467-025-61064-x
The scripts cover online processing and decoding as well as offline deep learning model training. EEGNet is used in this study; the TensorFlow implementation of EEGNet was obtained from the Army Research Laboratory (ARL) EEGModels project [1].
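As a rough orientation, the snippet below is a minimal sketch of how an EEGNet model from the ARL EEGModels project is typically built and trained offline with Keras. It assumes EEGModels.py from the arl-eegmodels repository is on the Python path; the channel count, sample length, class count, and training data here are illustrative placeholders, not the configuration used in this study, and the expected input shape may differ between versions of the ARL code.

```python
# Minimal sketch (not the study's actual training script): build and fit an
# EEGNet model using the ARL EEGModels implementation with placeholder data.
import numpy as np
from EEGModels import EEGNet  # EEGModels.py from the arl-eegmodels repository

n_classes, n_channels, n_samples = 3, 64, 250  # placeholder dimensions

# Construct the compact convolutional network with its default hyperparameters.
model = EEGNet(nb_classes=n_classes, Chans=n_channels, Samples=n_samples,
               dropoutRate=0.5, kernLength=64, F1=8, D=2, F2=16,
               dropoutType='Dropout')
model.compile(loss='categorical_crossentropy', optimizer='adam',
              metrics=['accuracy'])

# X: EEG trials shaped (n_trials, n_channels, n_samples, 1); y: one-hot labels.
# Random data stands in for real epoched EEG recordings.
X = np.random.randn(32, n_channels, n_samples, 1).astype('float32')
y = np.eye(n_classes)[np.random.randint(0, n_classes, 32)]
model.fit(X, y, batch_size=16, epochs=2, verbose=1)
```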
This work was supported by the National Institutes of Health via grants NS124564, NS131069, NS127849, and NS096761 to Dr. Bin He.
[1] Lawhern, V. J., Solon, A. J., Waytowich, N. R., Gordon, S. M., Hung, C. P., & Lance, B. J. (2018). EEGNet: a compact convolutional neural network for EEG-based brain-computer interfaces. Journal of Neural Engineering, 15, 056013.
Army Research Laboratory (ARL) EEGModels project repository: https://github.com/vlawhern/arl-eegmodels