Work in progress. This project is based on the Helsinki VideoMEG project and currently works with video files recorded using that project's software.
Screenshot of the browser extension showing video from our validation measurement (in which a plushie named Herba kindly volunteered to be the test subject), synchronized with MNE-Python's sample MEG data.
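At its core, synchronizing video with MEG data means mapping video-frame timestamps onto sample indices of the raw recording. As a minimal illustrative sketch (the function name and the assumption that both streams share a common clock are hypothetical, not this package's actual API):

```python
def frames_to_samples(frame_times, raw_start, sfreq):
    """Map video frame timestamps to the nearest MEG sample indices.

    frame_times -- frame timestamps in seconds, on the same clock as the raw data
    raw_start   -- timestamp of the first MEG sample, in seconds
    sfreq       -- MEG sampling frequency in Hz
    """
    return [round((t - raw_start) * sfreq) for t in frame_times]

# Example: the first three frames of a ~30 fps video that starts
# 2 s into a 1 kHz recording land on samples 2000, 2033, 2067.
print(frames_to_samples([2.0, 2.0333, 2.0667], raw_start=0.0, sfreq=1000.0))
```

With such a mapping, scrubbing the video browser to a given frame can jump the raw-data view to the corresponding sample, and vice versa.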
In addition to MNE-Python, this project currently requires the OpenCV package for reading standard video files.
Here's how to set up an environment with all the necessary dependencies:
- Create a new conda environment (named `mne-videomeg`) with MNE-Python and OpenCV:

  ```
  conda create --channel=conda-forge --strict-channel-priority --name=mne-videomeg mne opencv
  ```

- Clone this repository and navigate to the project root.

- Activate the environment:

  ```
  conda activate mne-videomeg
  ```

- Install the package in editable mode:

  ```
  pip install -e .
  ```
Now you should be able to test the video browser by running the example scripts in the `scripts/` directory. For example:

```
python scripts/run_sync_demo_with_sample_data.py
```
The script `run_sync_demo_with_sample_data.py` uses a sample dataset from MNE-Python and a fake video file. The other examples require you to have your own raw data and video files in the correct format.
Tests are located in the `tests/` directory and run with `pytest`. You can install it into your environment by running:

```
pip install -e .[dev]
```
Then you can run all the tests with:

```
pytest
```
You can also selectively run tests in a specific file/class/method. See pytest documentation for details.