Official implementation of 'MOCHA: Real-Time Motion Characterization via Context Matching'.
Project Page: https://dk-jang.github.io/MOCHA_SIGASIA2023/
The code provides:
- Data preparation
- Pretrained model and Demo
- Training code for MOCHA
First, clone the repo. Then, we recommend creating a clean conda environment, installing all dependencies, and finally activating the environment, as follows:
git clone https://github.com/DK-Jang/MOCHA_private.git
cd MOCHA_private
conda env create -f environment.yml
conda activate MOCHA
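Once the environment is active, a quick sanity check such as the one below can confirm that the core dependency imports correctly (a minimal sketch, assuming the environment ships PyTorch; a GPU is optional for this check):

```python
# Optional sanity check for the freshly created environment.
# Assumes PyTorch is installed by environment.yml; adjust if your setup differs.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
```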
- To run the demo, please download both the BVH datasets and the pre-trained parameters.
[Recommended] To download the datasets and the pretrained networks, run the following commands:
bash download.sh datasets
bash download.sh pretrained-network
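After downloading, you can check that the archives were unpacked where the scripts expect them (the directory names below are assumptions for illustration; consult download.sh for the actual layout):

```python
# Hypothetical layout check; see download.sh for the real target directories.
from pathlib import Path

for folder in [Path("./datasets"), Path("./pretrained")]:
    print(f"{folder}: {'found' if folder.exists() else 'missing'}")
```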
If you want to generate the training datasets from scratch, run the following command:
python ./preprocess/generate_database_bin.py # generate mirrored bvh files
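For intuition only, mirroring a clip usually amounts to negating the lateral (X) component of the root translation and swapping left/right joint channels while flipping the affected rotation axes. The sketch below illustrates this on plain NumPy arrays; it is not the repository's preprocessing code, and the 'Left'/'Right' joint-name prefixes are an assumption:

```python
import numpy as np

def mirror_motion(root_pos, joint_rot, joint_names):
    """Illustrative YZ-plane mirroring of a motion clip (not the repo's implementation).

    root_pos:    (T, 3) root translations
    joint_rot:   (T, J, 3) per-joint Euler angles in degrees
    joint_names: J names using hypothetical 'Left'/'Right' prefixes
    """
    # Negate the lateral translation of the root.
    mirrored_pos = root_pos.copy()
    mirrored_pos[:, 0] *= -1.0

    # Swap each left joint with its right counterpart (and vice versa).
    swap = [joint_names.index(n.replace("Left", "Right")) if n.startswith("Left")
            else joint_names.index(n.replace("Right", "Left")) if n.startswith("Right")
            else i
            for i, n in enumerate(joint_names)]
    mirrored_rot = joint_rot[:, swap, :].copy()

    # Reflecting across the YZ plane negates rotations about the Y and Z axes.
    mirrored_rot[..., 1:] *= -1.0
    return mirrored_pos, mirrored_rot
```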
After downloading the pre-trained parameters, you can run the demo.
We use the post-processed results (Ours_*.bvh) for all demos.
To generate motion characterization results, run the following command:
python test_fullframework.py
Generated motions (BVH format) will be placed under ./results.
- Src_*.bvh: ground-truth source motion
- Ours_*.bvh: characterized output motion
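For programmatic inspection, you can pair each ground-truth source clip with its characterized counterpart by filename, as in this small sketch (the shared-suffix naming is an assumption based on the patterns above):

```python
from pathlib import Path

results_dir = Path("./results")

# Pair Src_*.bvh with the Ours_*.bvh that shares the same suffix (assumed naming scheme).
for src in sorted(results_dir.glob("Src_*.bvh")):
    ours = results_dir / src.name.replace("Src_", "Ours_", 1)
    if ours.exists():
        print(f"{src.name}  ->  {ours.name}")
    else:
        print(f"no characterized output found for {src.name}")
```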
This repository contains pieces of code from the following repositories:
- Learned Motion Matching
- Motion Puzzle: Arbitrary Motion Style Transfer by Body Part