Code for our ICRA23 paper "Continuous Prediction of Leg Kinematics during Walking using Inertial Sensors, Smart Glasses, and Embedded Computing"


KIFNet: Continuous Prediction of Leg Kinematics during Walking using Inertial Sensors, Smart Glasses, and Embedded Computing

This repository contains the training source code used for our paper accepted to ICRA 2023.

Model architecture

To run our code you will need:

  1. The Egocentric Vision & Kinematics dataset. Download it and update the data config files to match your local paths.
  2. The ml-mobileone package and model weights (we used S0, unfused), which you need to place in the ./ml-mobileone/weights/ folder.
  3. An existing Weights & Biases project. We used W&B for experiment tracking and configuration, so it is required to run our pipeline unless you modify the code.

Contributors:
