Commit 9e50624

Merge pull request #44 from InfoMusCP/feature/readme
update README.md and add CITATION.cff
2 parents 7bd8e5e + bc62f9b commit 9e50624

2 files changed: +108 −130 lines

CITATION.cff

Lines changed: 16 additions & 0 deletions
@@ -0,0 +1,16 @@
+# This CITATION.cff file was generated with cffinit.
+# Visit https://bit.ly/cffinit to generate yours today!
+
+cff-version: 1.2.0
+title: >-
+  PyEyesWeb: A Python Toolkit for Expressive Movement
+  Analysis
+message: >-
+  If you use this software, please cite it using the
+  metadata from this file.
+type: software
+authors:
+  - name: InfoMus Lab - Casa Paganini
+repository-code: 'https://github.com/InfoMusCP/PyEyesWeb'
+url: 'https://infomuscp.github.io/PyEyesWeb/'
+license: MIT
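The file above is plain YAML, so any YAML reader can consume the citation metadata. A minimal sketch with PyYAML (the file path and printed fields are illustrative assumptions, not part of this commit):

```python
# Minimal sketch: load CITATION.cff with PyYAML and read a few fields.
# Assumes the file shown above is saved as "CITATION.cff" in the working directory.
import yaml

with open("CITATION.cff", encoding="utf-8") as f:
    meta = yaml.safe_load(f)

# Folded scalars (">-") are joined into single-line strings by the parser.
print(meta["title"])                # PyEyesWeb: A Python Toolkit for Expressive Movement Analysis
print(meta["authors"][0]["name"])   # InfoMus Lab - Casa Paganini
print(meta["license"])              # MIT
```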

README.md

Lines changed: 92 additions & 130 deletions
@@ -1,154 +1,116 @@
-# PyEyesWeb
-## Quantitative movement analysis toolkit
+# PyEyesWeb
+## Expressive movement analysis toolkit
+*A modern, modular, and accessible Python library for expressive movement analysis — bridging research, health, and the arts*
 
-PyEyesWeb is a research toolkit for extracting quantitative features from human movement data. It offers computational methods to analyze different qualities of movement. The repository is still in development, with the goal of creating an easy-to-use library for extracting movement qualities from raw motion data.
+[![PyPI version](https://img.shields.io/pypi/v/pyeyesweb.svg)](https://pypi.org/project/pyeyesweb/)
+[![Docs](https://img.shields.io/badge/docs-latest-blue.svg)](https://infomuscp.github.io/PyEyesWeb/)
+[![License](https://img.shields.io/github/license/InfoMusCP/PyEyesWeb.svg)](LICENSE)
 
-## Overview
-
-Movement analysis involves extracting meaningful features from high-dimensional spatiotemporal data. PyEyesWeb provides computational methods to quantify movement characteristics at multiple levels, from basic kinematics to complex coordination patterns.
-
-At the moment, this toolkit addresses four key movement qualities:
-- **Smoothness**: Measuring smoothness and control with SPARC and jerk-based metrics
-- **Bilateral symmetry**: Analyzing left-right coordination through canonical correlation analysis and phase synchronization
-- **Contraction-Expansion Index**: Measuring contraction and expansion patterns in movement trajectories
-- **Synchronization Index**: Assessing synchronization between multiple participants or body segments
+`PyEyesWeb` is a research toolkit for extracting quantitative features from human movement data.
+It builds on the **Expressive Gesture Analysis** library of [EyesWeb](https://casapaganini.unige.it/eyesweb_bp), bringing expressive movement analysis into Python as a core aim of the project.
+The library provides computational methods to analyze different qualities of movement, supporting applications in **research, health, and the arts**.
+It is designed to facilitate adoption in **artificial intelligence and machine learning pipelines**, while also enabling seamless integration with creative and interactive platforms such as **TouchDesigner, Unity, and Max/MSP**.
 
 ## Installation
 
 ```bash
-git clone https://github.com/InfoMusCP/PyEyesWeb.git
-cd PyEyesWeb
-pip install -e .
-
-# For development
-pip install -e .[dev]
+pip install pyeyesweb
 ```
 
-## Quick start
-
+## Usage
+A minimal example of extracting movement features with `PyEyesWeb`:
 ```python
-from pyeyesweb import Smoothness, BilateralSymmetryAnalyzer
-from pyeyesweb.data_models.sliding_window import SlidingWindow
-import numpy as np
+from pyeyesweb.data_models import SlidingWindow
+from pyeyesweb.mid_level import Smoothness
 
 # Movement smoothness analysis
 smoothness = Smoothness(rate_hz=50.0)
-window = SlidingWindow(window_size=100)
-window.add_frame(motion_data)
+window = SlidingWindow(max_length=100, n_columns=1)
+window.append([motion_data])
+# here `motion_data` is a float representing a single sample of motion data
+# (e.g., the x coordinate of the left hand at time t).
 
-metrics = smoothness(window)
-print(f"SPARC: {metrics['sparc']}, Jerk RMS: {metrics['jerk_rms']}")
-
-# Bilateral symmetry analysis
-symmetry_analyzer = BilateralSymmetryAnalyzer()
-symmetry_index = symmetry_analyzer.calculate_symmetry_index(
-    left_trajectory, right_trajectory
-)
+sparc, jerk = smoothness(window)
 ```
+!!! tip
+    For more advanced and complete use cases, see the [Documentation](https://infomuscp.github.io/PyEyesWeb/)
+    and the [examples](examples) folder.
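For a slightly fuller picture of the same API, the sketch below streams a synthetic one-dimensional trajectory through the sliding window frame by frame; the sine signal and the single metric computation at the end are illustrative assumptions, not documented library behaviour.

```python
# Illustrative sketch: stream a synthetic 1-D trajectory through the sliding
# window and compute smoothness metrics once all samples have been appended.
import numpy as np

from pyeyesweb.data_models import SlidingWindow
from pyeyesweb.mid_level import Smoothness

rate_hz = 50.0
smoothness = Smoothness(rate_hz=rate_hz)
window = SlidingWindow(max_length=100, n_columns=1)

t = np.arange(0, 2, 1.0 / rate_hz)        # 2 s of samples at 50 Hz
trajectory = np.sin(2 * np.pi * 0.5 * t)  # smooth oscillatory motion

for sample in trajectory:
    window.append([float(sample)])        # one float per frame, as above

sparc, jerk = smoothness(window)
print("SPARC:", sparc, "| RMS jerk:", jerk)
```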
 
-## Core Modules
-
-### Smoothness Analysis
-Movement smoothness quantification using a few metrics such as:
-- SPARC (Spectral Arc Length)
-- Root mean square jerk
-
-We also use Savitzky-Golay filtering to smoothen signals.
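As a rough illustration of the jerk-based side of these metrics, here is a generic NumPy/SciPy sketch (not PyEyesWeb's implementation; the signal and filter parameters are arbitrary):

```python
# Generic sketch: RMS jerk of a noisy 1-D position trace, with
# Savitzky-Golay smoothing applied before differentiation.
import numpy as np
from scipy.signal import savgol_filter

rate_hz = 50.0
dt = 1.0 / rate_hz
t = np.arange(0, 4, dt)
position = np.sin(2 * np.pi * 0.5 * t) + 0.01 * np.random.randn(t.size)

smoothed = savgol_filter(position, window_length=11, polyorder=3)

# Jerk is the third time derivative of position.
velocity = np.gradient(smoothed, dt)
acceleration = np.gradient(velocity, dt)
jerk = np.gradient(acceleration, dt)

jerk_rms = np.sqrt(np.mean(jerk ** 2))
print("RMS jerk:", jerk_rms)
```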
-
-### Bilateral Symmetry Analysis
-Research-validated methods for bilateral coordination assessment:
-- Canonical Correlation Analysis (CCA)
-- Phase synchronization using Hilbert transform
-- Coefficient of variation-based symmetry indices
-
-Based on methodology from:
-- Bilateral motion data fusion research (Pubmed: 29993408)
-- Wheelchair propulsion symmetry analysis (MDPI Symmetry, 2022)
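For intuition about the phase-synchronization ingredient listed above, a generic sketch (again, not the library's code) that estimates a phase-locking value between left and right signals via the Hilbert transform:

```python
# Generic sketch: phase-locking value (PLV) between left and right signals,
# using the analytic signal from the Hilbert transform.
import numpy as np
from scipy.signal import hilbert

rate_hz = 100.0
t = np.arange(0, 10, 1.0 / rate_hz)
left = np.sin(2 * np.pi * 1.0 * t)
right = np.sin(2 * np.pi * 1.0 * t + 0.3) + 0.05 * np.random.randn(t.size)

phase_left = np.angle(hilbert(left))
phase_right = np.angle(hilbert(right))

# PLV close to 1 means the two sides keep a stable phase relationship.
plv = np.abs(np.mean(np.exp(1j * (phase_left - phase_right))))
print("Phase-locking value:", plv)
```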
-
-### Contraction/Expansion Analysis
-Geometric analysis of spatial movement patterns:
-- Numba-optimized area and volume calculations
-- Real-time contraction/expansion rate computation
-- 2D and 3D spatial pattern detection
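One simple way to picture the area-based view of contraction and expansion is via convex-hull area; the SciPy sketch below is illustrative only, with made-up marker coordinates, and is not the Numba-optimized implementation referred to above.

```python
# Generic sketch: contraction/expansion as the change in convex-hull area
# spanned by 2-D marker positions between two frames.
import numpy as np
from scipy.spatial import ConvexHull

markers_t0 = np.array([[0.0, 0.0], [0.4, 0.1], [0.2, 0.5], [-0.3, 0.3]])
markers_t1 = markers_t0 * 1.2   # markers move outward -> expansion

# For 2-D point sets, ConvexHull.volume is the enclosed area.
area_t0 = ConvexHull(markers_t0).volume
area_t1 = ConvexHull(markers_t1).volume

rate = (area_t1 - area_t0) / area_t0
print("Relative area change (negative = contraction):", rate)
```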
-
-### Synchronization Analysis
-Multi-participant coordination measurement:
-- Cross-correlation analysis
-- Temporal alignment algorithms
-- Phase coherence metrics
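A generic cross-correlation sketch in the same spirit (illustrative only; the two synthetic signals stand in for real multi-participant data):

```python
# Generic sketch: estimate the lag between two participants' signals
# from the peak of their cross-correlation.
import numpy as np

rate_hz = 100.0
t = np.arange(0, 5, 1.0 / rate_hz)
a = np.sin(2 * np.pi * 1.0 * t)
b = np.roll(a, 20) + 0.05 * np.random.randn(t.size)   # b lags a by ~20 samples

a0, b0 = a - a.mean(), b - b.mean()
xcorr = np.correlate(b0, a0, mode="full")
lag_samples = np.argmax(xcorr) - (len(a0) - 1)         # positive: b lags a
print("Estimated lag (s):", lag_samples / rate_hz)
```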
-
-### TSV Reader
-Efficient motion capture data processing:
-- TSV file parsing
-- Data integrity check
-- Memory-efficient streaming
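For the streaming idea, a minimal pandas sketch (the file name, column layout, and chunk size are assumptions made for illustration):

```python
# Generic sketch: read a large motion-capture TSV file in chunks instead of
# loading it all into memory at once.
import pandas as pd

total_frames = 0
for chunk in pd.read_csv("recording.tsv", sep="\t", chunksize=10_000):
    # Basic integrity check: drop frames with missing coordinates.
    chunk = chunk.dropna()
    total_frames += len(chunk)

print("Frames processed:", total_frames)
```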
-
-## Proposed Analysis Framework
-
-The toolkit is designed to grow over time, with analyses organized into three levels: low, mid, and high. Each level builds on the one below it, moving from raw measurements to processed characteristics and finally to complex patterns of coordination.
-
-### Low-Level Features
-These include fundamental measurements captured directly from motion data:
-- Position, velocity, and acceleration trajectories
-- Joint angles and angular velocities
-- Derivative-based measures such as jerk
-
-### Mid-Level Features
-These are derived characteristics that describe how movements are performed:
-- Smoothness indices that reflect control and fluidity
-- Symmetry coefficients comparing left and right sides
-- Spatial metrics capturing contraction, expansion, and trajectory patterns
-- Phase relationships between different body parts
-
-### High-Level Features
-These include complex patterns that emerge from coordination and strategy:
-- Synchronization between multiple participants
-- Quality of bilateral coordination
-- Classification of overall movement strategies
-- Recognition of recurring temporal patterns
-
-## Documentation Structure
-
-| Resource | Content |
-|----------|---------|
-| [Installation Guide](docs/installation.md) | Dependencies and setup procedures |
-| [Module Documentation](docs/modules/README.md) | Detailed module descriptions |
-
-## Research Applications
-
-### Movement Disorder Assessment
-```python
-smoothness_analyzer = Smoothness(rate_hz=100.0)
-patient_metrics = smoothness_analyzer(patient_data)
-# Quantify movement deficits in neurological conditions
-```
+## Documentation
 
-### Biomechanical Analysis
-```python
-symmetry_analyzer = BilateralSymmetryAnalyzer()
-bilateral_coordination = symmetry_analyzer.analyze_gait_symmetry(
-    left_limb_data, right_limb_data
-)
-# Assess gait asymmetries and compensation patterns
-```
+Comprehensive documentation for `PyEyesWeb` is available online and includes tutorials, API references, and the theoretical and scientific background of the implemented metrics:
 
-### Motor Learning Studies
-```python
-coordination_tracker = SynchronizationAnalyzer()
-learning_progress = coordination_tracker.track_skill_acquisition(
-    baseline_data, training_data
-)
-# Monitor coordination improvements during skill acquisition
+- [Getting Started](https://infomuscp.github.io/PyEyesWeb/getting_started): step-by-step guide to installation and basic usage.
+- [API Reference](https://infomuscp.github.io/PyEyesWeb/api_reference): technical descriptions of modules, classes, and functions.
+- [Theoretical Foundation](https://infomuscp.github.io/PyEyesWeb/user_guide): background on the scientific principles and research behind the metrics.
+
+## Support
+
+If you encounter issues or have questions about `PyEyesWeb`, you can get help through the following channels:
+
+- **GitHub Issues:** report bugs, request features, or ask technical questions on the [PyEyesWeb GitHub Issues page](https://github.com/infomuscp/PyEyesWeb/issues).
+- **Discussions / Q&A:** participate in conversations or seek advice in [GitHub Discussions](https://github.com/infomuscp/PyEyesWeb/discussions).
+- **Email:** reach out to the maintainers at `cp.infomus@gmail.com` for direct support or collaboration inquiries.
+
+Please provide clear descriptions, minimal reproducible examples, and version information when submitting issues—it helps us respond faster.
+
+## Roadmap
+
+`PyEyesWeb` is under active development, and several features are planned for upcoming releases:
+
+- **Expanded feature extraction:** addition of more movement expressivity metrics (examples of the features to expect are given in the related [conceptual layer guide]()).
+- **Improved examples and tutorials:** more interactive Jupyter notebooks and example datasets to facilitate learning and adoption.
+- **Cross-platform compatibility:** streamlined integration with creative and interactive platforms (e.g., [TouchDesigner plugin](https://github.com/InfoMusCP/PyEyesWebTD), Unity, Max/MSP).
+
+Future development priorities may evolve based on user feedback and research needs.
+Users are encouraged to suggest features or improvements via [GitHub Issues](https://github.com/infomuscp/PyEyesWeb/issues).
+
+## Contributing
+
+Contributions to `PyEyesWeb` are welcome! Whether it's reporting bugs, adding features, improving documentation, or providing examples, your help is appreciated.
+
+### How to Contribute
+1. **Fork the repository** and create a branch for your feature or bug fix:
+   ```bash
+   git checkout -b feature/your-feature-name
+   ```
+2. Set up the development environment:
+   ```bash
+   pip install pyeyesweb[dev]
+   ```
+3. Make your changes, ensuring code quality and adherence to the project's coding standards.
+4. Submit a pull request to the `main` branch, with a clear description of your changes.
+5. Engage in code reviews and address any feedback provided by maintainers.
+
+## Citation
+
+If you use `PyEyesWeb` in your research, please cite it as follows:
+
+### BibTeX
+```bibtex
+@misc{pyeyesweb2025,
+  title = {PyEyesWeb: A Python Toolkit for Expressive Movement Analysis},
+  author = {InfoMus Lab – Casa Paganini},
+  year = {2025},
+  howpublished = {\url{https://github.com/infomuscp/PyEyesWeb}}
+}
 ```
 
-## Methodological Foundation
+## Authors & Acknowledgments
+
+`PyEyesWeb` is developed by [**InfoMus Lab – Casa Paganini**](http://www.casapaganini.org/index_eng.php){:target="_blank"}, University of Genoa, as part of the **[Resilence EU Project](https://www.resilence.eu/)**, funded by the European Union’s Horizon programme.
 
-PyEyesWeb implements peer-reviewed computational methods from:
-- Motor control and biomechanics literature
-- Signal processing, multivariate stats and time series analysis
-- Computational geometry and spatial analysis
+<div align="center">
+  <img src="docs/assets/cp-logo.png" alt="InfoMus Lab Logo" width="512" style="margin:15px"/>
+  <img src="docs/assets/resilence-logo.png" alt="Resilence Project Logo" width="200" style="margin:15px"/>
+  <img src="docs/assets/eu-logo.png" alt="EU Logo" width="100" style="margin:15px"/>
+</div>
 
-We are working on having parameter validation and numerical stability checks for all algorithms in this repository.
+### Maintainers & Contributors
+<a href="https://github.com/InfoMusCP/PyEyesWeb/graphs/contributors">
+  <img src="https://contrib.rocks/image?repo=InfoMusCP/PyEyesWeb" />
+</a>
 
 ## License
 