# Getting Started

PyEyesWeb is an open-source Python library for analyzing expressive qualities in human movement.
It provides algorithms to extract movement features from motion data, enabling researchers, artists, and developers to quantify and analyze movement expressivity.

## Installation

You can install PyEyesWeb using pip. Open your terminal and run:

```bash
pip install pyeyesweb
```

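As a quick sanity check that the installation succeeded, you can import the package from the command line (nothing beyond the package name is assumed here):

```bash
python -c "import pyeyesweb; print('PyEyesWeb imported successfully')"
```
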
## Quick Start

Here's a simple example to get you started with PyEyesWeb. It builds a sliding window over a stream of motion samples and computes movement smoothness metrics from it.

```python
from pyeyesweb.data_models import SlidingWindow
from pyeyesweb.mid_level import Smoothness

# Movement smoothness analysis at a 50 Hz sampling rate
smoothness = Smoothness(rate_hz=50.0)
window = SlidingWindow(max_length=100, n_columns=1)
window.append([motion_data]) #(1)!

# SPARC (spectral arc length) and jerk-based smoothness metrics
sparc, jerk = smoothness(window)
```

1. `motion_data` is a placeholder for a single sample of motion data (e.g., an x coordinate at time t). In practice, you would call `window.append(...)` repeatedly inside a loop as new samples arrive; see the sketch below.

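The following is a minimal, self-contained sketch of that loop. It feeds a synthetic sine wave into the window in place of real sensor samples; everything else uses only the `SlidingWindow` and `Smoothness` calls shown above.

```python
import numpy as np

from pyeyesweb.data_models import SlidingWindow
from pyeyesweb.mid_level import Smoothness

RATE_HZ = 50.0

smoothness = Smoothness(rate_hz=RATE_HZ)
window = SlidingWindow(max_length=100, n_columns=1)

# Synthetic stand-in for a real sensor stream: 2 seconds of a 1 Hz sine
# wave sampled at 50 Hz, appended one scalar sample at a time.
t = np.arange(0.0, 2.0, 1.0 / RATE_HZ)
for sample in np.sin(2.0 * np.pi * t):
    window.append([sample])

sparc, jerk = smoothness(window)
print(f"SPARC: {sparc}, jerk: {jerk}")
```

In a live setting, the loop body would consume samples from your capture device or data file instead of the synthetic signal.
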
## Subpackages

PyEyesWeb is organized into subpackages that analyze movement features at different levels of abstraction and time scales [^1].

| <div style="min-width:150px">Subpackage</div> | Description | Implemented |
|---------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------|------------------|
| [`physical_signals`](../user_guide/physical_signals/index.md)       | Data acquisition from physical and virtual sensors (e.g., motion capture, IMU, video, physiological signals).         | :material-close: |
| [`low_level`](../user_guide/low_level/index.md)                     | Extraction of instantaneous descriptors from raw data (e.g., velocity, acceleration, kinetic energy, posture).        | :material-check: |
| [`mid_level`](../user_guide/mid_level/index.md)                     | Structural and amodal descriptors over movement units or windows (e.g., fluidity, coordination, lightness).           | :material-check: |
| [`high_level`](../user_guide/high_level/index.md)                   | Expressive and communicative qualities perceived by an observer (e.g., emotion, saliency, social signals).            | :material-close: |
| [`analysis_primitives`](../user_guide/analysis_primitives/index.md) | General-purpose operators applied at all levels (e.g., statistical moments, entropy, recurrence, predictive models).  | :material-check: |

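To check which of these subpackages your installed version actually ships, you can enumerate them with the standard library. This small sketch assumes nothing about PyEyesWeb beyond the package itself:

```python
import pkgutil

import pyeyesweb

# List the subpackages bundled with the installed PyEyesWeb version.
for module_info in pkgutil.iter_modules(pyeyesweb.__path__):
    print(module_info.name)
```
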
## References

[^1]: Camurri, A., Volpe, G., Piana, S., Mancini, M., Niewiadomski, R., Ferrari, N., & Canepa, C. (2016, July). The dancer in the eye: Towards a multi-layered computational framework of qualities in movement. In *Proceedings of the 3rd International Symposium on Movement and Computing* (pp. 1–7).