This repository stores the ATLAS model alongside training, inference, and deployment libraries for real-time, global-scale GPS trajectory modeling. The models were developed at Ai2 and are currently deployed in the Skylight maritime intelligence platform. Read more about the model on arXiv.
The repository is organized into the following top-level directories and subdirectories:
- `ais`: Contains scripts and modules for loading, cleaning, and preprocessing AIS data.
- `src/atlantes`: Holds the core logic for the Atlantes system, including data processing, model training, and deployment.
- `requirements.txt`: Specifies the required Python packages for running the Atlantes system, as well as the dev requirements for development.
- `data`: Stores various AIS-related data files used throughout the project in a non-production environment.
- `tests`: Contains unit tests and integration tests for the Atlantes system.
- `test-data`: Contains test data for the Atlantes system.
For all projects, please follow the initial steps to set up the environment, then refer to the README file in the specific project's directory.
Run:
$ pip install pre-commit
$ pre-commit install
Note that code in the ais repo is required to pass these pre-commit hooks in order to be merged. In particular, beyond the typical linting requirements, there are additional requirements for 100% static type annotation coverage (enforced via mypy) and at least 90% documentation coverage across both modules and functions (enforced via interrogate). Either check can be bypassed for a particular file or directory by adding it to the exclude list of the corresponding hook in the .pre-commit-config.yaml file. In other words, these checks are opt-out rather than opt-in.
After following the two steps above, the pre-commit hooks run automatically on `git commit`.
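As an illustration of the opt-out mechanism, an exclusion for a hook can be added to `.pre-commit-config.yaml` along these lines (the repo URL, pinned revision, and excluded path below are illustrative assumptions, not the repository's actual configuration):

```yaml
repos:
  - repo: https://github.com/econchick/interrogate
    rev: 1.7.0  # illustrative pin, not the repo's actual version
    hooks:
      - id: interrogate
        # Hypothetical opt-out: skip documentation-coverage checks for this path
        exclude: ^src/atlantes/experimental/
```

The hooks can also be run manually across the whole tree with `pre-commit run --all-files`, which is useful before opening a pull request.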
These steps describe how to stand up the Atlas Activity and Entity inference services locally, make requests to them, and view the responses.
- Export `GOOGLE_APPLICATION_CREDENTIALS` to a path containing your GCP credentials.
- Run `docker-compose up -d --wait --build` in the `ais` directory.
- `cd` to `atlantes/ais/src/examples`.
- Run `python atlas_activity_request.py` to do activity classification.
- Run `python atlas_entity_request.py` to do entity classification.

These will save the responses to `sample_response_activity.json` and `sample_response_entity.json`, respectively, which show the final classification and details for the inference run.
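The example scripts build a JSON request body from an AIS track and POST it to the corresponding service. The pattern can be sketched as follows, assuming a hypothetical payload schema (the field names and helper function here are illustrative, not the service's actual contract; consult `atlas_activity_request.py` for the real one):

```python
import json

def build_track_payload(points):
    """Assemble a JSON-serializable request body from (lat, lon, iso_timestamp) tuples.

    The "trajectory"/"lat"/"lon"/"timestamp" keys are assumptions for
    illustration only -- see atlas_activity_request.py for the real schema.
    """
    return {
        "trajectory": [
            {"lat": lat, "lon": lon, "timestamp": ts}
            for lat, lon, ts in points
        ]
    }

payload = build_track_payload([
    (47.61, -122.33, "2024-01-01T00:00:00Z"),
    (47.62, -122.34, "2024-01-01T00:10:00Z"),
])
body = json.dumps(payload)  # this string would be POSTed to the inference service
```

The service's response is what the scripts write to the `sample_response_*.json` files.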
In (add VSCode Settings), we provide recommended settings for VSCode, complete with the required formatter configuration, as well as recommended extensions.
Required Extensions:
- black
Recommended:
- Copilot
- Augment
- Gitless
- Parquet Viewer
- GitHub Actions
Contributions to this repository are welcome! If you'd like to contribute, please follow these steps:
- Clone the repository.
- Create a new branch for your feature or bug fix.
- Make your changes and commit them.
- Push your branch to the remote repository.
- Open a pull request against the `main` branch of this repository.
Please ensure that your code follows the project's coding standards and includes appropriate tests.
Apache 2.0
This project was developed by the Allen Institute for Artificial Intelligence (Ai2).
We thank the following organizations for their data contributions:
- Spire for providing AIS data.
- NOAA for their AIS dataset, available at NOAA Digital Coast.