KenyaEMR_IIT (palladiumkenya/KenyaEMR_IIT_Docker)

Build the Docker container

  1. docker build -t kenyaemr-inference .

Data and settings

Check that these files exist:

  1. /opt/ml/iit/settings.json -- facility-specific settings
  2. /opt/ml/iit/locational_variables_latest.csv -- facility location variables
  3. /opt/ml/iit/models/thresholds_latest.pkl -- prediction thresholds
  4. /opt/ml/iit/models/site_thresholds_latest.pkl -- site-specific thresholds
  5. /opt/ml/iit/models/ohe_latest.pkl -- one-hot encoder
  6. /opt/ml/iit/models/mod_latest.json -- model
  7. /opt/ml/iit/models/feature_order.pkl -- feature order
  8. /opt/ml/iit/models/mod_latest.so -- compiled model
  9. /opt/ml/iit/models/iit_test.sqlite -- demo database
  10. /opt/ml/iit/models/locational_variables_latest.csv -- facility location variables
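The checklist above can be verified with a short script before building or running the container; a minimal sketch (the paths are taken verbatim from the list, nothing else is assumed):

```python
from pathlib import Path

# Required host files, copied from the checklist above.
REQUIRED = [
    "/opt/ml/iit/settings.json",
    "/opt/ml/iit/locational_variables_latest.csv",
    "/opt/ml/iit/models/thresholds_latest.pkl",
    "/opt/ml/iit/models/site_thresholds_latest.pkl",
    "/opt/ml/iit/models/ohe_latest.pkl",
    "/opt/ml/iit/models/mod_latest.json",
    "/opt/ml/iit/models/feature_order.pkl",
    "/opt/ml/iit/models/mod_latest.so",
    "/opt/ml/iit/models/iit_test.sqlite",
    "/opt/ml/iit/models/locational_variables_latest.csv",
]

def missing_files(paths=REQUIRED):
    """Return the subset of required paths that do not exist as files."""
    return [p for p in paths if not Path(p).is_file()]

if __name__ == "__main__":
    missing = missing_files()
    if missing:
        print("Missing files:")
        for p in missing:
            print("  " + p)
    else:
        print("All required files present.")
```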

Docker run

  1. docker run \
     -v /opt/ml/iit/settings.json:/app/data/settings.json \
     -v /opt/ml/iit/locational_variables_latest.csv:/app/data/locational_variables_latest.csv \
     -v /opt/ml/iit/models/thresholds_latest.pkl:/app/data/models/thresholds_latest.pkl \
     -v /opt/ml/iit/models/site_thresholds_latest.pkl:/app/data/models/site_thresholds_latest.pkl \
     -v /opt/ml/iit/models/ohe_latest.pkl:/app/data/models/ohe_latest.pkl \
     -v /opt/ml/iit/models/mod_latest.json:/app/data/models/mod_latest.json \
     -v /opt/ml/iit/models/feature_order.pkl:/app/data/models/feature_order.pkl \
     -v /opt/ml/iit/models/mod_latest.so:/app/data/models/mod_latest.so \
     -v /opt/ml/iit/models/iit_test.sqlite:/app/data/models/iit_test.sqlite \
     -v /opt/ml/iit/models/locational_variables_latest.csv:/app/data/models/locational_variables_latest.csv \
     --add-host=host.docker.internal:host-gateway \
     --network host \
     -p 8000:8000 \
     kenyaemr-inference

Note: when --network host is used, the container shares the host's network stack, so the -p 8000:8000 mapping is ignored; keep -p only if you run without --network host.

Or use Docker Compose

With local rebuild

docker compose up --build

By pulling from Docker Hub

docker compose up
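For reference, a docker-compose.yml consistent with the docker run command above might look like the sketch below; the service name is an assumption, and the compose file shipped in the repo takes precedence:

```yaml
services:
  inference:
    image: kenyaemr-inference
    build: .
    network_mode: host          # mirrors --network host above; no ports mapping needed
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - /opt/ml/iit/settings.json:/app/data/settings.json
      - /opt/ml/iit/locational_variables_latest.csv:/app/data/locational_variables_latest.csv
      - /opt/ml/iit/models/thresholds_latest.pkl:/app/data/models/thresholds_latest.pkl
      - /opt/ml/iit/models/site_thresholds_latest.pkl:/app/data/models/site_thresholds_latest.pkl
      - /opt/ml/iit/models/ohe_latest.pkl:/app/data/models/ohe_latest.pkl
      - /opt/ml/iit/models/mod_latest.json:/app/data/models/mod_latest.json
      - /opt/ml/iit/models/feature_order.pkl:/app/data/models/feature_order.pkl
      - /opt/ml/iit/models/mod_latest.so:/app/data/models/mod_latest.so
      - /opt/ml/iit/models/iit_test.sqlite:/app/data/models/iit_test.sqlite
      - /opt/ml/iit/models/locational_variables_latest.csv:/app/data/models/locational_variables_latest.csv
```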

Clean up Docker images to save space

Safe cleanup

  1. docker container prune
  2. docker image ls
  3. docker rmi <IMAGE ID>  (substitute an ID from the previous step)
  4. docker builder prune

Total cleanup -- Use with caution

  1. docker rmi -f $(docker images -aq)
  2. docker system prune -a --volumes -f

Example Payload

curl -X POST "http://localhost:8000/inference" -H "Content-Type: application/json" -d '{"ppk": "7E14A8034F39478149EE6A4CA37A247C631D17907C746BE0336D3D7CEC68F66F", "sc": "13074", "start_date": "2021-01-01", "end_date": "2025-01-01"}'
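The same request can be issued from Python; a minimal sketch using only the standard library (the endpoint and payload fields are taken from the curl example above; the response format is not specified here):

```python
import json
from urllib import request

# Payload fields copied from the curl example above.
payload = {
    "ppk": "7E14A8034F39478149EE6A4CA37A247C631D17907C746BE0336D3D7CEC68F66F",
    "sc": "13074",
    "start_date": "2021-01-01",
    "end_date": "2025-01-01",
}

def post_inference(data, url="http://localhost:8000/inference"):
    """POST the JSON payload to the inference endpoint and return the raw response body."""
    req = request.Request(
        url,
        data=json.dumps(data).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

if __name__ == "__main__":
    print(post_inference(payload))
```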

Development

Set up a local virtual environment and run the API without Docker:
  1. python3.12 -m venv myenv
  2. source myenv/bin/activate
  3. pip --version
  4. python --version
  5. pip install --no-cache-dir -r requirements-inference.txt
  6. uvicorn src.inference.api:app --host 0.0.0.0 --port 8000

About

Dockerized IIT ML API Implementation
