Fix retinanet github action #108

Merged · 4 commits · Jan 4, 2025
3 changes: 3 additions & 0 deletions .github/workflows/test-mlperf-inference-resnet50.yml
@@ -1,3 +1,4 @@
# Run MLPerf inference ResNet50
name: MLPerf inference ResNet50

on:
@@ -36,6 +37,7 @@ jobs:
if: matrix.os == 'windows-latest'
run: |
git config --system core.longpaths true

- name: Install cm4mlops on Windows
if: matrix.os == 'windows-latest'
run: |
@@ -44,6 +46,7 @@ jobs:
if: matrix.os != 'windows-latest'
run: |
CM_PULL_DEFAULT_MLOPS_REPO=no pip install cm4mlops

- name: Pull MLOps repo
cm pull repo --url=${{ github.event.pull_request.head.repo.html_url }} --checkout=${{ github.event.pull_request.head.ref }}
- name: Test MLPerf Inference ResNet50 (Windows)
11 changes: 8 additions & 3 deletions .github/workflows/test-mlperf-inference-retinanet.yml
@@ -1,5 +1,4 @@
# This workflow will install Python dependencies, run tests and lint with a variety of Python versions
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions
# Run MLPerf inference Retinanet

name: MLPerf inference retinanet

@@ -39,9 +38,15 @@ jobs:
if: matrix.os == 'windows-latest'
run: |
git config --system core.longpaths true
- name: Install dependencies
- name: Install cm4mlops on Windows
if: matrix.os == 'windows-latest'
run: |
$env:CM_PULL_DEFAULT_MLOPS_REPO = "no"; pip install cm4mlops
- name: Install dependencies on Unix Platforms
if: matrix.os != 'windows-latest'
run: |
CM_PULL_DEFAULT_MLOPS_REPO=no pip install cm4mlops
- name: Pull MLOps repo
cm pull repo --url=${{ github.event.pull_request.head.repo.html_url }} --checkout=${{ github.event.pull_request.head.ref }}
- name: Test MLPerf Inference Retinanet using ${{ matrix.backend }} on ${{ matrix.os }}
if: matrix.os == 'windows-latest'
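
Both workflow diffs above converge on the same pattern: split the `cm4mlops` install into a Windows step (PowerShell env-var syntax) and a Unix step (POSIX syntax), then pull the pull request's own MLOps branch explicitly. The sketch below is a minimal illustration of that pattern; the workflow name, trigger, job name, matrix values, and step names around the changed lines are assumptions added for readability and are not copied from the repository, only the step bodies mirror the diff.

```yaml
name: Example platform-split install (illustrative)

on: [pull_request]   # assumption: runs on pull requests

jobs:
  test:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest, windows-latest]   # illustrative matrix
    steps:
      - name: Configure git long paths (Windows)
        if: matrix.os == 'windows-latest'
        run: |
          git config --system core.longpaths true

      - name: Install cm4mlops on Windows
        if: matrix.os == 'windows-latest'
        run: |
          # PowerShell: set the env var for this invocation, then install
          $env:CM_PULL_DEFAULT_MLOPS_REPO = "no"; pip install cm4mlops

      - name: Install dependencies on Unix Platforms
        if: matrix.os != 'windows-latest'
        run: |
          # POSIX shell: same one-off env var, bash syntax
          CM_PULL_DEFAULT_MLOPS_REPO=no pip install cm4mlops

      - name: Pull MLOps repo
        run: |
          cm pull repo --url=${{ github.event.pull_request.head.repo.html_url }} --checkout=${{ github.event.pull_request.head.ref }}
```

Setting `CM_PULL_DEFAULT_MLOPS_REPO=no` appears to suppress pulling the default MLOps repository at install time, so the explicit `cm pull repo` step can check out the pull request's fork and branch instead.
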
72 changes: 72 additions & 0 deletions HISTORY.md
@@ -0,0 +1,72 @@
## Timeline of CM developments

### **🚀 2022: Foundation and Early Developments**

- **March 2022:** Grigori Fursin began developing **CM (Collective Mind)**, also referred to as **CK2**, as a successor to CK [at OctoML](https://github.com/octoml/ck/commits/master/?since=2022-03-01&until=2022-03-31).
- **April 2022:** **Arjun Suresh** joined OctoML and collaborated with Grigori on developing **CM Automation** tools.
- **May 2022:** The **CM CLI** and **Python interface** were successfully [implemented and stabilized](https://github.com/octoml/ck/commits/master/?since=2022-04-01&until=2022-05-31) by Grigori.

---

### **🛠️ July–September 2022: MLPerf Integration and First Submission**

- Arjun completed the development of the **MLPerf Inference Script** within CM.
- OctoML achieved its **first MLPerf Inference submission (v2.1)** using **CM Automation** ([progress here](https://github.com/octoml/ck/commits/master/?since=2022-06-01&until=2022-09-30)).

---

### **📊 October 2022 – March 2023: End-to-End Automation**

- End-to-end MLPerf inference automation was successfully [completed in CM](https://github.com/octoml/ck/commits/master/?since=2022-10-01&until=2023-03-31).
- **Additional benchmarks** and **Power Measurement support** were integrated into CM.
- **cTuning** achieved a successful MLPerf Inference **v3.0 submission** using CM Automation.

---

### **🔄 April 2023: Transition and New Funding**

- Arjun and Grigori departed OctoML and resumed **CM development** under funding from **cKnowledge.org** and **cTuning**.

---

### **🚀 April–October 2023: Expanded Support and Milestone Submission**

- MLPerf inference automations were [extended](https://github.com/mlcommons/ck/commits/master?since=2023-04-01&until=2023-10-31) to support **NVIDIA implementations**.
- **cTuning** achieved the **largest-ever MLPerf Inference submission (v3.1)** using CM Automation.

---

### **🤝 November 2023: MLCommons Partnership**

- **MLCommons** began funding CM development to enhance support for **NVIDIA MLPerf inference** and introduce support for **Intel** and **Qualcomm MLPerf inference** implementations.

---

### **🌐 October 2023 – March 2024: Multi-Platform Expansion**

- MLPerf inference automations were [expanded](https://github.com/mlcommons/ck/commits/master?since=2023-10-01&until=2024-03-15) to support **NVIDIA, Intel, and Qualcomm implementations**.
- **cTuning** completed the **MLPerf Inference v4.0 submission** using CM Automation.

---

### **📝 April 2024: Documentation Improvements**

- MLCommons contracted **Arjun Suresh** via **GATEOverflow** to improve **MLPerf inference documentation** and enhance CM Automation on various platforms.

---

### **👥 May 2024: Team Expansion**

- **Anandhu Sooraj** joined MLCommons to collaborate with **Arjun Suresh** on CM development.

---

### **📖 June–December 2024: Enhanced Documentation and Automation**

- A **dedicated documentation site** was launched for **MLPerf inference**.
- **CM scripts** were developed for **MLPerf Automotive**.
- **CM Docker support** was stabilized.
- **GitHub Actions workflows** were added for **MLPerf inference reference implementations** and **NVIDIA integrations** ([see updates](https://github.com/mlcommons/mlperf-automations/commits/main?since=2024-06-01&until=2024-12-31)).

---
