
Commit 2ac0b7b

Merge pull request #176 from mlcommons/dev
Sync Dev
2 parents def3863 + 6a91792 commit 2ac0b7b

298 files changed (+122, −33640 lines)


.github/workflows/test-mlperf-inference-mixtral.yml

Lines changed: 1 addition & 4 deletions

@@ -1,11 +1,8 @@
-# This workflow will install Python dependencies, run tests and lint with a variety of Python versions
-# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions
-
 name: MLPerf inference MIXTRAL-8x7B
 
 on:
   schedule:
-    - cron: "59 19 * * *" # 30th minute and 20th hour => 20:30 UTC => 2 AM IST
+    - cron: "59 23 * * */5" # 30th minute and 20th hour => 20:30 UTC => 2 AM IST
 
 jobs:
   build_reference:
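
Note that the comment on the new cron line is carried over verbatim from the old schedule, so it no longer describes what `59 23 * * */5` actually does. A field-by-field reading is sketched below, assuming standard 5-field cron semantics as used by GitHub Actions (which evaluates schedules in UTC):

```bash
# Sketch: field-by-field reading of the new schedule (standard 5-field cron, UTC)
#
#   59 23 * * */5
#   |  |  | |  └─ day-of-week: */5 over 0-6 matches 0 and 5 (Sunday and Friday)
#   |  |  | └─── month: every month
#   |  |  └───── day-of-month: every day
#   |  └──────── hour: 23
#   └─────────── minute: 59
#
# Net effect: the job fires at 23:59 UTC on Sundays and Fridays,
# instead of every day at 19:59 UTC as before.
```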

.github/workflows/test-nvidia-mlperf-inference-implementations.yml

Lines changed: 6 additions & 1 deletion

@@ -43,6 +43,11 @@ jobs:
           gpu_name=rtx_4090
           docker_string=" --docker"
         fi
+        if [ "${{ matrix.model }}" = "bert-99" ] || [ "${{ matrix.model }}" = "bert-99.9" ]; then
+          category="edge"
+        else
+          category="datacenter,edge"
+        fi
 
         if [ -f "gh_action/bin/deactivate" ]; then source gh_action/bin/deactivate; fi
         python3 -m venv gh_action
@@ -51,6 +56,6 @@ jobs:
         pip install --upgrade mlcflow
         mlc pull repo mlcommons@mlperf-automations --branch=dev
 
-        mlcr --tags=run-mlperf,inference,_all-scenarios,_submission,_full,_r5.0-dev --preprocess_submission=yes --pull_changes=yes --pull_inference_changes=yes --execution_mode=valid --gpu_name=$gpu_name --pull_changes=yes --pull_inference_changes=yes --model=${{ matrix.model }} --submitter="MLCommons" --hw_name=$hw_name --implementation=nvidia --backend=tensorrt --category=datacenter,edge --division=closed --docker_dt --docker_mlc_repo=mlcommons@mlperf-automations --docker_mlc_repo_branch=dev --adr.compiler.tags=gcc --device=cuda --use_model_from_host=yes --use_dataset_from_host=yes --results_dir=$HOME/gh_action_results --submission_dir=$HOME/gh_action_submissions --clean $docker_string --quiet
+        mlcr --tags=run-mlperf,inference,_all-scenarios,_submission,_full,_r5.0-dev --preprocess_submission=yes --pull_changes=yes --pull_inference_changes=yes --execution_mode=valid --gpu_name=$gpu_name --pull_changes=yes --pull_inference_changes=yes --model=${{ matrix.model }} --submitter="MLCommons" --hw_name=$hw_name --implementation=nvidia --backend=tensorrt --category=$category --division=closed --docker_dt --docker_mlc_repo=mlcommons@mlperf-automations --docker_mlc_repo_branch=dev --adr.compiler.tags=gcc --device=cuda --use_model_from_host=yes --use_dataset_from_host=yes --results_dir=$HOME/gh_action_results --submission_dir=$HOME/gh_action_submissions --clean $docker_string --quiet
 
         mlcr --tags=push,github,mlperf,inference,submission --repo_url=https://github.com/mlcommons/mlperf_inference_unofficial_submissions_v5.0 --repo_branch=auto-update --commit_message="Results from GH action on NVIDIA_$hw_name" --quiet --submission_dir=$HOME/gh_action_submissions --hw_name=$hw_name
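
The first hunk's logic is easier to verify in isolation. Below is a minimal sketch of the added category selection as a standalone shell function; `select_category` and the sample model names are illustrative, not part of the workflow:

```bash
#!/usr/bin/env bash
# Sketch of the category-selection logic added in the first hunk.
# `select_category` is a hypothetical helper for local testing only.
select_category() {
  local model="$1"
  if [ "$model" = "bert-99" ] || [ "$model" = "bert-99.9" ]; then
    echo "edge"              # BERT variants are submitted as edge-only
  else
    echo "datacenter,edge"   # all other models keep both categories
  fi
}

select_category "bert-99"    # prints: edge
select_category "resnet50"   # prints: datacenter,edge
```

The second hunk then consumes the variable, replacing the hard-coded `--category=datacenter,edge` in the `mlcr` invocation with `--category=$category`.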

docs/cm-yaml-guide.md

Lines changed: 0 additions & 46 deletions
This file was deleted.

docs/getting-started.md

Lines changed: 73 additions & 106 deletions

@@ -1,30 +1,71 @@
+# Getting Started with MLC Script Automation
 
-# Getting Started with CM Script Automation
+## Running MLC Scripts
 
-## Running CM Scripts
-
-To execute a simple script in CM that captures OS details, use the following command:
+To execute a simple script in MLC that captures OS details, use the following command:
 
 ```bash
-cm run script --tags=detect,os -j
+mlcr detect,os -j
 ```
+* Here, `mlcr` is a shortform for `mlc run script --tags=`
 
 This command gathers details about the system on which it's run, such as:
 
 ```json
-{
-  "CM_HOST_OS_TYPE": "linux",
-  "CM_HOST_OS_BITS": "64",
-  "CM_HOST_OS_FLAVOR": "ubuntu",
-  "CM_HOST_OS_FLAVOR_LIKE": "debian",
-  "CM_HOST_OS_VERSION": "24.04",
-  "CM_HOST_OS_KERNEL_VERSION": "6.8.0-45-generic",
-  "CM_HOST_OS_GLIBC_VERSION": "2.39",
-  "CM_HOST_OS_MACHINE": "x86_64",
-  "CM_HOST_OS_PACKAGE_MANAGER": "apt",
-  "CM_HOST_OS_PACKAGE_MANAGER_INSTALL_CMD": "DEBIAN_FRONTEND=noninteractive apt-get install -y",
-  "CM_HOST_OS_PACKAGE_MANAGER_UPDATE_CMD": "apt-get update -y",
-  "+CM_HOST_OS_DEFAULT_LIBRARY_PATH": [
+$ mlcr detect,os -j
+[2025-02-03 04:57:23,449 main.py:694 INFO] - Repos path for Index: /home/arjun/MLC/repos
+[2025-02-03 04:57:24,167 main.py:837 INFO] - Shared index for script saved to /home/arjun/MLC/repos/index_script.json.
+[2025-02-03 04:57:24,167 main.py:837 INFO] - Shared index for cache saved to /home/arjun/MLC/repos/index_cache.json.
+[2025-02-03 04:57:24,167 main.py:837 INFO] - Shared index for experiment saved to /home/arjun/MLC/repos/index_experiment.json.
+[2025-02-03 04:57:24,210 module.py:574 INFO] - * mlcr detect,os
+[2025-02-03 04:57:24,213 module.py:5354 INFO] - ! cd /mnt/arjun/MLC/repos/gateoverflow@mlperf-automations
+[2025-02-03 04:57:24,213 module.py:5355 INFO] - ! call /home/arjun/MLC/repos/gateoverflow@mlperf-automations/script/detect-os/run.sh from tmp-run.sh
+[2025-02-03 04:57:24,245 module.py:5501 INFO] - ! call "postprocess" from /home/arjun/MLC/repos/gateoverflow@mlperf-automations/script/detect-os/customize.py
+[2025-02-03 04:57:24,254 module.py:2195 INFO] - {
+  "return": 0,
+  "env": {
+    "MLC_HOST_OS_TYPE": "linux",
+    "MLC_HOST_OS_BITS": "64",
+    "MLC_HOST_OS_FLAVOR": "ubuntu",
+    "MLC_HOST_OS_FLAVOR_LIKE": "debian",
+    "MLC_HOST_OS_VERSION": "24.04",
+    "MLC_HOST_OS_KERNEL_VERSION": "6.8.0-52-generic",
+    "MLC_HOST_OS_GLIBC_VERSION": "2.39",
+    "MLC_HOST_OS_MACHINE": "x86_64",
+    "MLC_HOST_OS_PACKAGE_MANAGER": "apt",
+    "MLC_HOST_OS_PACKAGE_MANAGER_INSTALL_CMD": "DEBIAN_FRONTEND=noninteractive apt-get install -y",
+    "MLC_HOST_OS_PACKAGE_MANAGER_UPDATE_CMD": "apt-get update -y",
+    "+MLC_HOST_OS_DEFAULT_LIBRARY_PATH": [
+      "/usr/local/lib/x86_64-linux-gnu",
+      "/lib/x86_64-linux-gnu",
+      "/usr/lib/x86_64-linux-gnu",
+      "/usr/lib/x86_64-linux-gnu64",
+      "/usr/local/lib64",
+      "/lib64",
+      "/usr/lib64",
+      "/usr/local/lib",
+      "/lib",
+      "/usr/lib",
+      "/usr/x86_64-linux-gnu/lib64",
+      "/usr/x86_64-linux-gnu/lib"
+    ],
+    "MLC_HOST_PLATFORM_FLAVOR": "x86_64",
+    "MLC_HOST_PYTHON_BITS": "64",
+    "MLC_HOST_SYSTEM_NAME": "arjun-spr"
+  },
+  "new_env": {
+    "MLC_HOST_OS_TYPE": "linux",
+    "MLC_HOST_OS_BITS": "64",
+    "MLC_HOST_OS_FLAVOR": "ubuntu",
+    "MLC_HOST_OS_FLAVOR_LIKE": "debian",
+    "MLC_HOST_OS_VERSION": "24.04",
+    "MLC_HOST_OS_KERNEL_VERSION": "6.8.0-52-generic",
+    "MLC_HOST_OS_GLIBC_VERSION": "2.39",
+    "MLC_HOST_OS_MACHINE": "x86_64",
+    "MLC_HOST_OS_PACKAGE_MANAGER": "apt",
+    "MLC_HOST_OS_PACKAGE_MANAGER_INSTALL_CMD": "DEBIAN_FRONTEND=noninteractive apt-get install -y",
+    "MLC_HOST_OS_PACKAGE_MANAGER_UPDATE_CMD": "apt-get update -y",
+    "+MLC_HOST_OS_DEFAULT_LIBRARY_PATH": [
     "/usr/local/lib/x86_64-linux-gnu",
     "/lib/x86_64-linux-gnu",
     "/usr/lib/x86_64-linux-gnu",
@@ -38,98 +79,24 @@ This command gathers details about the system on which it's run, such as:
     "/usr/x86_64-linux-gnu/lib64",
     "/usr/x86_64-linux-gnu/lib"
     ],
-  "CM_HOST_PLATFORM_FLAVOR": "x86_64",
-  "CM_HOST_PYTHON_BITS": "64",
-  "CM_HOST_SYSTEM_NAME": "intel-spr-i9"
+    "MLC_HOST_PLATFORM_FLAVOR": "x86_64",
+    "MLC_HOST_PYTHON_BITS": "64",
+    "MLC_HOST_SYSTEM_NAME": "arjun-spr"
+  },
+  "state": {
+    "os_uname_machine": "x86_64",
+    "os_uname_all": "Linux arjun-spr 6.8.0-52-generic #53-Ubuntu SMP PREEMPT_DYNAMIC Sat Jan 11 00:06:25 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux"
+  },
+  "new_state": {
+    "os_uname_machine": "x86_64",
+    "os_uname_all": "Linux arjun-spr 6.8.0-52-generic #53-Ubuntu SMP PREEMPT_DYNAMIC Sat Jan 11 00:06:25 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux"
+  },
+  "deps": []
 }
 ```
 
-For more details on CM scripts, see the [CM documentation](index.md).
-
-### Adding New CM Scripts
+For more details on MLC scripts, see the [MLC documentation](index.md).
 
-CM aims to provide lightweight connectors between existing automation scripts and tools without substituting them. You can add your own scripts to CM with the following command, which creates a script named `hello-world`:
-
-```bash
-cm add script hello-world --tags=hello-world,display,test
-```
-
-This command initializes a CM script in the local repository with the following structure:
-
-```
-└── CM
-    ├── index.json
-    ├── repos
-    │   ├── local
-    │   │   ├── cfg
-    │   │   ├── cache
-    │   │   ├── cmr.yaml
-    │   │   └── script
-    │   │       └── hello-world
-    │   │           ├── _cm.yaml
-    │   │           ├── customize.py
-    │   │           ├── README-extra.md
-    │   │           ├── run.bat
-    │   │           └── run.sh
-    │   └── mlcommons@cm4mlops
-    └── repos.json
-```
 
 You can also execute the script from Python as follows:
 
-```python
-import cmind
-output = cmind.access({'action':'run', 'automation':'script', 'tags':'hello-world,display,test'})
-if output['return'] == 0:
-    print(output)
-```
-
-If you discover that your new script is similar to an existing script in any CM repository, you can clone an existing script using the following command:
-
-```bash
-cm copy script <source_script> .:<target_script>
-```
-
-Here, `<source_script>` is the name of the existing script, and `<target_script>` is the name of the new script you're creating. Existing script names in the `cm4mlops` repository can be found [here](https://github.com/mlcommons/cm4mlops/tree/mlperf-inference/script).
-
-## Caching and Reusing CM Script Outputs
-
-By default, CM scripts run in the current directory and record all new files there. For example, a universal download script might download an image to the current directory:
-
-```bash
-cm run script --tags=download,file,_wget --url=https://cKnowledge.org/ai/data/computer_mouse.jpg --verify=no --env.CM_DOWNLOAD_CHECKSUM=45ae5c940233892c2f860efdf0b66e7e
-```
-
-To cache and reuse the output of scripts, CM offers a `cache` automation feature similar to `script`. When `"cache":true` is specified in a script's metadata, CM will create a `cache` directory in `$HOME/CM/repos/local` with a unique ID and the same tags as `script`, and execute the script there.
-
-Subsequent executions of the same script will reuse files from the cache, avoiding redundancy. This is especially useful for large files or data sets.
-
-You can manage cache entries and find specific ones using commands like:
-
-```bash
-cm show cache
-cm show cache --tags=get,ml-model,resnet50,_onnx
-cm find cache --tags=download,file,ml-model,resnet50,_onnx
-cm info cache --tags=download,file,ml-model,resnet50,_onnx
-```
-
-To clean cache entries:
-
-```bash
-cm rm cache --tags=ml-model,resnet50
-cm rm cache -f # Clean all entries
-```
-
-You can completely reset the CM framework by removing the `$HOME/CM` directory, which deletes all downloaded repositories and cached entries.
-
-## Integration with Containers
-
-CM scripts are designed to run natively or inside containers with the same commands. You can substitute `cm run script` with `cm docker script` to execute a script inside an automatically-generated container:
-
-```bash
-cm docker script --tags=python,app,image-classification,onnx,_cpu
-```
-
-CM automatically handles the generation of Dockerfiles, building of containers, and execution within containers, providing a seamless experience whether running scripts natively or in containers.
-
-This approach simplifies the development process by eliminating the need for separate Dockerfile maintenance and allows for the use of native scripts and workflows directly within containers.
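
Taken together with the workflow changes, this diff moves the documented CLI surface from `cm` to `mlc`/`mlcr`. Only the `detect,os` command is shown verbatim in the changed docs; the general mapping sketched below is inferred from the stated shorthand and should be checked against the MLC documentation:

```bash
# CM -> MLC command mapping implied by this diff (inference, not exhaustive):
cm run script --tags=detect,os -j    # old CM form (removed)
mlc run script --tags=detect,os -j   # new MLC long form
mlcr detect,os -j                    # new shorthand; mlcr X == mlc run script --tags=X
```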
