Migrate MLPerf inference unofficial results repo to MLCommons #59

Merged (56 commits, Dec 21, 2024)
Commits (56)
4897ede
[Automated Commit] Format Codebase
mlcommons-bot Dec 13, 2024
51d4fdf
Fixes for rgat submission generation
arjunsuresh Dec 13, 2024
7ecb64a
[Automated Commit] Format Codebase
mlcommons-bot Dec 13, 2024
77da71d
Fix issue with autoformat
arjunsuresh Dec 13, 2024
c6ba039
[Automated Commit] Format Codebase
mlcommons-bot Dec 13, 2024
85b975a
Fix issue with autoformat
arjunsuresh Dec 13, 2024
541b27c
Update test-mlperf-inference-rgat.yml
arjunsuresh Dec 13, 2024
deeaa9d
Update test-mlperf-inference-rgat.yml
arjunsuresh Dec 13, 2024
55d88b6
Merge branch 'mlcommons:dev' into dev
arjunsuresh Dec 13, 2024
18cacd4
Update _cm.yaml
arjunsuresh Dec 13, 2024
3c98f5a
Merge branch 'mlcommons:dev' into dev
arjunsuresh Dec 13, 2024
c1dd207
Make r5.0-dev the default version for mlperf-inference
arjunsuresh Dec 13, 2024
66b2499
Merge branch 'mlcommons:dev' into dev
arjunsuresh Dec 13, 2024
ed22615
Update _cm.yaml
arjunsuresh Dec 13, 2024
f0326b2
Update CM_MLPERF_LAST_RELEASE for get-mlperf-inference-src
arjunsuresh Dec 13, 2024
6c4d3a4
Update test-mlperf-inference-rgat.yml
arjunsuresh Dec 13, 2024
5ee6846
Pre-create results and measurements dirs for mlperf-inference submiss…
arjunsuresh Dec 13, 2024
06b31af
Pre-create results and measurements dirs for mlperf-inference submiss…
arjunsuresh Dec 13, 2024
4b9a988
Use master branch of inference-src for rgat gh action
arjunsuresh Dec 13, 2024
2861fa9
pytorch version_max change to 2.4.0 from 2.4.1
arjunsuresh Dec 14, 2024
8cf1796
Fix typo in igbh download
arjunsuresh Dec 14, 2024
5307dc4
Improvements to gh-actions-runner
arjunsuresh Dec 15, 2024
311cc3f
Update test-nvidia-mlperf-inference-implementations.yml
arjunsuresh Dec 15, 2024
a2630e8
Merge branch 'mlcommons:dev' into dev
arjunsuresh Dec 15, 2024
3f17889
Update test-mlperf-inference-sdxl.yaml
arjunsuresh Dec 15, 2024
39b29e5
Update test-mlperf-inference-gptj.yml
arjunsuresh Dec 16, 2024
f7a95ef
Update test-amd-mlperf-inference-implementations.yml
arjunsuresh Dec 16, 2024
294491c
Update test-mlperf-inference-llama2.yml
arjunsuresh Dec 16, 2024
5d0c53d
Update test-mlperf-inference-mixtral.yml
arjunsuresh Dec 16, 2024
65f86e6
Update test-mlperf-inference-dlrm.yml
arjunsuresh Dec 16, 2024
973e319
Update test-scc24-sdxl.yaml
arjunsuresh Dec 16, 2024
01b4d9c
Update test-scc24-sdxl.yaml
arjunsuresh Dec 16, 2024
0b97dc4
Use dev branch and not fork for mlperf inference test runs
arjunsuresh Dec 18, 2024
952ad6d
Merge branch 'mlcommons:dev' into dev
arjunsuresh Dec 19, 2024
aa5182e
Update setup.py | Use default branch in setup.py
arjunsuresh Dec 19, 2024
597435f
Support nvmitten for aarch64
arjunsuresh Dec 19, 2024
f07bf30
Update VERSION
arjunsuresh Dec 19, 2024
a62bf46
Merge branch 'mlcommons:dev' into dev
arjunsuresh Dec 19, 2024
2018c6f
Copy bert model for nvidia mlperf inference implementation instead of…
arjunsuresh Dec 20, 2024
85af5f8
Update VERSION
arjunsuresh Dec 20, 2024
ba00bb5
Merge branch 'mlcommons:dev' into dev
arjunsuresh Dec 20, 2024
0892ae8
Use master branch of inference repo in github action
arjunsuresh Dec 20, 2024
95b444c
Merge branch 'mlcommons:dev' into dev
arjunsuresh Dec 20, 2024
6109dab
Support relative paths for --outdirname
arjunsuresh Dec 20, 2024
9d5ff32
Support --outdirname for get-dataset-igbh
arjunsuresh Dec 20, 2024
992094e
Rename get-dataset-mlperf-inference-igbh -> get-dataset-igbh
arjunsuresh Dec 20, 2024
92b4096
Merge branch 'mlcommons:dev' into dev
arjunsuresh Dec 20, 2024
d8cd6f6
Use mlcommons repo for uploading unofficial results of nvidia/intel g…
arjunsuresh Dec 21, 2024
3a08a89
Fix format
arjunsuresh Dec 21, 2024
a379d43
Fix rgat download path, added libbz2 deps for draw-graph-from-json
arjunsuresh Dec 21, 2024
a3b2ccb
Support windows for pull-git-repo
arjunsuresh Dec 21, 2024
143c19a
Fix libbz2-dev detect
arjunsuresh Dec 21, 2024
421bdbb
Added separate installation options for libbz2-dev and bzip2
arjunsuresh Dec 21, 2024
f6afbff
Added separate installation options for libbz2-dev and bzip2
arjunsuresh Dec 21, 2024
f60afed
Restrict libbz2-dev install only for ubuntu (install-python-src)
arjunsuresh Dec 21, 2024
aac44d3
Update VERSION
arjunsuresh Dec 21, 2024
Files changed
@@ -22,5 +22,5 @@ jobs:
export CM_REPOS=$HOME/GH_CM
pip install --upgrade cm4mlops
pip install tabulate
cm run script --tags=run-mlperf,inference,_all-scenarios,_submission,_full,_r4.1-dev --preprocess_submission=yes --execution_mode=valid --pull_changes=yes --pull_inference_changes=yes --model=${{ matrix.model }} --submitter="MLCommons" --hw_name=IntelSPR.24c --implementation=intel --backend=pytorch --category=datacenter --division=open --scenario=Offline --docker_dt=yes --docker_it=no --docker_cm_repo=gateoverflow@cm4mlops --adr.compiler.tags=gcc --device=cpu --use_dataset_from_host=yes --results_dir=$HOME/gh_action_results --submission_dir=$HOME/gh_action_submissions --clean --docker --quiet
cm run script --tags=push,github,mlperf,inference,submission --repo_url=https://github.com/gateoverflow/mlperf_inference_unofficial_submissions_v5.0 --repo_branch=main --commit_message="Results from GH action on SPR.24c" --quiet --submission_dir=$HOME/gh_action_submissions --hw_name=IntelSPR.24c
cm run script --tags=run-mlperf,inference,_all-scenarios,_submission,_full,_r4.1-dev --preprocess_submission=yes --execution_mode=valid --pull_changes=yes --pull_inference_changes=yes --model=${{ matrix.model }} --submitter="MLCommons" --hw_name=IntelSPR.24c --implementation=intel --backend=pytorch --category=datacenter --division=open --scenario=Offline --docker_dt=yes --docker_it=no --docker_cm_repo=mlcommons@mlperf-automations --docker_cm_repo_branch=dev --adr.compiler.tags=gcc --device=cpu --use_dataset_from_host=yes --results_dir=$HOME/gh_action_results --submission_dir=$HOME/gh_action_submissions --clean --docker --quiet
cm run script --tags=push,github,mlperf,inference,submission --repo_url=https://github.com/mlcommons/mlperf_inference_unofficial_submissions_v5.0 --repo_branch=auto-update --commit_message="Results from GH action on SPR.24c" --quiet --submission_dir=$HOME/gh_action_submissions --hw_name=IntelSPR.24c
@@ -39,6 +39,6 @@ jobs:
pip install --upgrade cm4mlops
cm pull repo

cm run script --tags=run-mlperf,inference,_all-scenarios,_submission,_full,_r4.1-dev --preprocess_submission=yes --pull_changes=yes --pull_inference_changes=yes --execution_mode=valid --gpu_name=rtx_4090 --pull_changes=yes --pull_inference_changes=yes --model=${{ matrix.model }} --submitter="MLCommons" --hw_name=$hw_name --implementation=nvidia --backend=tensorrt --category=datacenter,edge --division=closed --docker_dt=yes --docker_it=no --docker_cm_repo=gateoverflow@cm4mlops --adr.compiler.tags=gcc --device=cuda --use_model_from_host=yes --use_dataset_from_host=yes --results_dir=$HOME/gh_action_results --submission_dir=$HOME/gh_action_submissions --clean --docker --quiet
cm run script --tags=run-mlperf,inference,_all-scenarios,_submission,_full,_r4.1-dev --preprocess_submission=yes --pull_changes=yes --pull_inference_changes=yes --execution_mode=valid --gpu_name=rtx_4090 --pull_changes=yes --pull_inference_changes=yes --model=${{ matrix.model }} --submitter="MLCommons" --hw_name=$hw_name --implementation=nvidia --backend=tensorrt --category=datacenter,edge --division=closed --docker_dt=yes --docker_it=no --docker_cm_repo=mlcommons@mlperf-automations --docker_cm_repo_branch=dev --adr.compiler.tags=gcc --device=cuda --use_model_from_host=yes --use_dataset_from_host=yes --results_dir=$HOME/gh_action_results --submission_dir=$HOME/gh_action_submissions --clean --docker --quiet

cm run script --tags=push,github,mlperf,inference,submission --repo_url=https://github.com/gateoverflow/mlperf_inference_unofficial_submissions_v5.0 --repo_branch=main --commit_message="Results from GH action on NVIDIA_$hw_name" --quiet --submission_dir=$HOME/gh_action_submissions --hw_name=$hw_name
cm run script --tags=push,github,mlperf,inference,submission --repo_url=https://github.com/mlcommons/mlperf_inference_unofficial_submissions_v5.0 --repo_branch=auto-update --commit_message="Results from GH action on NVIDIA_$hw_name" --quiet --submission_dir=$HOME/gh_action_submissions --hw_name=$hw_name
2 changes: 1 addition & 1 deletion VERSION
@@ -1 +1 @@
0.6.14
0.6.15
18 changes: 14 additions & 4 deletions automation/script/module.py
@@ -1798,9 +1798,16 @@ def _run(self, i):

tmp_curdir = os.getcwd()
if env.get('CM_OUTDIRNAME', '') != '':
if not os.path.exists(env['CM_OUTDIRNAME']):
os.makedirs(env['CM_OUTDIRNAME'])
os.chdir(env['CM_OUTDIRNAME'])
if os.path.isabs(env['CM_OUTDIRNAME']) or recursion:
c_outdirname = env['CM_OUTDIRNAME']
else:
c_outdirname = os.path.join(
env['CM_TMP_CURRENT_PATH'], env['CM_OUTDIRNAME'])
env['CM_OUTDIRNAME'] = c_outdirname

if not os.path.exists(c_outdirname):
os.makedirs(c_outdirname)
os.chdir(c_outdirname)

# Check if pre-process and detect
if 'preprocess' in dir(customize_code) and not fake_run:
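
For readers skimming the hunk above: the new branch resolves a relative CM_OUTDIRNAME (populated from --outdirname) against CM_TMP_CURRENT_PATH, i.e. the directory the user invoked CM from, instead of whatever directory the script happens to be running in. A minimal sketch of that behaviour; the helper name and example paths are illustrative, not part of the PR:

```python
import os

def resolve_outdirname(outdirname, current_path, recursion=False):
    # Absolute paths (or recursive invocations) are used as-is; relative
    # paths are anchored at the directory CM was invoked from.
    if os.path.isabs(outdirname) or recursion:
        resolved = outdirname
    else:
        resolved = os.path.join(current_path, outdirname)
    os.makedirs(resolved, exist_ok=True)  # pre-created, as in the diff
    return resolved

# A user running from /home/user/work with --outdirname=results now gets
# /home/user/work/results rather than a path inside the CM cache.
print(resolve_outdirname("results", "/home/user/work"))
```
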
@@ -5860,7 +5867,10 @@ def convert_env_to_script(env, os_info, start_script=None):
key = key[1:]

# Append the existing environment variable to the new value
env_value = f"{env_separator.join(env_value)}{env_separator}{os_info['env_var'].replace('env_var', key)}"
env_value = f"""{
env_separator.join(env_value)}{env_separator}{
os_info['env_var'].replace(
'env_var', key)}"""

# Replace placeholders in the platform-specific environment command
env_command = os_info['set_env'].replace(
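
The second module.py hunk only reflows a long f-string, but the expression it builds is easier to see with concrete values. The sketch below assumes, as on Linux, that os_info['env_var'] looks like '${env_var}' and os_info['set_env'] like 'export env_var="env_value"'; those values and the final replace chain (truncated in the diff) are assumptions, not taken from the PR:

```python
# Assumed Linux-style os_info; CM's actual values may differ.
os_info = {'env_var': '${env_var}',
           'set_env': 'export env_var="env_value"'}

key = 'PATH'                                    # original key was '+PATH'
env_value = ['/opt/tool/bin', '/opt/other/bin']
env_separator = ':'

# The reflowed f-string from the hunk, written on one line:
value = (f"{env_separator.join(env_value)}{env_separator}"
         f"{os_info['env_var'].replace('env_var', key)}")

# Assumed continuation of the truncated replace(...) chain:
command = (os_info['set_env']
           .replace('env_var', key)
           .replace('env_value', value))

print(command)  # export PATH="/opt/tool/bin:/opt/other/bin:${PATH}"
```
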
@@ -1,4 +1,4 @@
alias: get-dataset-mlperf-inference-igbh
alias: get-dataset-igbh
automation_alias: script
automation_uid: 5b4e0237da074764
cache: true
@@ -37,6 +37,8 @@ prehook_deps:
CM_DOWNLOAD_FILENAME: node_feat.npy
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/paper/
extra_cache_tags: dataset,igbh,paper,node_feat
force_env_keys:
- CM_OUTDIRNAME
force_cache: true
enable_if_env:
CM_DATASET_IGBH_TYPE:
@@ -54,6 +56,8 @@ prehook_deps:
CM_DOWNLOAD_FILENAME: node_label_19.npy
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/paper/
extra_cache_tags: dataset,igbh,paper,node_label_19
force_env_keys:
- CM_OUTDIRNAME
force_cache: true
enable_if_env:
CM_DATASET_IGBH_TYPE:
@@ -72,6 +76,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/paper/
extra_cache_tags: dataset,igbh,paper,node_label_2K
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -89,6 +95,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/paper/
extra_cache_tags: dataset,igbh,paper,paper_id_index_mapping
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -107,6 +115,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/paper__cites__paper/
extra_cache_tags: dataset,igbh,paper_cites_paper,edge_index
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -125,6 +135,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/author/
extra_cache_tags: dataset,igbh,author,author_id_index_mapping
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -142,6 +154,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/author/
extra_cache_tags: dataset,igbh,author,node_feat
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -160,6 +174,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/conference/
extra_cache_tags: dataset,igbh,conference,conference_id_index_mapping
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -177,6 +193,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/conference/
extra_cache_tags: dataset,igbh,conference,node_feat
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -195,6 +213,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/institute/
extra_cache_tags: dataset,igbh,institute,institute_id_index_mapping
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -212,6 +232,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/institute/
extra_cache_tags: dataset,igbh,institute,node_feat
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -230,6 +252,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/journal/
extra_cache_tags: dataset,igbh,journal,journal_id_index_mapping
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -247,6 +271,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/journal/
extra_cache_tags: dataset,igbh,journal,node_feat
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -265,6 +291,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/fos/
extra_cache_tags: dataset,igbh,fos,fos_id_index_mapping
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -282,6 +310,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/fos/
extra_cache_tags: dataset,igbh,fos,node_feat
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -300,6 +330,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/author__affiliated_to__institute/
extra_cache_tags: dataset,igbh,author_affiliated_to_institute,edge_index
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -318,6 +350,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/paper__published__journal/
extra_cache_tags: dataset,igbh,paper_published_journal,edge_index
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -336,6 +370,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/paper__topic__fos/
extra_cache_tags: dataset,igbh,paper_topic_fos,edge_index
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -354,6 +390,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/paper__venue__conference/
extra_cache_tags: dataset,igbh,paper_venue_conference,edge_index
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
@@ -372,6 +410,8 @@ prehook_deps:
CM_DOWNLOAD_PATH: <<<CM_DATASET_IGBH_DOWNLOAD_LOCATION>>>/full/processed/paper__written_by__author/
extra_cache_tags: dataset,igbh,paper_written_by_author,edge_index
force_cache: true
force_env_keys:
- CM_OUTDIRNAME
enable_if_env:
CM_DATASET_IGBH_TYPE:
- 'full'
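
The repeated force_env_keys additions above are what allow a user-supplied --outdirname (exported as CM_OUTDIRNAME) to reach each igbh download dependency. Roughly, the intent can be pictured as below; this is a simplified sketch of CM's env propagation, not its actual implementation:

```python
def build_dep_env(parent_env, dep_meta):
    # Start from the dependency's own env block, then force through only the
    # whitelisted keys from the parent (e.g. CM_OUTDIRNAME).
    dep_env = dict(dep_meta.get('env', {}))
    for key in dep_meta.get('force_env_keys', []):
        if key in parent_env:
            dep_env[key] = parent_env[key]
    return dep_env

dep_meta = {'env': {'CM_DOWNLOAD_FILENAME': 'node_feat.npy'},
            'force_env_keys': ['CM_OUTDIRNAME']}
parent_env = {'CM_OUTDIRNAME': '/data/igbh', 'CM_UNRELATED': 'dropped'}
print(build_dep_env(parent_env, dep_meta))
# {'CM_DOWNLOAD_FILENAME': 'node_feat.npy', 'CM_OUTDIRNAME': '/data/igbh'}
```
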
20 changes: 16 additions & 4 deletions script/get-generic-sys-util/_cm.yaml
@@ -212,21 +212,33 @@ variations:
brew: ''
dnf: boost-devel
yum: boost-devel
libbz2-dev:
bzip2:
env:
CM_SYS_UTIL_NAME: libbz2_dev
CM_SYS_UTIL_NAME: bzip2
CM_SYS_UTIL_VERSION_CMD_OVERRIDE: bzcat --version 2>&1 | grep bzip > tmp-ver.out
CM_SYS_UTIL_VERSION_RE: ([0-9]+(\.[0-9]+)+)
CM_TMP_VERSION_DETECT_GROUP_NUMBER: 1
new_env_keys:
- CM_BZIP2_VERSION
state:
bzip2:
apt: bzip2
brew: bzip2
dnf: bzip2
yum: bzip2
libbz2-dev:
env:
CM_SYS_UTIL_NAME: libbz2_dev
CM_SYS_UTIL_VERSION_CMD: dpkg -s libbz2-dev | grep 'Version'
CM_SYS_UTIL_VERSION_RE: ([0-9]+(\.[0-9]+)+)
CM_TMP_VERSION_DETECT_GROUP_NUMBER: 0
new_env_keys:
- CM_LIBBZ2_DEV_VERSION
state:
libbz2_dev:
apt: libbz2-dev
brew: bzip2
dnf: libbzip2-devel
yum: libbzip2-devel
zlib-devel: libbz2-devel
libev-dev:
env:
CM_SYS_UTIL_NAME: libev_dev
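
The new bzip2 and libbz2-dev variations above detect versions by running a command and applying CM_SYS_UTIL_VERSION_RE to its output. A quick check of that regex against plausible outputs; the sample strings are hypothetical, and CM's own group handling may differ from plain re:

```python
import re

version_re = r"([0-9]+(\.[0-9]+)+)"

# Hypothetical command outputs
bzcat_out = "bzip2, a block-sorting file compressor.  Version 1.0.8, 13-Jul-2019."
dpkg_out = "Version: 1.0.8-5build1"

print(re.search(version_re, bzcat_out).group(1))  # 1.0.8
print(re.search(version_re, dpkg_out).group(1))   # 1.0.8
```
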
13 changes: 8 additions & 5 deletions script/get-ml-model-rgat/_cm.yaml
@@ -21,12 +21,14 @@ prehook_deps:
CM_TMP_REQUIRE_DOWNLOAD:
- 'yes'
env:
CM_DOWNLOAD_FINAL_ENV_NAME: CM_ML_MODEL_PATH
extra_cache_tags: rgat,gnn,model
CM_DOWNLOAD_FINAL_ENV_NAME: RGAT_DIR_PATH
extra_cache_tags: rgat,gnn,model,ml-model
force_cache: true
names:
- dae
tags: download-and-extract
- download-file
tags: download,file
force_env_keys:
- CM_OUTDIRNAME
update_tags_from_env_with_prefix:
_url.:
- CM_DOWNLOAD_URL
@@ -55,7 +57,7 @@ variations:
group: download-source
rclone:
adr:
dae:
download-file:
tags: _rclone
env:
CM_DOWNLOAD_TOOL: rclone
@@ -65,3 +67,4 @@ variations:
env:
CM_ML_MODEL_STARTING_WEIGHTS_FILENAME: https://github.com/mlcommons/inference/tree/master/graph/R-GAT#download-model-using-rclone
CM_DOWNLOAD_URL: mlc-inference:mlcommons-inference-wg-public/R-GAT/RGAT.pt
CM_DOWNLOAD_FILENAME: RGAT
7 changes: 4 additions & 3 deletions script/get-ml-model-rgat/customize.py
@@ -20,9 +20,10 @@ def postprocess(i):
env = i['env']

if env.get('CM_ML_MODEL_RGAT_CHECKPOINT_PATH', '') == '':
env['CM_ML_MODEL_RGAT_CHECKPOINT_PATH'] = os.path.join(
env['CM_ML_MODEL_PATH'], "RGAT.pt")
elif env.get('CM_ML_MODEL_PATH', '') == '':
env['CM_ML_MODEL_RGAT_CHECKPOINT_PATH'] = env.get(
'RGAT_CHECKPOINT_PATH', os.path.join(env['RGAT_DIR_PATH'], "RGAT.pt"))

if env.get('CM_ML_MODEL_PATH', '') == '':
env['CM_ML_MODEL_PATH'] = env['CM_ML_MODEL_RGAT_CHECKPOINT_PATH']

env['RGAT_CHECKPOINT_PATH'] = env['CM_ML_MODEL_RGAT_CHECKPOINT_PATH']
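
Reading the old and new lines together, the post-processing now falls back from an explicit CM_ML_MODEL_RGAT_CHECKPOINT_PATH to RGAT_CHECKPOINT_PATH and finally to RGAT.pt inside the newly introduced RGAT_DIR_PATH. A restating sketch of the new logic (the /models/rgat path is only an example):

```python
import os

def resolve_rgat_checkpoint(env):
    if env.get('CM_ML_MODEL_RGAT_CHECKPOINT_PATH', '') == '':
        env['CM_ML_MODEL_RGAT_CHECKPOINT_PATH'] = env.get(
            'RGAT_CHECKPOINT_PATH',
            os.path.join(env['RGAT_DIR_PATH'], 'RGAT.pt'))
    if env.get('CM_ML_MODEL_PATH', '') == '':
        env['CM_ML_MODEL_PATH'] = env['CM_ML_MODEL_RGAT_CHECKPOINT_PATH']
    env['RGAT_CHECKPOINT_PATH'] = env['CM_ML_MODEL_RGAT_CHECKPOINT_PATH']
    return env

print(resolve_rgat_checkpoint({'RGAT_DIR_PATH': '/models/rgat'}))
```
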
3 changes: 3 additions & 0 deletions script/install-python-src/_cm.yaml
@@ -16,6 +16,9 @@ deps:
- tags: detect,cpu
- tags: get,generic-sys-util,_libffi-dev
- tags: get,generic-sys-util,_libbz2-dev
enable_if_env:
CM_HOST_OS_FLAVOR:
- ubuntu
- tags: get,generic-sys-util,_libssl-dev
- enable_if_env:
CM_HOST_OS_FLAVOR:
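
The enable_if_env block added above restricts the _libbz2-dev dependency to Ubuntu hosts (matching the commit "Restrict libbz2-dev install only for ubuntu"). Such a condition is understood to gate a dependency roughly as sketched here; this is a simplification, not CM's actual evaluator:

```python
def dep_enabled(enable_if_env, env):
    # Every listed key must be present and take one of the allowed values.
    return all(env.get(key) in allowed
               for key, allowed in enable_if_env.items())

cond = {'CM_HOST_OS_FLAVOR': ['ubuntu']}
print(dep_enabled(cond, {'CM_HOST_OS_FLAVOR': 'ubuntu'}))  # True
print(dep_enabled(cond, {'CM_HOST_OS_FLAVOR': 'rhel'}))    # False
```
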
3 changes: 0 additions & 3 deletions script/pull-git-repo/customize.py
@@ -7,9 +7,6 @@ def preprocess(i):

os_info = i['os_info']

if os_info['platform'] == 'windows':
return {'return': 1, 'error': 'Windows is not supported in this script yet'}

env = i['env']
meta = i['meta']

26 changes: 26 additions & 0 deletions script/pull-git-repo/run.bat
@@ -0,0 +1,26 @@
@echo off
setlocal enabledelayedexpansion

REM Save the current directory
set "CUR_DIR=%CD%"
set "SCRIPT_DIR=%CM_TMP_CURRENT_SCRIPT_PATH%"

REM Change to the specified path
set "path=%CM_GIT_CHECKOUT_PATH%"
echo cd %path%

cd /d "%path%"
if errorlevel 1 (
echo Failed to change directory to %path%
exit /b %errorlevel%
)

REM Execute the Git pull command
echo %CM_GIT_PULL_CMD%
call %CM_GIT_PULL_CMD%
REM Don't fail if there are local changes
REM if errorlevel 1 exit /b %errorlevel%

REM Return to the original directory
cd /d "%CUR_DIR%"
endlocal