Code changes for supporting llama3_1-405b reference implementation #111

Conversation
MLCommons CLA bot: All contributors have signed the MLCommons CLA ✍️ ✅
```yaml
- llama3-402b
skip_if_env:
  CM_ML_MODEL_LLAMA3_CHECKPOINT_PATH:
    - 'on'
```
Also only if we are in the docker build stage. Otherwise when the path is given we should register it in CM cache
Updated in commit bfb7cb6
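For context, a minimal sketch of the condition the reviewer is asking for (illustrative only; the actual fix is whatever landed in commit bfb7cb6). It assumes the `CM_RUN_STATE_DOCKER` variable that CM sets when a script runs inside a container, and relies on `skip_if_env` requiring all listed variables to match before the dependency is skipped:

```yaml
deps:
  - tags: get,ml-model,llama3
    names:
      - llama3-402b
    # Skip the model download only when a checkpoint path is
    # already provided AND we are inside the docker build;
    # otherwise a user-given path should be registered in the
    # CM cache rather than bypassing the dependency entirely.
    skip_if_env:
      CM_ML_MODEL_LLAMA3_CHECKPOINT_PATH:
        - 'on'
      CM_RUN_STATE_DOCKER:  # assumption: set to 'yes' inside containers
        - 'yes'
```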
```yaml
- llama3-402b
skip_if_env:
  CM_DATASET_LLAMA3_PATH:
    - "on"
```
Same here as for the model.
Updated in commit bfb7cb6
script/get-ml-model-llama3/_cm.yaml (outdated)
```yaml
env:
  CM_MODEL_ZOO_ENV_KEY: LLAMA3
group: huggingface-stub
docker:
```
Is this docker section needed?
Updated in commit a21f2b0
```yaml
skip_if_env:
  CM_DATASET_LLAMA3_PATH:
    - "on"
  CM_USE_DATASET_FROM_HOST:
```
This variable is not needed, right? If the path is passed directly from the host to the container, this won't work. The same applies to the model.
That's right, thanks. I also formatted the app-mlperf-inference-mlcommons-python `_cm.yaml` file with the help of the Prettier extension in VS Code.

TODO: `rclone` directory path to `get-dataset-mlperf-inference-llama3`.
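A rough sketch of the alternative being agreed on here, assuming the docker `mounts` mechanism in CM script metadata (the exact mount syntax and the reuse of identical paths inside the container are assumptions for illustration, not the actual commit):

```yaml
docker:
  # Mount the host-side model and dataset directories into the
  # container at the same paths, so the env variables pointing
  # at them remain valid inside the container and no separate
  # CM_USE_DATASET_FROM_HOST flag or re-download is needed.
  mounts:
    - "${{ CM_ML_MODEL_LLAMA3_CHECKPOINT_PATH }}:${{ CM_ML_MODEL_LLAMA3_CHECKPOINT_PATH }}"
    - "${{ CM_DATASET_LLAMA3_PATH }}:${{ CM_DATASET_LLAMA3_PATH }}"
```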