[Feat] Make batch size for maximum retention in spend logs a controllable parameter #11459
Conversation
Pull Request Overview
This PR introduces configurability for the spend log cleanup batch size, replacing the hard-coded value with an environment variable and updating the documentation accordingly.
- Add a new `SPEND_LOG_CLEANUP_BATCH_SIZE` constant and use it in `SpendLogCleanup`
- Refactor imports and logging formatting in `spend_log_cleanup.py`
- Update docs (`ui_logs.md`, `spend_logs_deletion.md`, `config_settings.md`) to document the new batch-size parameter
Reviewed Changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated no comments.
File | Description
---|---
`litellm/proxy/db/db_transaction_queue/spend_log_cleanup.py` | Use `SPEND_LOG_CLEANUP_BATCH_SIZE` instead of the literal `1000`
`litellm/constants.py` | Add the `SPEND_LOG_CLEANUP_BATCH_SIZE` env var constant
`docs/my-website/docs/proxy/ui_logs.md` | Document the new batch-size env var
`docs/my-website/docs/proxy/spend_logs_deletion.md` | Clarify that the batch size is configurable
`docs/my-website/docs/proxy/config_settings.md` | Add `SPEND_LOG_CLEANUP_BATCH_SIZE` to the configuration table
Comments suppressed due to low confidence (3)

`litellm/proxy/db/db_transaction_queue/spend_log_cleanup.py:76`
- The log message hardcodes "1,00,000" based on the default batch size and run loops. Consider calculating the maximum deleted logs dynamically (e.g. `SPEND_LOG_RUN_LOOPS * SPEND_LOG_CLEANUP_BATCH_SIZE`) or updating the message to reference the actual variable values for accuracy.

`verbose_proxy_logger.info("Max logs deleted - 1,00,000, rest of the logs will be deleted in next run")`
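Copilot's suggestion above can be sketched as follows. The constant values are assumptions inferred from elsewhere in this PR (a 500,000-log cap with a default batch size of 1000 implies 500 run loops), and a plain `logging` logger stands in for `verbose_proxy_logger`:

```python
import logging

# Assumed defaults: the docs cite 500,000 max deletions per run with a
# batch size of 1000, implying 500 run loops. Actual values live in
# litellm/constants.py and may differ.
SPEND_LOG_RUN_LOOPS = 500
SPEND_LOG_CLEANUP_BATCH_SIZE = 1000

logger = logging.getLogger("proxy")  # stand-in for verbose_proxy_logger

# Compute the cap dynamically instead of hardcoding "1,00,000"
max_deleted = SPEND_LOG_RUN_LOOPS * SPEND_LOG_CLEANUP_BATCH_SIZE
message = (
    f"Max logs deleted - {max_deleted:,}, "
    "rest of the logs will be deleted in next run"
)
logger.info(message)
```

With this shape, changing either constant keeps the log message accurate without further edits.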
`docs/my-website/docs/proxy/config_settings.md:653`
- This description still references a hardcoded batch size of 1000. Update it to reflect that the batch size is now configurable via `SPEND_LOG_CLEANUP_BATCH_SIZE`, and clarify that it controls the number of batches per cleanup run.

| SPEND_LOG_RUN_LOOPS | Constant for setting how many runs of 1000 batch deletes should spend_log_cleanup task run |
`docs/my-website/docs/proxy/spend_logs_deletion.md:80`
- The "Max deletions per run: 500,000 logs" value is now dynamic, based on `SPEND_LOG_RUN_LOOPS` and `SPEND_LOG_CLEANUP_BATCH_SIZE`. Consider updating this to explain how the total is calculated and that it will vary if the batch size is changed.

- **Max deletions per run**: 500,000 logs
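The relationship behind the documented 500,000 figure is a simple product; the helper below is illustrative, not code from the PR:

```python
def max_deletions_per_run(run_loops: int, batch_size: int) -> int:
    """Upper bound on spend logs deleted in one cleanup run."""
    return run_loops * batch_size

# The documented 500,000 corresponds to 500 loops of 1000-row batches;
# raising SPEND_LOG_CLEANUP_BATCH_SIZE to 2000 would double the cap.
print(max_deletions_per_run(500, 1000))
print(max_deletions_per_run(500, 2000))
```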
```diff
@@ -16,21 +21,26 @@ class SpendLogCleanup:
     """

     def __init__(self, general_settings=None, redis_cache: Optional[RedisCache] = None):
-        self.batch_size = 1000
+        self.batch_size = SPEND_LOG_CLEANUP_BATCH_SIZE
```
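A minimal, self-contained sketch of the pattern this diff relies on: an env-var-backed constant with a fallback to the old hardcoded default. The exact definition in `litellm/constants.py` may differ, and `RedisCache` and the other proxy types are omitted here:

```python
import os

# Assumed shape of the constant added in litellm/constants.py: read the
# env var if set, otherwise fall back to the previously hardcoded 1000.
SPEND_LOG_CLEANUP_BATCH_SIZE = int(os.getenv("SPEND_LOG_CLEANUP_BATCH_SIZE", "1000"))


class SpendLogCleanup:
    """Trimmed stand-in for the real class; only the changed line is shown."""

    def __init__(self, general_settings=None, redis_cache=None):
        self.general_settings = general_settings or {}
        self.redis_cache = redis_cache
        # The PR's one functional change: no more literal 1000 here.
        self.batch_size = SPEND_LOG_CLEANUP_BATCH_SIZE


cleaner = SpendLogCleanup()
print(cleaner.batch_size)
```

Because the env var is read at import time, it must be set before the module is first imported for the override to take effect.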
This is really the only change; the other deltas are from my formatter.
Merge commit: …able parameter (BerriAI#11459)
* feat: add SPEND_LOG_CLEANUP_BATCH_SIZE
* docs update
* test: test_cleanup_batch_size_env_var
Make batch size for maximum retention in spend logs a controllable parameter. Allows the user to control the batch size used for spend log deletion.
Closes LIT-190
Relevant issues
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
tests/litellm/
directory, Adding at least 1 test is a hard requirement - see detailsmake test-unit
Type
🆕 New Feature
📖 Documentation
✅ Test
Changes