[Feat] New LLM API Endpoint - Add List input items for Responses API #11602
Conversation
Pull Request Overview
This PR adds a new “List Input Items” endpoint to the Responses API, wiring it through the client, router, proxy, and HTTP handler layers, and includes a basic async test.
- Introduces the `litellm.alist_input_items` client methods (sync and async); a usage sketch follows this list
- Updates router, proxy routes, and HTTP handler to support the new endpoint
- Adds a smoke test for the new endpoint and necessary type/enum updates
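For context, a minimal usage sketch of the new async client method is shown below. The parameter names (`response_id`, `custom_llm_provider`) are assumptions based on common Responses API conventions and are not confirmed by this PR.

```python
import asyncio

import litellm


async def main() -> None:
    # Hypothetical usage sketch: list the input items of a previously
    # created response. Parameter names below are assumptions, not taken
    # from the merged implementation.
    items = await litellm.alist_input_items(
        response_id="resp_abc123",
        custom_llm_provider="openai",
    )
    print(items)


if __name__ == "__main__":
    asyncio.run(main())
```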
Reviewed Changes
Copilot reviewed 12 out of 12 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| tests/llm_responses_api_testing/base_responses_api.py | Adds an async test for alist_input_items |
| litellm/types/utils.py | Registers alist_input_items in CallTypes |
| litellm/router.py | Formats annotations and hooks up alist_input_items |
| litellm/responses/main.py | Implements alist_input_items client functions |
| litellm/proxy/route_llm_request.py | Adds proxy routing for alist_input_items |
| litellm/proxy/response_api_endpoints/endpoints.py | Implements FastAPI proxy endpoint logic |
| litellm/proxy/common_request_processing.py | Includes alist_input_items in common processing lists |
| litellm/llms/openai/responses/transformation.py | Adds request/response transformers for list endpoint |
| litellm/llms/base_llm/responses/transformation.py | Declares abstract methods for list transformers |
| litellm/llms/azure/responses/transformation.py | Adds Azure request transformer for list endpoint |
| litellm/llms/custom_httpx/llm_http_handler.py | Implements sync/async handlers for list input items |
| litellm/model_prices_and_context_window_backup.json | Adds two new Mistral model entries |
Comments suppressed due to low confidence (4)
litellm/proxy/response_api_endpoints/endpoints.py:242
- The handler refers to `request` but it isn't declared in the function signature. Add `request: Request` as the first parameter so `_read_request_body(request=request)` will work.

async def get_response_input_items(
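A sketch of the signature change this comment suggests; everything except the added `request: Request` parameter is an assumption about the existing handler, which is not shown in full here.

```python
from fastapi import Request


# Sketch only: the added request parameter is what the review comment asks
# for; response_id is an assumed existing path parameter.
async def get_response_input_items(
    request: Request,   # added so _read_request_body(request=request) works
    response_id: str,   # assumed existing parameter
):
    ...
```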
tests/llm_responses_api_testing/base_responses_api.py:334
- The test prints the response but does not assert anything about its shape or contents. Consider adding an assertion (e.g. that it's a dict and contains expected keys like `'data'`).

list_items_response = await litellm.alist_input_items(
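A sketch of the kind of assertion the reviewer is asking for; the surrounding test code (e.g. the `response_id` variable) is assumed from context, and the `'data'` key is the reviewer's own example rather than a verified response shape.

```python
list_items_response = await litellm.alist_input_items(
    response_id=response_id,  # assumed variable from the surrounding test
)
# Assert on the response shape instead of only printing it.
assert isinstance(list_items_response, dict)
assert "data" in list_items_response  # expected key per the reviewer's example
```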
litellm/types/utils.py:278
- The new `alist_input_items` member should also be added to the `CallTypesLiteral` union so that type hints include this call type.

CallTypesLiteral = Literal[
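A sketch of the suggested change; only the `"alist_input_items"` entry comes from the comment, and the other members shown here are illustrative placeholders (the real union in litellm/types/utils.py contains many more entries).

```python
from typing import Literal

# Sketch: extend the literal union so type hints cover the new call type.
CallTypesLiteral = Literal[
    "completion",
    "acompletion",
    "aresponses",
    "alist_input_items",  # new member suggested by the review comment
]
```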
litellm/llms/azure/responses/transformation.py:204
- The Azure provider implements the request transformer but not `transform_list_input_items_response`. You need to add a response transformer (e.g. parsing `.json()`) to satisfy the abstract base class and handle responses correctly.

return url, params
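A minimal sketch of what such a response transformer could look like, assuming the abstract method only needs to turn the raw HTTP response into a parsed payload; the exact signature is defined in litellm/llms/base_llm/responses/transformation.py and may differ.

```python
import httpx


# Sketch only: the real abstract signature may take additional arguments.
def transform_list_input_items_response(self, raw_response: httpx.Response) -> dict:
    # Parse the JSON body, as the reviewer suggests (".json()").
    return raw_response.json()
```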
…erriAI#11602)
* (feat) add list_input_items
* add alist_input_items to router
* add GET input_items for responses API
* test_basic_openai_list_input_items_endpoint
* TestTransformListInputItemsRequest
* test_ensure_initialize_azure_sdk_client_always_used
[Feat] New LLM API Endpoint - Add List input items for Responses API
This PR adds a new “List Input Items” endpoint to the Responses API, wiring it through the client, router, proxy, and HTTP handler layers, and includes a basic async test.
Relevant issues
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- I have added testing in the `tests/litellm/` directory, adding at least 1 test is a hard requirement - see details
- My PR passes all unit tests on `make test-unit`
Type
🆕 New Feature
✅ Test
Changes