
[Feat] New LLM API Endpoint - Add List input items for Responses API #11602


Merged
merged 6 commits into main on Jun 10, 2025

Conversation

ishaan-jaff
Contributor

@ishaan-jaff ishaan-jaff commented Jun 10, 2025

[Feat] New LLM API Endpoint - Add List input items for Responses API

This PR adds a new “List Input Items” endpoint to the Responses API, wiring it through the client, router, proxy, and HTTP handler layers, and includes a basic async test.

  • Introduces the litellm.list_input_items and litellm.alist_input_items client methods (sync and async)
  • Updates router, proxy routes, and HTTP handler to support the new endpoint
  • Adds a smoke test for the new endpoint and necessary type/enum updates
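To make the routing concrete, the sketch below builds the upstream URL this endpoint maps onto, following OpenAI's Responses API path convention (GET /v1/responses/{response_id}/input_items). The helper function is hypothetical, written for illustration; it is not litellm code.

```python
# Hypothetical helper illustrating the upstream route the new endpoint
# targets (GET /v1/responses/{response_id}/input_items); not litellm code.
def list_input_items_url(base_url: str, response_id: str, limit: int = 20) -> str:
    """Build the GET URL for listing the input items of a response."""
    return f"{base_url}/v1/responses/{response_id}/input_items?limit={limit}"

print(list_input_items_url("https://api.openai.com", "resp_abc123"))
# → https://api.openai.com/v1/responses/resp_abc123/input_items?limit=20
```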

Relevant issues

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory (adding at least one test is a hard requirement; see details)
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible; it solves one specific problem

Type

🆕 New Feature
✅ Test

Changes


vercel bot commented Jun 10, 2025

The latest updates on your projects:

litellm: ✅ Ready, updated Jun 10, 2025 10:33pm (UTC)

@ishaan-jaff ishaan-jaff requested a review from Copilot June 10, 2025 22:09
Contributor

@Copilot Copilot AI left a comment


Pull Request Overview

Reviewed Changes

Copilot reviewed 12 out of 12 changed files in this pull request and generated no comments.

Summary per file:

  • tests/llm_responses_api_testing/base_responses_api.py: Adds an async test for alist_input_items
  • litellm/types/utils.py: Registers alist_input_items in CallTypes
  • litellm/router.py: Formats annotations and hooks up alist_input_items
  • litellm/responses/main.py: Implements the alist_input_items client functions
  • litellm/proxy/route_llm_request.py: Adds proxy routing for alist_input_items
  • litellm/proxy/response_api_endpoints/endpoints.py: Implements the FastAPI proxy endpoint logic
  • litellm/proxy/common_request_processing.py: Includes alist_input_items in the common processing lists
  • litellm/llms/openai/responses/transformation.py: Adds request/response transformers for the list endpoint
  • litellm/llms/base_llm/responses/transformation.py: Declares abstract methods for the list transformers
  • litellm/llms/azure/responses/transformation.py: Adds the Azure request transformer for the list endpoint
  • litellm/llms/custom_httpx/llm_http_handler.py: Implements sync/async handlers for list input items
  • litellm/model_prices_and_context_window_backup.json: Adds two new Mistral model entries

Comments suppressed due to low confidence (4)

litellm/proxy/response_api_endpoints/endpoints.py:242

  • The handler refers to request but it isn't declared in the function signature. Add request: Request as the first parameter so _read_request_body(request=request) will work.
async def get_response_input_items(
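The shape of the suggested fix can be sketched as follows. The decorator, the fastapi.Request type annotation, and the real _read_request_body implementation are omitted so the sketch stays self-contained; the stand-in helper and fake request object are hypothetical.

```python
import asyncio

async def _read_request_body(request) -> dict:
    # Stand-in for the proxy helper named in the review comment; the
    # real helper parses the incoming HTTP request body.
    return getattr(request, "parsed_body", {})

# The fix: `request` is declared as the first parameter, so the
# `_read_request_body(request=request)` call inside the handler resolves.
async def get_response_input_items(request, response_id: str) -> dict:
    data = await _read_request_body(request=request)
    return {"response_id": response_id, "body": data}

class _FakeRequest:
    # minimal stand-in for a FastAPI Request in this sketch
    parsed_body = {"limit": 5}

result = asyncio.run(get_response_input_items(_FakeRequest(), "resp_1"))
```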

tests/llm_responses_api_testing/base_responses_api.py:334

  • The test prints the response but does not assert anything about its shape or contents. Consider adding an assertion (e.g. that it's a dict and contains expected keys like 'data').
list_items_response = await litellm.alist_input_items(
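The kind of shape assertions being suggested can be sketched against a stubbed payload. The "object" and "data" field names assume OpenAI's list-object convention and are not taken from the PR itself.

```python
# Stubbed response payload, shaped like an OpenAI list object (assumption).
stub_response = {"object": "list", "data": [{"id": "msg_1", "type": "message"}]}

def check_list_input_items(resp) -> None:
    """Assert the response has the expected list shape."""
    assert isinstance(resp, dict)
    assert resp.get("object") == "list"
    assert isinstance(resp.get("data"), list)

check_list_input_items(stub_response)  # raises AssertionError on a bad shape
```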

litellm/types/utils.py:278

  • The new alist_input_items member should also be added to the CallTypesLiteral union so that type hints include this call type.
CallTypesLiteral = Literal[
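An abridged sketch of the suggested fix: mirror the new member in the Literal union so static type hints accept it. The other member shown is illustrative, not the full litellm list.

```python
from typing import Literal, get_args

# Abridged union; only alist_input_items is taken from the PR, the
# other member is a placeholder for the existing entries.
CallTypesLiteral = Literal["aresponses", "alist_input_items"]

print("alist_input_items" in get_args(CallTypesLiteral))
# → True
```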

litellm/llms/azure/responses/transformation.py:204

  • Azure provider implements the request transformer but not transform_list_input_items_response. You need to add a response transformer (e.g. parsing .json()) to satisfy the abstract base class and handle responses correctly.
return url, params
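A hedged sketch of what the missing Azure response transformer could look like: the method name comes from the review comment, while the one-line body (parsing the raw response with .json()) and the class and stub names are assumptions for illustration.

```python
class AzureListInputItemsTransform:
    # Hypothetical transformer satisfying the abstract base class; the
    # real Azure implementation may need extra error handling.
    def transform_list_input_items_response(self, raw_response) -> dict:
        return raw_response.json()

class _FakeHTTPResponse:
    # stand-in for an httpx.Response in this sketch
    def json(self) -> dict:
        return {"object": "list", "data": []}

parsed = AzureListInputItemsTransform().transform_list_input_items_response(
    _FakeHTTPResponse()
)
```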

@ishaan-jaff ishaan-jaff merged commit 4dc9626 into main Jun 10, 2025
39 of 46 checks passed
stefan-- pushed a commit to stefan--/litellm that referenced this pull request Jun 12, 2025
…erriAI#11602)

* (feat) add list_input_items

* add alist_input_items to router

* add GET input_items for responses API

* test_basic_openai_list_input_items_endpoint

* TestTransformListInputItemsRequest

* test_ensure_initialize_azure_sdk_client_always_used
X4tar pushed a commit to X4tar/litellm that referenced this pull request Jun 17, 2025