Support prompt caching for LiteLLM #5791

@steve-gore-snapdocs

Description

What specific problem does this solve?

When using LiteLLM as a provider with a model that supports prompt caching (e.g. Claude 3.7), Roo does not use prompt caching. This is evident both in the pricing shown in Roo and in the LiteLLM usage logs.
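For illustration, a minimal sketch of what enabling this could look like, assuming the Anthropic-style `cache_control` content blocks that LiteLLM forwards to caching-capable models (the prompt text and model name below are placeholders, not Roo's actual values):

```python
# Hypothetical request payload marking a large, stable system prompt as
# cacheable. LiteLLM passes Anthropic-style "cache_control" blocks through
# to models that support prompt caching.

LARGE_SYSTEM_PROMPT = "You are Roo, a coding assistant. " * 200  # placeholder

messages = [
    {
        "role": "system",
        "content": [
            {
                "type": "text",
                "text": LARGE_SYSTEM_PROMPT,
                # Cache breakpoint: everything up to here can be reused
                # across requests at the cached-token rate.
                "cache_control": {"type": "ephemeral"},
            }
        ],
    },
    {"role": "user", "content": "Refactor utils.py to remove dead code."},
]

# The request would then go through LiteLLM as usual, e.g.:
#   import litellm
#   response = litellm.completion(model="<litellm-model-id>", messages=messages)
# On a cache hit, the usage object reports cache read tokens instead of
# billing the full input at the standard rate.
```

If Roo set these blocks when the configured LiteLLM model advertises caching support, the discounted pricing should become visible in both Roo's cost display and LiteLLM's usage tracking.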

Additional context (optional)

Cline has already added this support in cline@801946f.

Roo Code Task Links (Optional)

No response

Request checklist

  • I've searched existing Issues and Discussions for duplicates
  • This describes a specific problem with clear impact and context

Interested in implementing this?

  • Yes, I'd like to help implement this feature

Implementation requirements

  • I understand this needs approval before implementation begins

How should this be solved? (REQUIRED if contributing, optional otherwise)

No response

How will we know it works? (Acceptance Criteria - REQUIRED if contributing, optional otherwise)

No response

Technical considerations (REQUIRED if contributing, optional otherwise)

No response

Trade-offs and risks (REQUIRED if contributing, optional otherwise)

No response
