Token count or the Token progress bar doesn't work when using "VS Code LM API" as the API Provider #6112

@sebinseban

Description

App Version

v0.0.6302

API Provider

Anthropic

Model Used

Claude 4 Sonnet, Claude 3.7 Sonnet

Roo Code Task Links (Optional)

No response

🔁 Steps to Reproduce

Setup
I connect to my Ubuntu Linux system from my Windows machine via the SSH Remote option. I am on the Roo Code Nightly build and use Roo Code extensively, about 8 hours a day on average, with "VS Code LM API" as the API provider.
Exact Issue
With the VS Code LM API provider, the context length bar remains static and doesn't display usage metrics. When the first message is sent to Roo Code, the context bar moves to somewhere between 7k and 10.8k (Context Length: 7.6k in one example). After that, the value doesn't move at all: even after using Orchestrator mode or any other mode for an hour, the bar and the Context Length value stay frozen at the first message's token count. I believe this issue is specific to the VS Code LM API provider. It makes it very hard to gauge how much context is being used and when to switch to a new task. Notably, the token counters with the up and down arrows directly below the "Context Length:" field work fine; they show dynamic values in every task I have run with VS Code LM API as the provider.

💥 Outcome Summary

The Context Length value and the context length bar should update dynamically to reflect actual token usage.

📄 Relevant Logs or Errors (Optional)

Metadata

Assignees

No one assigned

    Labels

    Issue - Needs Scoping: Valid, but needs effort estimate or design input before work can start.
    bug: Something isn't working

    Type

    No type

    Projects

    Status

    Issue [Needs Scoping]

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests