Auto Router for LLM depending on complexity of the task. #1798
tomahawk000 started this conversation in Feature Requests
Replies: 1 comment · 4 replies
I believe OpenRouter is working on this type of feature.
An Auto Router would place a small LLM at the front of the pipeline to gauge each task's complexity and route the request to an appropriately sized model. This approach could save a significant number of tokens.
For example, telling the LLM to 'npm run dev' or run the xyz tool doesn't need a call to the most expensive API you're using, but a request concerning Firebase authentication or another complex issue would escalate to the top-tier model.
I understand this may not play well with prompt caching, since cached prefixes are tied to a specific model and switching models would invalidate them.
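A minimal sketch of what the routing decision could look like. The model names (`cheap-model`, `premium-model`) and the keyword heuristic are hypothetical placeholders; in the proposal above, the classification would be done by a small front-line LLM rather than string matching:

```python
CHEAP_MODEL = "cheap-model"      # hypothetical inexpensive tier
PREMIUM_MODEL = "premium-model"  # hypothetical top-tier model

# Hints that a request is a trivial, mechanical task (e.g. running a command).
# A real router would replace this with a call to a cheap classifier LLM.
SIMPLE_HINTS = ("npm run", "run the", "execute", "git status", "ls ")

def route(prompt: str) -> str:
    """Pick a model tier based on a rough complexity estimate of the prompt."""
    lowered = prompt.lower()
    if any(hint in lowered for hint in SIMPLE_HINTS):
        return CHEAP_MODEL
    return PREMIUM_MODEL

print(route("Please npm run dev and show me the output"))       # cheap-model
print(route("Debug my Firebase authentication token refresh"))  # premium-model
```

The escalation path would work the same way: anything the classifier can't confidently mark as simple defaults to the expensive model, so routing errors degrade toward cost rather than toward quality.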