It seems these are the right settings for the two Groq models:

```yaml
- api_name: deepseek-r1-distill-llama-70B
  name: DeepSeek R1 Distill Llama 70B
  supports_images: false
  supports_tools: false
  input_token_cost_cents: '0.000075'
  output_token_cost_cents: '0.000099'
  best: true
  supports_system_message: true
  api_service_name: Groq
- api_name: llama-3.3-70b-versatile
  name: Llama 3.3 70B Versatile 128k
  supports_images: false
  supports_tools: false
  input_token_cost_cents: '0.000059'
  output_token_cost_cents: '0.000079'
  best: true
  supports_system_message: true
  api_service_name: Groq
```
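For anyone sanity-checking those cost values, here is a minimal sketch (assuming the entries above are saved in a hypothetical `models.yml` and that PyYAML is installed) that loads the list and converts the per-token costs in cents into dollars per million tokens:

```python
# sanity_check_models.py -- rough sketch, not part of the project itself.
# Assumes the two entries above are stored in a file named models.yml.
import yaml  # pip install pyyaml

with open("models.yml") as f:
    models = yaml.safe_load(f)

for m in models:
    # Costs are stored as cents per token; convert to USD per 1M tokens.
    input_usd_per_m = float(m["input_token_cost_cents"]) * 1_000_000 / 100
    output_usd_per_m = float(m["output_token_cost_cents"]) * 1_000_000 / 100
    print(
        f"{m['name']} ({m['api_service_name']}): "
        f"${input_usd_per_m:.2f}/M input, ${output_usd_per_m:.2f}/M output"
    )
```

That works out to about $0.75/$0.99 per million tokens for the DeepSeek distill and $0.59/$0.79 for Llama 3.3 70B Versatile, which should line up with Groq's listed per-million-token pricing.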
Hi,
I just tried the Groq integration with the above model, but it seems to be deprecated. However, deepseek-r1-distill-llama-70B is now available and is extremely powerful. Would it be possible to switch over to that?
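For anyone who wants to try the new model directly before the integration is updated, here is a minimal sketch. It assumes Groq's OpenAI-compatible endpoint, the official `openai` Python package, and that your key is in the `GROQ_API_KEY` environment variable; the model id is taken from the settings above:

```python
# quick_groq_test.py -- minimal sketch for trying the model by hand.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["GROQ_API_KEY"],         # assumption: key stored in this env var
    base_url="https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible endpoint
)

resp = client.chat.completions.create(
    model="deepseek-r1-distill-llama-70B",  # api_name from the settings above
    messages=[
        # supports_system_message: true, so a system prompt is fine here
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello in one sentence."},
    ],
)
print(resp.choices[0].message.content)
```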