Replies: 1 comment
It seems these are the right settings for the two Groq models (a quick sanity-check sketch follows the listing):

- api_name: deepseek-r1-distill-llama-70B
  name: DeepSeek R1 Distill Llama 70B
  supports_images: false
  supports_tools: false
  input_token_cost_cents: '0.000075'
  output_token_cost_cents: '0.000099'
  best: true
  supports_system_message: true
  api_service_name: Groq
- api_name: llama-3.3-70b-versatile
  name: Llama 3.3 70B Versatile 128k
  supports_images: false
  supports_tools: false
  input_token_cost_cents: '0.000059'
  output_token_cost_cents: '0.000079'
  best: true
  supports_system_message: true
  api_service_name: Groq
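
As a rough sanity check of the settings above, something like the sketch below can confirm that both model ids respond and accept a system message. It assumes the integration talks to Groq's OpenAI-compatible endpoint, that the `openai` Python package is installed, and that a GROQ_API_KEY environment variable is set; note that Groq lists the DeepSeek model id in lowercase, so the casing in the config may need to match.

```python
# Minimal probe of Groq's OpenAI-compatible endpoint (assumption: the
# integration uses this endpoint). Requires the `openai` package and a
# GROQ_API_KEY environment variable.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key=os.environ["GROQ_API_KEY"],
)

# Model ids taken from the config above; Groq's canonical id for the
# DeepSeek distill is lowercase.
for model_id in ("deepseek-r1-distill-llama-70b", "llama-3.3-70b-versatile"):
    # supports_system_message: true -> include a system message in the probe.
    resp = client.chat.completions.create(
        model=model_id,
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Reply with the single word: ok"},
        ],
        max_tokens=128,
    )
    print(model_id, "->", resp.choices[0].message.content)
```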
-
Hi,
I just tried the Groq integration with the above model, but it seems to be deprecated. However, deepseek-r1-distill-llama-70B is now available and is extremely powerful. Would it be possible to switch over to that? (A quick way to check which model ids Groq currently serves is sketched below.)
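
A minimal sketch, assuming the `openai` Python package and a GROQ_API_KEY environment variable, that lists the model ids Groq's OpenAI-compatible API currently serves; a deprecated id will simply be absent from the output.

```python
# List the model ids currently served by Groq (OpenAI-compatible /models
# endpoint). Assumes the `openai` package and a GROQ_API_KEY env variable.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key=os.environ["GROQ_API_KEY"],
)

for model_id in sorted(m.id for m in client.models.list().data):
    print(model_id)

# "deepseek-r1-distill-llama-70b" should appear here if Groq still serves it;
# a deprecated model will be missing from the list.
```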