Problem (one or two sentences)
Users cannot take advantage of prompt caching, which Cerebras now supports for the zai-glm-4.7 model.
The provider's documentation clearly states that prompt caching is supported: https://inference-docs.cerebras.ai/models/zai-glm-47
Context (who is affected and when)
Whenever the zai-glm-4.7 model is selected under the Cerebras provider, prompt caching is disabled.
Desired behavior (conceptual, not technical)
Enable prompt caching by default for this model.
Constraints / preferences (optional)
No response
Request checklist
- I've searched existing Issues and Discussions for duplicates
- This describes a specific problem with clear context and impact
Roo Code Task Links (optional)
No response
Acceptance criteria (optional)
No response
Proposed approach (optional)
Set the zai-glm-4.7 model's `supportsPromptCache` flag to `true`. The current model definition contains:

```
supportsPromptCache: false,
```
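A minimal sketch of what the change might look like. The `ModelInfo` shape and `cerebrasModels` map shown here are assumptions for illustration; only the `supportsPromptCache` field name comes from the quoted line, and the actual Roo Code model definitions may carry additional fields:

```typescript
// Hypothetical shape of a provider model entry; only supportsPromptCache
// is taken from the quoted source line, the rest is assumed.
interface ModelInfo {
  supportsPromptCache: boolean;
}

// Assumed model map for the Cerebras provider.
const cerebrasModels: Record<string, ModelInfo> = {
  "zai-glm-4.7": {
    supportsPromptCache: true, // was: false
  },
};
```

The one-line flip from `false` to `true` would let the provider advertise the capability the Cerebras docs describe, with no other behavioral change.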
Trade-offs / risks (optional)
No response