
Conversation

roomote bot (Contributor) commented Jan 12, 2026

Related GitHub Issue

Closes: #10601

Description

This PR attempts to address Issue #10601 by enabling prompt caching for the Cerebras zai-glm-4.7 model.

The change is a one-line configuration update: supportsPromptCache is flipped from false to true in the model configuration at packages/types/src/providers/cerebras.ts.

This aligns with the Cerebras documentation, which confirms that zai-glm-4.7 supports prompt caching.
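
For illustration, here is a minimal TypeScript sketch of the kind of model entry this flag lives in. It is not the actual contents of cerebras.ts; the interface name, the other fields, and all numeric values are assumptions made for readability, with supportsPromptCache being the only part that reflects the real change.

```ts
// Minimal sketch only; not the actual contents of
// packages/types/src/providers/cerebras.ts. The interface and all values
// except supportsPromptCache are illustrative assumptions.
interface ModelInfoSketch {
	maxTokens: number
	contextWindow: number
	supportsImages: boolean
	supportsPromptCache: boolean
}

const cerebrasModelsSketch: Record<string, ModelInfoSketch> = {
	"zai-glm-4.7": {
		maxTokens: 16_384, // illustrative value
		contextWindow: 131_072, // illustrative value
		supportsImages: false, // illustrative value
		supportsPromptCache: true, // the one-line change in this PR (previously false)
	},
}
```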

Test Procedure

  • Ran the existing Cerebras tests via cd src && npx vitest run api/providers/__tests__/cerebras.spec.ts; all 17 tests pass (a sketch of the kind of assertion that would cover this flag follows this list).
  • The change is a configuration flag update with no logic changes.
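
As referenced above, the following is a hypothetical vitest case showing the kind of assertion that would cover this flag. It is not taken from cerebras.spec.ts, and the import path and export name are assumptions.

```ts
// Hypothetical test sketch, not taken from cerebras.spec.ts.
// The "@roo-code/types" import path and the cerebrasModels export name are assumptions.
import { describe, it, expect } from "vitest"
import { cerebrasModels } from "@roo-code/types"

describe("Cerebras model configuration", () => {
	it("enables prompt caching for zai-glm-4.7", () => {
		expect(cerebrasModels["zai-glm-4.7"].supportsPromptCache).toBe(true)
	})
})
```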

Pre-Submission Checklist

  • Issue Linked: This PR is linked to an approved GitHub Issue (see "Related GitHub Issue" above).
  • Scope: My changes are focused on the linked issue (one major feature/fix per PR).
  • Self-Review: I have performed a thorough self-review of my code.
  • Testing: New and/or updated tests have been added to cover my changes (if applicable).
  • Documentation Impact: I have considered if my changes require documentation updates (see "Documentation Updates" section below).
  • Contribution Guidelines: I have read and agree to the Contributor Guidelines.

Documentation Updates

  • No documentation updates are required.

Additional Notes

Feedback and guidance are welcome.


Important

Enable prompt caching for the zai-glm-4.7 model in cerebras.ts by setting supportsPromptCache to true.

  • Behavior:
    • Enable prompt caching for the zai-glm-4.7 model by setting supportsPromptCache to true in cerebras.ts.
  • Testing:
    • All 17 existing tests in cerebras.spec.ts pass, confirming no adverse effects from the change.

This description was created by Ellipsis for 856ec7a.

roomote bot (Contributor, Author) commented Jan 12, 2026


Review complete. No issues found.

The change correctly enables prompt caching for the zai-glm-4.7 model, aligning with Cerebras documentation.


@hannesrudolph hannesrudolph marked this pull request as ready for review January 20, 2026 17:00
@dosubot dosubot bot added the size:XS (This PR changes 0-9 lines, ignoring generated files) and Enhancement (New feature or request) labels Jan 20, 2026
roomote bot (Contributor, Author) commented Jan 20, 2026


Review complete. No issues found.

The change correctly enables prompt caching for the zai-glm-4.7 model by setting supportsPromptCache: true, which aligns with Cerebras documentation.


@dosubot dosubot bot added the lgtm (This PR has been approved by a maintainer) label Jan 20, 2026
@cte cte merged commit c7ce8aa into main Jan 20, 2026
25 checks passed
@cte cte deleted the feature/enable-zai-glm-4.7-prompt-cache branch January 20, 2026 21:40
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Jan 20, 2026
@github-project-automation github-project-automation bot moved this from Triage to Done in Roo Code Roadmap Jan 20, 2026

Labels

  • Enhancement: New feature or request
  • lgtm: This PR has been approved by a maintainer
  • size:XS: This PR changes 0-9 lines, ignoring generated files

Projects

Status: Done

Development

Successfully merging this pull request may close these issues.

[ENHANCEMENT] Turn on prompt caching for supported Cerebras model zai-glm-4.7

3 participants