fix(core): Intercept .withResponse() to preserve OpenAI stream instru… #19122
base: develop
Conversation
Codecov Results 📊 (generated by Codecov Action)

size-limit report 📦

node-overhead report 🧳 — Note: This is a synthetic benchmark with a minimal express app and does not necessarily reflect the real-world performance impact in an application.
Cursor Bugbot has reviewed your changes and found 1 potential issue.
Bugbot Autofix is OFF. To automatically fix reported issues with Cloud Agents, enable Autofix in the Cursor dashboard.
```ts
const result = client.chat.completions.create({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Test withResponse' }],
});
```
Test doesn't cover streaming .withResponse() regression
Low Severity
The PR description states that the fix specifically targets streaming calls where .withResponse() returned the original uninstrumented stream. However, the test scenario only covers non-streaming calls (there is no stream: true parameter). Per the review rules for fix PRs, tests should verify the specific regression being fixed; a streaming test case that calls .withResponse() would provide confidence that the core bug is actually fixed.


When Sentry instruments the OpenAI client, we return an instrumented promise. The OpenAI SDK, however, returns a custom promise-like object with extra methods such as .withResponse(), so returning a plain Promise breaks those methods.
We previously added wrapPromiseWithMethods() to proxy those SDK methods back onto the instrumented promise. For streaming calls, though, .withResponse() still returned the original uninstrumented stream. This PR fixes that by intercepting .withResponse() inside the proxy so that it hands back the instrumented stream.
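The interception described above can be sketched roughly as follows. This is a minimal illustration of the proxy technique, not Sentry's actual implementation: the type `APIPromiseLike`, the helper `instrumentStream`, and the binding details are assumptions made for the example.

```typescript
// A stand-in for the SDK's promise-like return type (illustrative only).
type APIPromiseLike<T> = Promise<T> & {
  withResponse(): Promise<{ data: T; response: unknown }>;
};

// Hypothetical helper that would attach span instrumentation to a stream;
// a pass-through here for brevity.
function instrumentStream<T>(stream: T): T {
  return stream;
}

function wrapPromiseWithMethods<T>(
  original: APIPromiseLike<T>,
  instrumented: Promise<T>,
): APIPromiseLike<T> {
  return new Proxy(instrumented, {
    get(target, prop) {
      // Intercept .withResponse() so the caller receives the instrumented
      // stream rather than the SDK's original, uninstrumented one.
      if (prop === 'withResponse') {
        return () =>
          original.withResponse().then(({ data, response }) => ({
            data: instrumentStream(data),
            response,
          }));
      }
      if (prop in target) {
        const value = Reflect.get(target, prop);
        // Bind promise methods (then/catch/finally) to the real promise;
        // calling them with the proxy as `this` would throw.
        return typeof value === 'function' ? value.bind(target) : value;
      }
      // Forward any other SDK-specific method back to the original object.
      const value = Reflect.get(original, prop);
      return typeof value === 'function' ? value.bind(original) : value;
    },
  }) as APIPromiseLike<T>;
}
```

Awaiting the wrapped object resolves the instrumented promise, while `.withResponse()` is routed through the interception branch, which is exactly the behavior the fix needs to preserve.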
Closes #19073