213 changes: 135 additions & 78 deletions packages/core/src/tracing/openai/index.ts
@@ -162,95 +162,152 @@ function addRequestAttributes(span: Span, params: Record<string, unknown>, opera
}

/**
* Instrument a method with Sentry spans
* Following Sentry AI Agents Manual Instrumentation conventions
* Wrap the original return value so span logic runs on settle while preserving
* API surface (e.g. .withResponse()). Callers can await the wrapper or call .withResponse().
*/
function wrapReturnValue<R>(
result: R & { then?: (onFulfilled?: (value: unknown) => unknown, onRejected?: (reason?: unknown) => unknown) => unknown; withResponse?: () => unknown },
span: Span,
options: { recordOutputs?: boolean; recordInputs?: boolean },
methodPath: InstrumentedMethod,
params: Record<string, unknown> | undefined,
operationName: string,
Unused params and operationName parameters in wrapReturnValue

Low Severity

The params and operationName parameters are declared in wrapReturnValue (lines 173-174) and passed from instrumentMethod (lines 305-307), but they are never used anywhere in the function body. This appears to be leftover dead code from refactoring.
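If the function body indeed has no remaining use for them, a minimal cleanup sketch — the trimmed signature and call site below are illustrative, not part of this diff:

```typescript
// Sketch only: drop the dead parameters and update the single call site.
function wrapReturnValue<R>(
  result: R & {
    then?: (onFulfilled?: (value: unknown) => unknown, onRejected?: (reason?: unknown) => unknown) => unknown;
    withResponse?: () => unknown;
  },
  span: Span,
  options: { recordOutputs?: boolean; recordInputs?: boolean },
  methodPath: InstrumentedMethod,
  isStreamRequested: boolean,
): R {
  // ...body unchanged...
}

// In instrumentMethod:
// return wrapReturnValue(result, span, options, methodPath, isStreamRequested) as R;
```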


isStreamRequested: boolean,
): R {
const thenable =
result !== null &&
typeof result === 'object' &&
typeof (result as { then?: unknown }).then === 'function'
? (result as unknown as Promise<unknown>)
: Promise.resolve(result);

const chained = isStreamRequested
? thenable.then(
(stream: unknown) =>
instrumentStream(
stream as OpenAIStream<ChatCompletionChunk | ResponseStreamingEvent>,
span,
options.recordOutputs ?? false,
),
(error: unknown) => {
span.setStatus({ code: SPAN_STATUS_ERROR, message: 'internal_error' });
captureException(error, {
mechanism: { handled: false, type: 'auto.ai.openai.stream', data: { function: methodPath } },
});
span.end();
throw error;
},
)
: thenable.then(
(data: unknown) => {
addResponseAttributes(span, data, options.recordOutputs);
span.end();
return data;
},
(error: unknown) => {
captureException(error, {
mechanism: { handled: false, type: 'auto.ai.openai', data: { function: methodPath } },
});
span.end();
throw error;
},

Non-streaming error handler missing span error status

Medium Severity

The non-streaming async error handler (lines 207-213) is missing a call to span.setStatus({ code: SPAN_STATUS_ERROR, message: 'internal_error' }). The streaming error handler (lines 192-199) and the synchronous error handler (lines 293-300) both set the span status on error, but this path was overlooked, so spans for failed non-streaming requests are never marked as errored in Sentry traces.
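A minimal sketch of the corrected rejection handler, mirroring the streaming handler above (a fragment of the `wrapReturnValue` body shown in this diff):

```typescript
(error: unknown) => {
  // Mark the span as errored before capturing, matching the streaming handler.
  span.setStatus({ code: SPAN_STATUS_ERROR, message: 'internal_error' });
  captureException(error, {
    mechanism: { handled: false, type: 'auto.ai.openai', data: { function: methodPath } },
  });
  span.end();
  throw error;
},
```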


);

const wrapper = {
then(onFulfilled?: (value: unknown) => unknown, onRejected?: (reason?: unknown) => unknown) {
return chained.then(onFulfilled, onRejected);
},
catch(onRejected?: (reason?: unknown) => unknown) {
return chained.catch(onRejected);
},
finally(onFinally?: () => void) {
return chained.finally(onFinally);
},
} as unknown as R & { withResponse?: () => unknown };

if (typeof result === 'object' && result !== null && typeof (result as { withResponse?: () => unknown }).withResponse === 'function') {
const withResponseOriginal = (result as { withResponse: () => unknown }).withResponse;
wrapper.withResponse = function withResponse() {
const withResponseResult = withResponseOriginal.call(result);
const withResponseThenable =
withResponseResult !== null &&
typeof withResponseResult === 'object' &&
typeof (withResponseResult as { then?: unknown }).then === 'function'
? (withResponseResult as Promise<{ data: AsyncIterable<unknown>; response: unknown }>)
: Promise.resolve(withResponseResult);

if (isStreamRequested) {
return withResponseThenable.then((payload: { data: AsyncIterable<unknown>; response: unknown }) => ({
data: instrumentStream(
payload.data as OpenAIStream<ChatCompletionChunk | ResponseStreamingEvent>,
span,
options.recordOutputs ?? false,
),
response: payload.response,
}));
}
return withResponseThenable.then((payload: { data: unknown; response: unknown }) => {
addResponseAttributes(span, payload.data, options.recordOutputs);
return payload;
});

Non-streaming .withResponse() adds attributes after span ends

Medium Severity

When a user calls .withResponse() on a non-streaming request, both the chained promise handler (lines 201-206) and the .withResponse() handler (lines 249-252) process the response. The chained handler calls span.end(), but the .withResponse() handler then tries to call addResponseAttributes on the already-ended span. This happens because chained is created eagerly and its handlers run whenever the underlying promise resolves, regardless of whether the user uses .withResponse().
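One way to address this is to finalize the non-streaming span at most once, whichever consumer settles first. A sketch under that assumption — the `finalize` helper and guard flag are illustrative, not code from this PR:

```typescript
// Finalize the non-streaming span exactly once, no matter which path runs first.
let finalized = false;
const finalize = (data: unknown): void => {
  if (finalized) return;
  finalized = true;
  addResponseAttributes(span, data, options.recordOutputs);
  span.end();
};

const chained = thenable.then(
  (data: unknown) => {
    finalize(data);
    return data;
  },
  (error: unknown) => { /* rejection handling as in the diff */ throw error; },
);

// In the non-streaming .withResponse() branch:
// return withResponseThenable.then((payload: { data: unknown; response: unknown }) => {
//   finalize(payload.data);
//   return payload;
// });
```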

};
}

return wrapper as R;
}

/**
* Instrument a method with Sentry spans. Returns the same shape as the original
* (including .withResponse() when present) and runs span logic when the promise settles.
* @see https://docs.sentry.io/platforms/javascript/guides/node/tracing/instrumentation/ai-agents-module/#manual-instrumentation
*/
function instrumentMethod<T extends unknown[], R>(
originalMethod: (...args: T) => Promise<R>,
methodPath: InstrumentedMethod,
context: unknown,
options: OpenAiOptions,
): (...args: T) => Promise<R> {
return async function instrumentedMethod(...args: T): Promise<R> {
): (...args: T) => R {
return function instrumentedMethod(...args: T): R {
const requestAttributes = extractRequestAttributes(args, methodPath);
const model = (requestAttributes[GEN_AI_REQUEST_MODEL_ATTRIBUTE] as string) || 'unknown';
const operationName = getOperationName(methodPath);

const params = args[0] as Record<string, unknown> | undefined;
const isStreamRequested = params && typeof params === 'object' && params.stream === true;

if (isStreamRequested) {
// For streaming responses, use manual span management to properly handle the async generator lifecycle
return startSpanManual(
{
name: `${operationName} ${model} stream-response`,
op: getSpanOperation(methodPath),
attributes: requestAttributes as Record<string, SpanAttributeValue>,
},
async (span: Span) => {
try {
if (options.recordInputs && params) {
addRequestAttributes(span, params, operationName);
}

const result = await originalMethod.apply(context, args);

return instrumentStream(
result as OpenAIStream<ChatCompletionChunk | ResponseStreamingEvent>,
span,
options.recordOutputs ?? false,
) as unknown as R;
} catch (error) {
// For streaming requests that fail before stream creation, we still want to record
// them as streaming requests but end the span gracefully
span.setStatus({ code: SPAN_STATUS_ERROR, message: 'internal_error' });
captureException(error, {
mechanism: {
handled: false,
type: 'auto.ai.openai.stream',
data: {
function: methodPath,
},
},
});
span.end();
throw error;
}
},
);
} else {
// Non-streaming responses
return startSpan(
{
name: `${operationName} ${model}`,
op: getSpanOperation(methodPath),
attributes: requestAttributes as Record<string, SpanAttributeValue>,
},
async (span: Span) => {
try {
if (options.recordInputs && params) {
addRequestAttributes(span, params, operationName);
}

const result = await originalMethod.apply(context, args);
addResponseAttributes(span, result, options.recordOutputs);
return result;
} catch (error) {
captureException(error, {
mechanism: {
handled: false,
type: 'auto.ai.openai',
data: {
function: methodPath,
},
},
});
throw error;
}
},
);
}
const isStreamRequested = !!(params && typeof params === 'object' && params.stream === true);

const spanOptions = {
name: isStreamRequested ? `${operationName} ${model} stream-response` : `${operationName} ${model}`,
op: getSpanOperation(methodPath),
attributes: requestAttributes as Record<string, SpanAttributeValue>,
};

return startSpanManual(spanOptions, (span: Span) => {
if (options.recordInputs && params) {
addRequestAttributes(span, params, operationName);
}
let result: R & {
then?: (onFulfilled?: (value: unknown) => unknown, onRejected?: (reason?: unknown) => unknown) => unknown;
withResponse?: () => unknown;
};
try {
result = originalMethod.apply(context, args) as typeof result;
} catch (error) {
span.setStatus({ code: SPAN_STATUS_ERROR, message: 'internal_error' });
captureException(error, {
mechanism: { handled: false, type: 'auto.ai.openai', data: { function: methodPath } },
});
span.end();
throw error;
Comment on lines +291 to +299

Bug: A synchronous error in originalMethod causes a raw error to be thrown, instead of returning a rejected promise-like object with .withResponse(), breaking the API contract.
Severity: MEDIUM

Suggested Fix

To ensure consistent error handling, the instrumentMethod function should always return a promise-like object. Wrap the call to originalMethod in a Promise.resolve() or modify the structure to ensure that even synchronous errors result in a rejected promise that passes through wrapReturnValue. This maintains the API contract and allows callers to reliably handle all errors with promise-based mechanisms like .catch() and access .withResponse().
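A sketch of that approach, converting a synchronous throw into a rejection that still flows through `wrapReturnValue` (a fragment of `instrumentMethod`; the exact packaging via `Promise.reject` is an assumption, not the PR's code):

```typescript
return startSpanManual(spanOptions, (span: Span) => {
  if (options.recordInputs && params) {
    addRequestAttributes(span, params, operationName);
  }

  let result: R & {
    then?: (onFulfilled?: (value: unknown) => unknown, onRejected?: (reason?: unknown) => unknown) => unknown;
    withResponse?: () => unknown;
  };
  try {
    result = originalMethod.apply(context, args) as typeof result;
  } catch (error) {
    // Re-package the synchronous throw as a rejected thenable so the caller
    // still receives the promise-like wrapper; span status and captureException
    // are then handled by wrapReturnValue's rejection handlers.
    result = Promise.reject(error) as unknown as typeof result;
  }

  return wrapReturnValue(result, span, options, methodPath, params, operationName, isStreamRequested) as R;
});
```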

Prompt for AI Agent

Review the code at the location below. A potential bug has been identified by an AI agent. Verify if this is a real issue. If it is, propose a fix; if not, explain why it's not valid.

Location: packages/core/src/tracing/openai/index.ts#L291-L299

Potential issue: The `instrumentMethod` function uses `startSpanManual`, which re-throws synchronous errors immediately. If the wrapped `originalMethod` (e.g., from the OpenAI SDK) throws a synchronous error, the `try...catch` block within `startSpanManual`'s callback will re-throw it. This prevents the `wrapReturnValue` function from being called. As a result, the caller receives a raw `Error` object instead of a rejected promise-like object that includes the `.withResponse()` method. This is a breaking change from the previous behavior, which always returned a promise, and creates inconsistent error handling between synchronous and asynchronous failures.


}
return wrapReturnValue(
result,
span,
options,
methodPath,
params,
operationName,
isStreamRequested,
) as R;
});
};
}
