Which component is this bug for?
OpenAI Agents Instrumentation (opentelemetry-instrumentation-openai-agents)
📜 Description
The `opentelemetry-instrumentation-openai-agents` package does not capture `response.instructions` from the OpenAI Responses API. System prompts/instructions are never emitted as span attributes on generation spans.
In `_hooks.py`, the `_extract_response_attributes()` function reads `temperature`, `max_output_tokens`, `top_p`, `model`, `frequency_penalty`, `output`, and `usage` from the response object — but never reads `response.instructions`.
The vanilla `opentelemetry-instrumentation-openai` package handles this correctly in `responses_wrappers.py`:
```python
if traced_response.instructions:
    _set_span_attribute(
        span,
        f"{GenAIAttributes.GEN_AI_PROMPT}.{prompt_index}.content",
        traced_response.instructions,
    )
    _set_span_attribute(
        span,
        f"{GenAIAttributes.GEN_AI_PROMPT}.{prompt_index}.role",
        "system",
    )
```
👟 Reproduction steps
- Create an agent with instructions and run it:

```python
from agents import Agent, Runner

agent = Agent(
    name="my_agent",
    instructions="You are a helpful assistant that always responds in JSON.",
    model="gpt-4o-mini",
)
result = Runner.run_sync(agent, "Hello")
```
- Instrument with OpenLLMetry:

```python
from opentelemetry.instrumentation.openai_agents import OpenAIAgentsInstrumentor

OpenAIAgentsInstrumentor().instrument()
```
- Set `TRACELOOP_TRACE_CONTENT=true`.
- Inspect the generation span attributes — no `gen_ai.prompt` entry with `role: system` exists.
👍 Expected behavior
Generation spans should include the agent's instructions as the first prompt entry:
```
gen_ai.prompt.0.role = "system"
gen_ai.prompt.0.content = "You are a helpful assistant that always responds in JSON."
```
This is how the vanilla `opentelemetry-instrumentation-openai` package already handles it.
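The expected mapping from a message list to the flat `gen_ai.prompt.<i>.*` attribute keys can be sketched as follows. This is an illustrative helper, not part of either package; the real instrumentation builds these keys internally:

```python
def prompt_attributes(messages):
    """Flatten chat messages into gen_ai.prompt.<i>.role / .content
    attribute keys, following the convention shown above."""
    attrs = {}
    for i, msg in enumerate(messages):
        attrs[f"gen_ai.prompt.{i}.role"] = msg["role"]
        attrs[f"gen_ai.prompt.{i}.content"] = msg["content"]
    return attrs


# With the fix, the system prompt becomes prompt entry 0.
attrs = prompt_attributes(
    [
        {"role": "system", "content": "You are a helpful assistant that always responds in JSON."},
        {"role": "user", "content": "Hello"},
    ]
)
```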
👎 Actual Behavior with Screenshots
Generation spans contain `gen_ai.prompt` entries for the message history (user messages, tool calls, etc.) but no system prompt entry. The `response.instructions` field is completely ignored.
🤖 Python Version
3.12
📃 Provide any additional context for the Bug.
The fix is straightforward. In `on_span_end` in `_hooks.py`, when handling `GenerationSpanData` / `ResponseSpanData`, prepend `response.instructions` as a system message before calling `_extract_prompt_attributes`:
```python
response = getattr(span_data, "response", None)
if trace_content and response and hasattr(response, "instructions") and response.instructions:
    system_msg = {"role": "system", "content": response.instructions}
    input_data = [system_msg] + (input_data if input_data else [])
_extract_prompt_attributes(otel_span, input_data, trace_content)
```
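Pulled out as a standalone function, the prepend logic behaves like this. The helper name and the `SimpleNamespace` stand-in for the response object are hypothetical; the actual fix sits inline in `_hooks.py`:

```python
from types import SimpleNamespace


def prepend_system_instructions(input_data, response, trace_content):
    """Return input_data with response.instructions prepended as a system
    message, mirroring the proposed fix. Name is illustrative only."""
    instructions = getattr(response, "instructions", None)
    if not (trace_content and instructions):
        return input_data or []
    return [{"role": "system", "content": instructions}] + (input_data or [])


# Stand-in for the Responses API response object.
response = SimpleNamespace(instructions="You are a helpful assistant that always responds in JSON.")
messages = prepend_system_instructions([{"role": "user", "content": "Hello"}], response, True)
```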
👀 Have you spent some time to check if this bug has been raised before?
Are you willing to submit PR?
Yes I am willing to submit a PR!