
Token usage and GenAI calls not visible in AI Foundry tracing for AI Agents #40176

Open
@sapinderpalsingh

Description


I configured Azure Monitor OpenTelemetry in an AI agent application and set AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED=true in my environment variables, but I still can't see the LLM calls in AI Foundry tracing. The output is much richer when I instead integrate Application Insights with a Semantic Kernel ChatCompletion agent and set SEMANTICKERNEL_EXPERIMENTAL_GENAI_ENABLE_OTEL_DIAGNOSTICS_SENSITIVE=true.

Sample output with SK ChatCompletion agent:

(screenshot)

Sample output with AI agent:

(screenshot)

Keeping or removing AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED=true in the .env file doesn't change the output either way.
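One possible explanation for the .env file having no effect (an assumption, not confirmed in this issue): opt-in flags like AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED are typically read once, when the tracing instrumentation initializes, so changing the variable afterwards, or loading the .env file after the SDK has already been configured, is invisible to the tracer. A minimal stdlib-only sketch of that pattern (the `TracerSetup` class and `content_recording_enabled` helper are illustrative stand-ins, not the SDK's actual internals):

```python
import os

# Start from a clean slate so the demonstration below is deterministic.
os.environ.pop("AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED", None)

def content_recording_enabled() -> bool:
    """Parse the flag the way such opt-in SDK flags are commonly parsed:
    only the literal string 'true' (case-insensitive) enables recording."""
    value = os.environ.get("AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED", "false")
    return value.strip().lower() == "true"

class TracerSetup:
    """Illustrative instrumentor that snapshots the flag at init time;
    later changes to the environment are ignored by this instance."""
    def __init__(self) -> None:
        self.record_content = content_recording_enabled()

# Flag unset when the tracer initializes: content recording stays off.
tracer = TracerSetup()
print(tracer.record_content)          # False

# Setting the variable *after* init does not change the snapshot.
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
print(tracer.record_content)          # still False

# A fresh init (i.e. the variable exported before setup) does pick it up.
print(TracerSetup().record_content)   # True
```

If the SDK behaves this way, the variable needs to be exported (or the .env file loaded, e.g. via `load_dotenv()`) before the Azure Monitor OpenTelemetry configuration and instrumentation code runs.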

Metadata

Labels

- Client: This issue points to a problem in the data-plane of the library.
- Monitor - Exporter: Monitor OpenTelemetry Exporter
- Service Attention: Workflow: this issue is the responsibility of the Azure service team.
- customer-reported: Issues that are reported by GitHub users external to the Azure organization.
- needs-team-attention: Workflow: this issue needs attention from the Azure service team or SDK team.
- question: The issue doesn't require a change to the product in order to be resolved.
