Tracing
Built-in spans for LLM, tool, agent reply, and formatter calls, with error and exception tracking.
AgentScope Studio
Native visualization of traces and token usage when you connect your run to Studio.
Third-Party Export
Send traces to OTLP-compatible backends (e.g. Arize Phoenix, Langfuse, Alibaba Cloud CloudMonitor).
Connecting to AgentScope Studio or a third-party tracing endpoint is done at application startup via agentscope.init. See Settings for other init options.
Overview
Observability in AgentScope is implemented with OpenTelemetry. The framework instruments:

- LLM calls — each model `__call__` (chat, streaming, tools)
- Agent replies — each agent `reply` (reasoning and acting)
- Formatters — message formatting before sending to the model
- Tools — toolkit `call_tool_function` invocations
- Embeddings — embedding model calls
Setting Up Tracing
AgentScope Studio
When you run agents with AgentScope Studio, you get built-in trace visualization and token usage views. Configure the connection at the start of your application.

Third-Party Platforms

To send traces to an OpenTelemetry-compatible backend (or your own OTLP collector), set tracing_url in agentscope.init. The URL must be the OTLP trace endpoint.

If you pass both studio_url and tracing_url, traces are sent to tracing_url; if you pass only studio_url, traces are sent to Studio's tracing endpoint.
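A minimal startup sketch — studio_url and tracing_url are the agentscope.init parameters described above; the endpoint URLs below are placeholders for your own deployments:

```python
import agentscope

# Connect tracing at application startup. Pass either parameter alone,
# or both (in which case traces go to tracing_url and Studio keeps the UI).
agentscope.init(
    studio_url="http://localhost:3000",             # AgentScope Studio (placeholder)
    tracing_url="http://localhost:4318/v1/traces",  # OTLP trace endpoint (placeholder)
)
```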
Use either Studio (studio_url) or a third-party backend (tracing_url) for trace export, or both when you want Studio for the UI and a separate backend for storage and analysis.

Connecting to Third-Party Backends
The following examples show how to point AgentScope at popular OTLP-compatible backends.

Alibaba Cloud CloudMonitor
Alibaba Cloud CloudMonitor supports OTLP. Use the public endpoint for your region from the ARMS console (Access Center > OpenTelemetry). You can set the service name with the OTEL_SERVICE_NAME environment variable.
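A configuration sketch — the endpoint below is a placeholder; copy the real OTLP trace endpoint for your region from the ARMS console:

```python
import os

import agentscope

# Optional: the service name shown in CloudMonitor.
os.environ["OTEL_SERVICE_NAME"] = "my-agent-app"

# Placeholder endpoint -- replace with the public OTLP trace endpoint
# for your region (ARMS console: Access Center > OpenTelemetry).
agentscope.init(
    tracing_url="https://<your-region-endpoint>/api/otlp/traces",
)
```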
Arize Phoenix
Arize Phoenix accepts OTLP. Set PHOENIX_API_KEY in your environment and pass Phoenix's trace endpoint:
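A configuration sketch — the host below assumes Phoenix Cloud; for a self-hosted Phoenix instance, substitute your own host, and replace the placeholder API key with your real one:

```python
import os

import agentscope

# API key from your Phoenix account (placeholder value).
os.environ["PHOENIX_API_KEY"] = "your-phoenix-api-key"

# Phoenix's OTLP trace endpoint (placeholder host for self-hosted setups).
agentscope.init(
    tracing_url="https://app.phoenix.arize.com/v1/traces",
)
```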
Langfuse
Langfuse supports OTLP. Configure LANGFUSE_PUBLIC_KEY and LANGFUSE_SECRET_KEY, then set the Authorization=Basic ... header for the OTLP exporter:
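A sketch of the auth setup, assuming Langfuse's HTTP Basic scheme (base64 of public_key:secret_key) and placeholder keys; the OTEL_EXPORTER_OTLP_TRACES_HEADERS variable is the standard OpenTelemetry way to attach headers to the trace exporter:

```python
import base64
import os

# Placeholder keys -- substitute your real Langfuse project keys.
os.environ.setdefault("LANGFUSE_PUBLIC_KEY", "pk-lf-1234")
os.environ.setdefault("LANGFUSE_SECRET_KEY", "sk-lf-5678")

# Langfuse's OTLP endpoint authenticates with HTTP Basic auth:
# Authorization: Basic base64("<public_key>:<secret_key>")
credentials = (
    os.environ["LANGFUSE_PUBLIC_KEY"] + ":" + os.environ["LANGFUSE_SECRET_KEY"]
)
token = base64.b64encode(credentials.encode("utf-8")).decode("ascii")

# The OTLP exporter picks up extra headers from this standard variable.
os.environ["OTEL_EXPORTER_OTLP_TRACES_HEADERS"] = "Authorization=Basic " + token
```

Afterwards, call agentscope.init with your Langfuse OTLP trace endpoint as tracing_url (check the Langfuse docs for the exact path for your region or self-hosted instance).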
Customizing Tracing
AgentScope’s tracing is implemented with OpenTelemetry; custom spans you create with the OpenTelemetry SDK will appear in the same trace tree. In addition, the following decorators are available to trace framework components:

| Decorator | Target | Description |
|---|---|---|
| `@trace_llm` | `ChatModelBase.__call__` | Traces LLM invocations (chat, stream, tools). |
| `@trace_reply` | `AgentBase.reply` | Traces agent replies (reasoning and acting). |
| `@trace_format` | `FormatterBase.format` | Traces message formatting. |
| `@trace_toolkit` | `Toolkit.call_tool_function` | Traces tool calls. |
| `@trace_embedding` | Embedding model `__call__` | Traces embedding API calls. |
| `@trace(name="...")` | Any function | General-purpose tracer for sync/async functions and generators. |
Tracing a Custom Chat Model
Your custom model must inherit from ChatModelBase. Apply @trace_llm to __call__ so its calls appear in traces:
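A minimal sketch — the import paths (agentscope.model, agentscope.tracing) and the ChatResponse content shape are assumptions; check your AgentScope version for the exact modules:

```python
from agentscope.model import ChatModelBase, ChatResponse
from agentscope.tracing import trace_llm


class EchoChatModel(ChatModelBase):
    """A toy model that echoes the last message back."""

    @trace_llm  # records the call (chat, stream, tools) as an LLM span
    async def __call__(self, messages, **kwargs) -> ChatResponse:
        last = messages[-1]["content"] if messages else ""
        return ChatResponse(
            content=[{"type": "text", "text": f"Echo: {last}"}],
        )
```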
Tracing a Custom Agent
Your custom agent must inherit from AgentBase. Apply @trace_reply to reply:
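A minimal sketch — the import paths (agentscope.agent, agentscope.message, agentscope.tracing) are assumptions; adapt to your AgentScope version:

```python
from agentscope.agent import AgentBase
from agentscope.message import Msg
from agentscope.tracing import trace_reply


class ParrotAgent(AgentBase):
    """A toy agent that repeats what it hears."""

    @trace_reply  # records the reply (reasoning and acting) as a span
    async def reply(self, msg: Msg) -> Msg:
        return Msg(name="parrot", content=msg.content, role="assistant")
```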
Tracing a Custom Formatter
Your custom formatter must inherit from FormatterBase. Apply @trace_format to format:
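A minimal sketch — the import paths (agentscope.formatter, agentscope.tracing) are assumptions; adapt to your AgentScope version:

```python
from agentscope.formatter import FormatterBase
from agentscope.tracing import trace_format


class SimpleDictFormatter(FormatterBase):
    """Formats Msg objects into plain role/content dicts."""

    @trace_format  # records the formatting input and output as a span
    async def format(self, msgs) -> list[dict]:
        return [{"role": m.role, "content": m.content} for m in msgs]
```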
General-Purpose Tracing
Use @trace(name="...") on any function—sync or async, including generators—to add a span with the given name:
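A sketch of both variants — the import path agentscope.tracing is an assumption, and the function names are arbitrary examples:

```python
from agentscope.tracing import trace


@trace(name="load-knowledge")  # emits a span named "load-knowledge"
def load_knowledge(path: str) -> list[str]:
    with open(path) as f:
        return f.read().splitlines()


@trace(name="stream-chunks")  # generators are traced too
def stream_chunks(text: str):
    for word in text.split():
        yield word
```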
Token Usage
Token usage is tracked by the model layer and is included in trace metadata where supported. When you connect to AgentScope Studio, token consumption is visualized in the Studio UI so you can monitor cost and usage per run. For third-party backends, token-related attributes are exported with the LLM spans according to the OpenTelemetry semantic conventions used by AgentScope. For programmatic access to usage after a model call, use the usage field on the ChatResponse returned by the model.
Summary
- Call agentscope.init(studio_url=…) for Studio tracing and token visualization, or agentscope.init(tracing_url=…) for an OTLP backend (or both).
- Use AgentScope Studio for built-in trace and token usage views.
- Use tracing_url to send traces to Arize Phoenix, Langfuse, Alibaba Cloud CloudMonitor, or any OTLP endpoint.
- Use @trace_llm, @trace_reply, @trace_format, @trace_toolkit, @trace_embedding, and @trace when implementing custom models, agents, or formatters so they appear in the same trace tree.