The Tool system is a core component of the tRPC-Agent-Go framework, enabling Agents to interact with external services and functions. The framework supports multiple tool types, including Function Tools and external tools integrated via the MCP (Model Context Protocol) standard.
Overview
Key Features

Multiple Tool Types: Supports Function Tools and MCP standard tools.
Streaming Responses: Supports both real-time streaming responses and normal responses.
Parallel Execution: Tool invocations support parallel execution to improve performance.
MCP Protocol: Full support for STDIO, SSE, and Streamable HTTP transports.
Configuration Support: Provides configuration options and filter support.
Core Concepts
Tool
A Tool is an abstraction of a single capability that implements the tool.Tool interface. Each Tool provides specific functionality such as mathematical calculation, search, time query, etc.
ToolSet

A ToolSet is a collection of related tools that implements the tool.ToolSet interface. A ToolSet manages the lifecycle of tools, connections, and resource cleanup.
```go
type ToolSet interface {
	// Tools returns the current tools in this set.
	Tools(context.Context) []tool.Tool
	// Close releases any resources held by the ToolSet.
	Close() error
	// Name returns the identifier of the ToolSet, used for
	// identification and conflict resolution.
	Name() string
}
```
Relationship between Tool and ToolSet:
One "Tool" = one concrete capability (e.g., calculator).
One "ToolSet" = a group of related Tools (e.g., all tools provided by an MCP server).
An Agent can use multiple Tools and multiple ToolSets simultaneously.
Function Tool

A Function Tool wraps an ordinary Go function, with typed, JSON-tagged parameters, as a tool:

```go
import "trpc.group/trpc-go/trpc-agent-go/tool/function"

// 1. Define a tool function.
func calculator(ctx context.Context, req struct {
	Operation string  `json:"operation"`
	A         float64 `json:"a"`
	B         float64 `json:"b"`
}) (map[string]interface{}, error) {
	switch req.Operation {
	case "add":
		return map[string]interface{}{"result": req.A + req.B}, nil
	case "multiply":
		return map[string]interface{}{"result": req.A * req.B}, nil
	default:
		return nil, fmt.Errorf("unsupported operation: %s", req.Operation)
	}
}

// 2. Create the tool.
calculatorTool := function.NewFunctionTool(
	calculator,
	function.WithName("calculator"),
	function.WithDescription("Perform mathematical operations."),
)

// 3. Integrate into an Agent.
agent := llmagent.New("math-assistant",
	llmagent.WithModel(model),
	llmagent.WithTools([]tool.Tool{calculatorTool}),
)
```

Streaming Tool Support

The framework supports streaming tools to provide real-time responses:
```go
// 1. Define input and output structures.
type weatherInput struct {
	Location string `json:"location"`
}

type weatherOutput struct {
	Weather string `json:"weather"`
}

// 2. Implement the streaming tool function.
func getStreamableWeather(input weatherInput) *tool.StreamReader {
	stream := tool.NewStream(10)
	go func() {
		defer stream.Writer.Close()
		// Simulate progressively returning weather data.
		result := "Sunny, 25°C in " + input.Location
		for i := 0; i < len(result); i++ {
			chunk := tool.StreamChunk{
				Content:  weatherOutput{Weather: result[i : i+1]},
				Metadata: tool.Metadata{CreatedAt: time.Now()},
			}
			if closed := stream.Writer.Send(chunk, nil); closed {
				break
			}
			time.Sleep(10 * time.Millisecond) // Simulate latency.
		}
	}()
	return stream.Reader
}

// 3. Create the streaming tool.
weatherStreamTool := function.NewStreamableFunctionTool[weatherInput, weatherOutput](
	getStreamableWeather,
	function.WithName("get_weather_stream"),
	function.WithDescription("Get weather information as a stream."),
)

// 4. Use the streaming tool.
reader, err := weatherStreamTool.StreamableCall(ctx, jsonArgs)
if err != nil {
	return err
}
// Receive streaming data.
for {
	chunk, err := reader.Recv()
	if err == io.EOF {
		break // End of stream.
	}
	if err != nil {
		return err
	}
	// Process each chunk.
	fmt.Printf("Received: %v\n", chunk.Content)
}
reader.Close()
```
Built-in Tools
DuckDuckGo Search Tool
The DuckDuckGo tool is based on the DuckDuckGo Instant Answer API and provides factual and encyclopedia-style information search capabilities.
```go
import "trpc.group/trpc-go/trpc-agent-go/tool/duckduckgo"

// Create a DuckDuckGo search tool.
searchTool := duckduckgo.NewTool()

// Integrate into an Agent.
searchAgent := llmagent.New("search-assistant",
	llmagent.WithModel(model),
	llmagent.WithTools([]tool.Tool{searchTool}),
)
```
MCP Tools

MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. MCP tools are based on JSON-RPC 2.0 and provide standardized integration with external services for Agents.
MCP ToolSet Features:

Unified interface: All MCP tools are created via mcp.NewMCPToolSet().
Explicit initialization: (*mcp.ToolSet).Init(ctx) lets you fail fast on MCP connection / tool loading errors during startup.
Multiple transports: Supports STDIO, SSE, and Streamable HTTP.
Tool filters: Supports including/excluding specific tools.
```go
import "trpc.group/trpc-go/trpc-agent-go/tool/mcp"

// Create an MCP ToolSet (STDIO example).
mcpToolSet := mcp.NewMCPToolSet(
	mcp.ConnectionConfig{
		Transport: "stdio", // Transport method.
		Command:   "go",    // Command to execute.
		Args:      []string{"run", "./stdio_server/main.go"},
		Timeout:   10 * time.Second,
	},
	mcp.WithToolFilter(mcp.NewIncludeFilter("echo", "add")), // Optional: tool filter.
)

// (Optional but recommended) Explicitly initialize MCP: connect + initialize + list tools.
if err := mcpToolSet.Init(ctx); err != nil {
	log.Fatalf("failed to initialize MCP toolset: %v", err)
}

// Integrate into an Agent.
agent := llmagent.New("mcp-assistant",
	llmagent.WithModel(model),
	llmagent.WithToolSets([]tool.ToolSet{mcpToolSet}),
)
```
Transport Configuration
MCP ToolSet supports three transports via the Transport field:
1. STDIO Transport: communicates with external processes via standard input/output; suitable for local scripts and CLI tools (see the STDIO example above).
2. SSE Transport: communicates with a remote MCP server over Server-Sent Events.
3. Streamable HTTP Transport: communicates with a remote MCP server over streamable HTTP:

```go
mcpToolSet := mcp.NewMCPToolSet(
	mcp.ConnectionConfig{
		Transport: "streamable_http", // Use the full name.
		ServerURL: "http://localhost:3000/mcp",
		Timeout:   10 * time.Second,
	},
)
```
Session Reconnection Support
MCP ToolSet supports automatic session reconnection to recover from server restarts or session expiration.
```go
// SSE/Streamable HTTP transports support session reconnection.
sseToolSet := mcp.NewMCPToolSet(
	mcp.ConnectionConfig{
		Transport: "sse",
		ServerURL: "http://localhost:8080/sse",
		Timeout:   10 * time.Second,
	},
	mcp.WithSessionReconnect(3), // Enable session reconnection with max 3 attempts.
)
```
Reconnection Features:

Auto Reconnect: Automatically recreates the session when connection loss or expiration is detected.
Independent Retries: Each tool call gets independent reconnection attempts.
Conservative Strategy: Only triggers reconnection for clear connection/session errors, to avoid infinite loops.
Dynamic MCP Tool Discovery (LLMAgent Option)
For MCP ToolSets, the list of tools on the server side can change over
time (for example, when a new MCP tool is registered). To let an
LLMAgent automatically see the latest tools from a ToolSet on each
run, use llmagent.WithRefreshToolSetsOnRun(true) together with
WithToolSets.
```go
import (
	"trpc.group/trpc-go/trpc-agent-go/agent/llmagent"
	"trpc.group/trpc-go/trpc-agent-go/model/openai"
	"trpc.group/trpc-go/trpc-agent-go/tool"
	"trpc.group/trpc-go/trpc-agent-go/tool/mcp"
)

// 1. Create an MCP ToolSet (can be STDIO, SSE, or Streamable HTTP).
mcpToolSet := mcp.NewMCPToolSet(connectionConfig)

// 2. Create an LLMAgent and enable dynamic ToolSets refresh.
agent := llmagent.New("mcp-assistant",
	llmagent.WithModel(openai.New("gpt-4o-mini")),
	llmagent.WithToolSets([]tool.ToolSet{mcpToolSet}),
	llmagent.WithRefreshToolSetsOnRun(true),
)
```
When WithRefreshToolSetsOnRun(true) is enabled:
Each time the LLMAgent builds its tool list, it calls
ToolSet.Tools(context.Background()) again.
If the MCP server adds or removes tools, the next run of this
LLMAgent will use the updated tool list automatically.
This option focuses on dynamic discovery of tools. If you also need
per-request HTTP headers (for example, authentication headers that come
from context.Context), keep using the pattern shown in the
examples/mcptool/http_headers example, where you manually call
toolSet.Tools(ctx) and pass the tools via WithTools.
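The manual pattern can be sketched as follows. How headers travel from the context to the MCP request depends on the ToolSet configuration, and withAuthHeaders is a hypothetical helper used only for illustration; mcpToolSet.Tools(ctx) and llmagent.WithTools are the framework calls named above.

```go
// Build a request-scoped context carrying auth headers.
// withAuthHeaders is hypothetical; see examples/mcptool/http_headers
// for the actual mechanism.
ctx := withAuthHeaders(context.Background(), map[string]string{
	"Authorization": "Bearer " + token,
})

// List tools with the request-scoped context so per-request headers apply.
tools := mcpToolSet.Tools(ctx)

// Pass the resolved tools statically instead of using WithToolSets.
agent := llmagent.New("mcp-assistant",
	llmagent.WithModel(model),
	llmagent.WithTools(tools),
)
```

The trade-off: this snapshot of tools is taken once per request, so you give up the automatic refresh that WithRefreshToolSetsOnRun provides.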
Agent Tool (AgentTool)
AgentTool lets you expose an existing Agent as a tool to be used by a parent Agent. Compared with a plain function tool, AgentTool provides:
Reuse: Wrap complex Agent capabilities as a standard tool.
Streaming: Optionally forward the child Agent's streaming events inline to the parent flow.
Control: Options to skip post-tool summarization and to enable/disable inner forwarding.
```go
import (
	"trpc.group/trpc-go/trpc-agent-go/agent/llmagent"
	"trpc.group/trpc-go/trpc-agent-go/model"
	"trpc.group/trpc-go/trpc-agent-go/tool"
	agenttool "trpc.group/trpc-go/trpc-agent-go/tool/agent"
)

// 1) Define a reusable child Agent (streaming recommended).
mathAgent := llmagent.New("math-specialist",
	llmagent.WithModel(modelInstance),
	llmagent.WithInstruction("You are a math specialist..."),
	llmagent.WithGenerationConfig(model.GenerationConfig{Stream: true}),
)

// 2) Wrap as an Agent tool.
mathTool := agenttool.NewTool(mathAgent,
	// Optional, defaults to false. When set to true, the outer model summary
	// is skipped and the current round ends directly after tool.response.
	agenttool.WithSkipSummarization(false),
	// Forward child Agent streaming events to the parent flow.
	agenttool.WithStreamInner(true),
)

// 3) Use in the parent Agent.
parent := llmagent.New("assistant",
	llmagent.WithModel(modelInstance),
	llmagent.WithGenerationConfig(model.GenerationConfig{Stream: true}),
	llmagent.WithTools([]tool.Tool{mathTool}),
)
```
Streaming Inner Forwarding
When WithStreamInner(true) is enabled, AgentTool forwards child Agent events to the parent flow as they happen:
Forwarded items are actual event.Event instances, carrying incremental text in choice.Delta.Content
To avoid duplication, the child Agent's final full message is not forwarded again; it is aggregated into the final tool.response content for the next LLM turn (to satisfy providers requiring tool messages)
UI guidance: show forwarded child deltas; avoid printing the aggregated final tool.response content unless debugging
Example: Only show tool fragments when needed to avoid duplicates
```go
if ev.Response != nil && ev.Object == model.ObjectTypeToolResponse {
	// Tool response contains aggregated content; skip printing by
	// default to avoid duplicates.
}
// Child Agent forwarded deltas (author != parent).
if ev.Author != parentName && len(ev.Choices) > 0 {
	if delta := ev.Choices[0].Delta.Content; delta != "" {
		fmt.Print(delta)
	}
}
```
Options
WithSkipSummarization(bool):
false (default): Allow an additional summarization/answer call after the tool result
true: Skip the outer summarization LLM call once the tool returns
WithStreamInner(bool):
true: Forward child Agent events to the parent flow (recommended: enable GenerationConfig{Stream: true} for both parent and child Agents)
false: Treat as a callable-only tool, without inner event forwarding
WithHistoryScope(HistoryScope):
HistoryScopeIsolated (default): Keep the child Agent fully isolated; it only sees the current tool arguments (no inherited history).
HistoryScopeParentBranch: Inherit parent conversation history by using a hierarchical filter key parent/child-uuid. This allows the content processor to include parent events via prefix matching while keeping child events isolated under a sub-branch. Typical use cases: "edit/optimize/continue previous output".
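As a minimal sketch, enabling parent-history inheritance might look like this, assuming a child editorAgent already exists and that the option and constants are exported from the agenttool package under the names given above:

```go
// Wrap the child Agent so it can see the parent conversation history.
editTool := agenttool.NewTool(editorAgent,
	agenttool.WithStreamInner(true),
	// HistoryScopeParentBranch lets the child inherit parent events via
	// prefix matching on the hierarchical filter key, while the child's
	// own events stay isolated under its sub-branch.
	agenttool.WithHistoryScope(agenttool.HistoryScopeParentBranch),
)
```

With the default HistoryScopeIsolated, the same child would see only the tool arguments of the current call.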
Combining Tools and ToolSets

```go
import (
	"trpc.group/trpc-go/trpc-agent-go/agent/llmagent"
	"trpc.group/trpc-go/trpc-agent-go/tool"
	"trpc.group/trpc-go/trpc-agent-go/tool/duckduckgo"
	"trpc.group/trpc-go/trpc-agent-go/tool/function"
	"trpc.group/trpc-go/trpc-agent-go/tool/mcp"
)

// Create function tools.
calculatorTool := function.NewFunctionTool(calculator,
	function.WithName("calculator"),
	function.WithDescription("Perform basic mathematical operations."))
timeTool := function.NewFunctionTool(getCurrentTime,
	function.WithName("current_time"),
	function.WithDescription("Get the current time."))

// Create a built-in tool.
searchTool := duckduckgo.NewTool()

// Create MCP ToolSets (examples for different transports).
stdioToolSet := mcp.NewMCPToolSet(mcp.ConnectionConfig{
	Transport: "stdio",
	Command:   "python",
	Args:      []string{"-m", "my_mcp_server"},
	Timeout:   10 * time.Second,
})
if err := stdioToolSet.Init(ctx); err != nil {
	return fmt.Errorf("failed to initialize stdio MCP toolset: %w", err)
}
sseToolSet := mcp.NewMCPToolSet(mcp.ConnectionConfig{
	Transport: "sse",
	ServerURL: "http://localhost:8080/sse",
	Timeout:   10 * time.Second,
})
if err := sseToolSet.Init(ctx); err != nil {
	return fmt.Errorf("failed to initialize sse MCP toolset: %w", err)
}
streamableToolSet := mcp.NewMCPToolSet(mcp.ConnectionConfig{
	Transport: "streamable_http",
	ServerURL: "http://localhost:3000/mcp",
	Timeout:   10 * time.Second,
})
if err := streamableToolSet.Init(ctx); err != nil {
	return fmt.Errorf("failed to initialize streamable MCP toolset: %w", err)
}

// Create an Agent and integrate all tools.
agent := llmagent.New("ai-assistant",
	llmagent.WithModel(model),
	llmagent.WithInstruction("You are a helpful AI assistant that can use various tools to help users."),
	// Add single tools (Tool interface).
	llmagent.WithTools([]tool.Tool{calculatorTool, timeTool, searchTool}),
	// Add ToolSets (ToolSet interface).
	llmagent.WithToolSets([]tool.ToolSet{stdioToolSet, sseToolSet, streamableToolSet}),
)
```
MCP Tool Filters
MCP ToolSets support filtering tools at creation time. It's recommended to use the unified tool.FilterFunc interface:
```go
import (
	"trpc.group/trpc-go/trpc-agent-go/tool"
	"trpc.group/trpc-go/trpc-agent-go/tool/mcp"
)

// Recommended: use the unified filter interface.
includeFilter := tool.NewIncludeToolNamesFilter("get_weather", "get_news", "calculator")
excludeFilter := tool.NewExcludeToolNamesFilter("deprecated_tool", "slow_tool")

// Apply the filter.
toolSet := mcp.NewMCPToolSet(connectionConfig,
	mcp.WithToolFilterFunc(includeFilter),
)

// Optional: initialize once at startup to catch MCP connection / tool loading errors early.
if err := toolSet.Init(ctx); err != nil {
	return fmt.Errorf("failed to initialize MCP toolset: %w", err)
}
```
Per-Run Tool Filtering
Per-run tool filtering enables dynamic control of tool availability for each runner.Run invocation without modifying the Agent configuration. This is a "soft constraint" mechanism for optimizing token consumption and implementing role-based tool access control. There are two ways to configure it:

Option one: pass a filter via agent.WithToolFilter when calling runner.Run; it applies to all Agents in that invocation.
Option two: configure the filter through llmagent.WithToolFilter so that it applies only to the current Agent.
Key Features:

Per-Run Control: Independent configuration per invocation, no Agent modification needed.
Cost Optimization: Reduce the tool descriptions sent to the LLM, lowering token costs.
1. Exclude Specific Tools (Exclude Filter)

Use a blacklist approach to keep all tools available except the specified ones:

```go
import "trpc.group/trpc-go/trpc-agent-go/tool"

// Option 1: per-run filter.
// Exclude text_tool and dangerous_tool; all other tools stay available.
filter := tool.NewExcludeToolNamesFilter("text_tool", "dangerous_tool")
eventChan, err := runner.Run(ctx, userID, sessionID, message,
	agent.WithToolFilter(filter),
)

// Option 2: per-Agent filter.
agent := llmagent.New("ai-assistant",
	llmagent.WithModel(model),
	llmagent.WithInstruction("You are a helpful AI assistant that can use various tools to help users."),
	llmagent.WithTools([]tool.Tool{calculatorTool, timeTool, searchTool}),
	llmagent.WithToolSets([]tool.ToolSet{stdioToolSet, sseToolSet, streamableToolSet}),
	llmagent.WithToolFilter(filter),
)
```
2. Include Only Specific Tools (Include Filter)
Use whitelist approach to allow only specified tools:
```go
// Only allow the calculator and time tools.
filter := tool.NewIncludeToolNamesFilter("calculator", "time_tool")
eventChan, err := runner.Run(ctx, userID, sessionID, message,
	agent.WithToolFilter(filter),
)
```
3. Custom Filtering Logic (Custom FilterFunc)
Implement custom filter function for complex filtering logic:
```go
// Option 1: per-run custom filter.
// Only allow tools whose names start with "safe_".
filter := func(ctx context.Context, t tool.Tool) bool {
	declaration := t.Declaration()
	if declaration == nil {
		return false
	}
	return strings.HasPrefix(declaration.Name, "safe_")
}
eventChan, err := runner.Run(ctx, userID, sessionID, message,
	agent.WithToolFilter(filter),
)

// Option 2: per-Agent custom filter.
agent := llmagent.New("ai-assistant",
	llmagent.WithModel(model),
	llmagent.WithInstruction("You are a helpful AI assistant that can use various tools to help users."),
	llmagent.WithTools([]tool.Tool{calculatorTool, timeTool, searchTool}),
	llmagent.WithToolSets([]tool.ToolSet{stdioToolSet, sseToolSet, streamableToolSet}),
	llmagent.WithToolFilter(filter),
)
```
4. Per-Agent Filtering
Use agent.InvocationFromContext to implement different tool sets for different Agents:
```go
// Define the allowed tools for each Agent.
agentAllowedTools := map[string]map[string]bool{
	"math-agent": {"calculator": true},
	"time-agent": {"time_tool": true},
}

// Custom filter function: filter based on the current Agent name.
filter := func(ctx context.Context, t tool.Tool) bool {
	declaration := t.Declaration()
	if declaration == nil {
		return false
	}
	toolName := declaration.Name

	// Get the current Agent information from the context.
	inv, ok := agent.InvocationFromContext(ctx)
	if !ok || inv == nil {
		return true // Fallback: allow all tools.
	}
	agentName := inv.AgentName

	// Check whether this tool is in the current Agent's allowed list.
	allowedTools, exists := agentAllowedTools[agentName]
	if !exists {
		return true // Fallback: allow all tools.
	}
	return allowedTools[toolName]
}

eventChan, err := runner.Run(ctx, userID, sessionID, message,
	agent.WithToolFilter(filter),
)
```
Complete Example: See examples/toolfilter/ directory
Smart Filtering Mechanism
The framework automatically distinguishes user tools from framework tools, filtering only user tools:
| Tool Category | Includes | Filtered? |
| --- | --- | --- |
| User Tools | Tools registered via WithTools; tools registered via WithToolSets | Subject to filtering |
| Framework Tools | transfer_to_agent (multi-Agent coordination); knowledge_search (knowledge base retrieval); agentic_knowledge_search | Never filtered |
```go
// The Agent registers multiple tools.
agent := llmagent.New("assistant",
	llmagent.WithTools([]tool.Tool{
		calculatorTool, // User tool.
		textTool,       // User tool.
	}),
	llmagent.WithSubAgents([]agent.Agent{subAgent1, subAgent2}), // Auto-adds transfer_to_agent.
	llmagent.WithKnowledge(kb),                                  // Auto-adds knowledge_search.
)

// Runtime filtering: only allow calculator.
filter := tool.NewIncludeToolNamesFilter("calculator")
runner.Run(ctx, userID, sessionID, message,
	agent.WithToolFilter(filter),
)

// Tools actually sent to the LLM:
// calculator        -> kept (user tool, in the allowed list).
// textTool          -> filtered out (user tool, not in the list).
// transfer_to_agent -> kept (framework tool, auto-preserved).
// knowledge_search  -> kept (framework tool, auto-preserved).
```
Important Notes
Security Notice: Per-run tool filtering is a "soft constraint" primarily for optimization and user experience. Tools must still implement their own authorization logic:
Reason: LLMs may know about tool existence and usage from context or memory and attempt to call them. Tool filtering reduces this possibility but cannot completely prevent it.
Runtime ToolSet Management

WithToolSets is a static configuration: it wires ToolSets when constructing the Agent. In many real-world scenarios you also need to add, remove, or replace ToolSets at runtime without recreating the Agent.
LLMAgent exposes three methods for this:
AddToolSet(toolSet tool.ToolSet): add or replace a ToolSet by ToolSet.Name().
SetToolSets(toolSets []tool.ToolSet): replace all ToolSets at once.
RemoveToolSet(name string): remove a ToolSet by name; returns whether one was removed.
```go
// 1. Create the Agent with base tools only.
agent := llmagent.New("dynamic-assistant",
	llmagent.WithModel(model),
	llmagent.WithTools([]tool.Tool{calculatorTool}),
)

// 2. Later, attach an MCP ToolSet at runtime.
mcpToolSet := mcp.NewMCPToolSet(connectionConfig)
if err := mcpToolSet.Init(ctx); err != nil {
	return fmt.Errorf("failed to init MCP ToolSet: %w", err)
}
agent.AddToolSet(mcpToolSet)

// 3. Replace all ToolSets from configuration (declarative control plane).
toolSetsFromConfig := []tool.ToolSet{mcpToolSet, fileToolSet}
agent.SetToolSets(toolSetsFromConfig)

// 4. Remove a ToolSet by name (e.g., feature rollback).
removed := agent.RemoveToolSet(mcpToolSet.Name())
if !removed {
	log.Printf("ToolSet %q not found", mcpToolSet.Name())
}
```
Runtime ToolSet updates integrate seamlessly with the tool filtering logic described earlier:
Tools coming from WithTools or any ToolSet (including dynamically added ones) are treated as user tools and are subject to WithToolFilter and per-run filters.
Framework tools such as transfer_to_agent, knowledge_search, and agentic_knowledge_search remain never filtered and are always available.
Running the Examples

```shell
# Enter the tool example directory.
cd examples/tool
go run .
```

```shell
# Enter the MCP tool example directory.
cd examples/mcp_tool
# Start the external server.
cd streamalbe_server && go run main.go &
# Run the main program.
go run main.go -model="deepseek-chat"
```
Summary
The Tool system provides rich extensibility for tRPC-Agent-Go, supporting Function Tools, the DuckDuckGo Search Tool, and MCP protocol tools.