Step-by-Step Streaming
The stream() method provides a clean interface for receiving intermediate steps during agent execution. Each step represents a tool call and its result.
Understanding Yielded Objects
The stream() method yields two types of objects during execution:
Intermediate Steps (Tuple)
Each tool call yields a tuple of (AgentAction, observation):
• action.log - The agent’s reasoning text explaining why it’s calling the tool
• action.tool - The name of the tool being called (e.g., “add”, “echo”)
• action.tool_input - The input arguments passed to the tool as a dictionary
• observation - The result returned by the tool after execution
Final Result (String)
After all tool calls complete, the agent yields a final string response with the answer to the query.
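A consuming loop can therefore branch on the type of each yielded value. The snippet below is a minimal sketch, assuming an agent object that exposes the stream() method described above; the query string is only illustrative.

```python
# Minimal sketch: consume stream() and branch on the yielded type.
# `agent` is assumed to be constructed elsewhere with the tools it needs.
for chunk in agent.stream("Add 2 and 3, then echo the result"):
    if isinstance(chunk, str):
        # Final result: the agent's answer to the query.
        print(f"Final answer: {chunk}")
    else:
        # Intermediate step: an (AgentAction, observation) tuple.
        action, observation = chunk
        print(f"Reasoning: {action.log}")
        print(f"Tool:      {action.tool}({action.tool_input})")
        print(f"Result:    {observation}")
```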
Low-Level Event Streaming
For more granular control, use the stream_events() method to get real-time output events:
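The following is a minimal sketch, assuming stream_events() yields LangChain-style event dictionaries with "event", "name", and "data" keys; adjust the event names to the ones your agent actually emits.

```python
# Minimal sketch: react to individual events as they arrive.
# `agent` is assumed to be constructed elsewhere.
for event in agent.stream_events("Add 2 and 3, then echo the result"):
    kind = event["event"]
    if kind == "on_chat_model_stream":
        # Token-by-token LLM output, useful for streaming reasoning text.
        print(event["data"]["chunk"].content, end="", flush=True)
    elif kind == "on_tool_start":
        print(f"\n[tool start] {event['name']} <- {event['data'].get('input')}")
    elif kind == "on_tool_end":
        print(f"[tool end]   {event['name']} -> {event['data'].get('output')}")
```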
The streaming API is based on LangChain’s stream_events method. For more details on event types and data structure, check the LangChain streaming documentation.
Choosing the Right Streaming Method
Use stream() when:
• You want to show step-by-step progress
• You need to process each tool call individually
• You’re building a workflow UI
• You want simple, clean step tracking
Use stream_events() when:
• You need fine-grained control over events
• You’re building real-time chat interfaces
• You want to stream LLM reasoning text
• You need custom event filtering
Examples
Building a Streaming UI
Here’s an example of how you might build a simple console UI for streaming (streaming_ui.py):
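This is a sketch rather than a complete program: it assumes an agent object exposing the stream() interface described above, and you supply your own agent when calling it.

```python
# streaming_ui.py - simple console UI over the step-by-step stream() API.

def run_console_ui(agent, query: str) -> None:
    """Print each tool call as a numbered step, then the final answer."""
    step = 1
    for chunk in agent.stream(query):
        if isinstance(chunk, str):
            # Final answer after all tool calls complete.
            print("=" * 40)
            print(f"Answer: {chunk}")
        else:
            action, observation = chunk
            print(f"[Step {step}] {action.tool}({action.tool_input})")
            print(f"  reasoning: {action.log.strip()}")
            print(f"  result:    {observation}")
            step += 1
```

Call run_console_ui(agent, "your query") with an agent you have already constructed; the function itself makes no assumptions about which tools the agent has.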
Web Streaming with FastAPI
For web applications, you can stream agent output using Server-Sent Events (web_streaming.py):
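Below is a sketch of one way to do this, assuming the same stream() interface; the /chat endpoint, the get_agent() factory, and the JSON payload shape are illustrative choices, not part of the library.

```python
# web_streaming.py - stream agent steps to the browser via Server-Sent Events.
import json

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()


def get_agent():
    # Hypothetical factory; construct and return your agent here.
    raise NotImplementedError


@app.get("/chat")
async def chat(query: str):
    agent = get_agent()

    def event_stream():
        for chunk in agent.stream(query):
            if isinstance(chunk, str):
                payload = {"type": "final", "answer": chunk}
            else:
                action, observation = chunk
                payload = {
                    "type": "step",
                    "tool": action.tool,
                    "input": action.tool_input,
                    "output": str(observation),
                }
            # SSE framing: each message is "data: <json>" followed by a blank line.
            yield f"data: {json.dumps(payload)}\n\n"

    return StreamingResponse(event_stream(), media_type="text/event-stream")
```

On the client side, these messages can be consumed with the browser’s EventSource API or any SSE-capable HTTP client.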