Real-Time Streaming
Learn how to add real-time streaming updates to your Motia workflows
What You'll Build
A pet management system with real-time streaming that provides live updates to clients:
- Stream Configuration - Define stream schemas for type-safe updates
- API with Streaming - APIs that initialize streams and return immediately
- Background Job Streaming - Jobs that push real-time progress updates
- Agentic Step Streaming - AI enrichment with live progress updates
- Multi-Step Streaming - Multiple steps updating the same stream
Getting Started
Clone the example repository:
Install dependencies:
Set up your OpenAI API key in .env:
Start the Workbench:
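The four steps above might look like this in a shell. The repository URL, the `.env` variable name, and the dev command are assumptions based on this guide's repository name and Motia's CLI; adjust them if your setup differs:

```shell
# Clone the example repository (URL assumed from the build-your-first-app repo name)
git clone https://github.com/MotiaDev/build-your-first-app.git
cd build-your-first-app

# Install dependencies
npm install

# Set up your OpenAI API key in .env (variable name assumed)
echo "OPENAI_API_KEY=your-key-here" > .env

# Start the Workbench dev server
npx motia dev
```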
Your Workbench will be available at http://localhost:3000.
Project Structure
Files like features.json and tutorial/tutorial.tsx are only for the interactive tutorial and are not part of Motia's project structure.
All code examples in this guide are available in the build-your-first-app repository.
You can follow this guide to learn how to build real-time streaming with Motia step by step, or you can clone the repository and dive into our Interactive Tutorial to learn by doing directly in the Workbench.

Understanding Real-Time Streaming
You've built APIs that return immediately, background jobs that process asynchronously, workflows that orchestrate complex logic, and agentic workflows that make intelligent decisions. But how do you give users real-time feedback while all this async processing happens in the background?
That's where streaming comes in. Motia provides streams as part of the context in any step handler - you can use them anywhere in your code. Streams use Server-Sent Events (SSE) to push live updates directly to clients as your workflow progresses.
In our pet shelter example:
- The API initializes a stream and returns immediately with a stream ID
- Background jobs push updates as they process (quarantine entry, health checks)
- Agentic steps stream enrichment progress (bio generation, breed analysis)
- Clients get live feedback throughout the entire workflow
The power is in the simplicity - streams is available in your handler's context, just like emit, logger, and state. Any step can update any stream, creating a unified real-time experience without complex orchestration.
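On the wire, each stream update is a Server-Sent Events frame: a `data:` line (typically JSON) followed by a blank line. A minimal sketch of parsing that wire format makes the transport concrete; the payload shape here is hypothetical, not Motia's exact schema:

```typescript
// Parse an SSE response body into its JSON data payloads.
// Each event is a "data: ..." line terminated by a blank line.
function parseSSE(body: string): Array<{ message: string }> {
  const events: Array<{ message: string }> = []
  for (const line of body.split("\n")) {
    if (line.startsWith("data: ")) {
      events.push(JSON.parse(line.slice("data: ".length)))
    }
  }
  return events
}

// Two updates as they would appear on the wire (illustrative messages).
const wire =
  'data: {"message":"Pet created, starting enrichment"}\n\n' +
  'data: {"message":"Bio generated"}\n\n'

const updates = parseSSE(wire)
```

In a browser, the `EventSource` API handles this parsing for you; the sketch just shows what travels over the connection.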
Creating Your First Stream
Step 1: Define the Stream Configuration
First, define a stream configuration file. This makes the stream available in the context.streams object for all your step handlers.
View on GitHub:
How Stream Configuration Works
Stream configuration is simple:
- name - Identifier for accessing the stream (e.g., context.streams.petCreation)
- schema - Zod schema defining what data can be pushed to the stream
- baseConfig - Storage settings (default uses in-memory storage)
Once you create this configuration file, the stream is automatically available as streams.petCreation in the context of any step handler. It's just like emit, logger, or state - part of the tools available in your handler.
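Putting those three fields together, a stream configuration file might look like the sketch below. This is based only on the fields described above; check the repository for the exact shape Motia expects (the `status` field and `baseConfig` keys in particular are assumptions):

```typescript
// streams/pet-creation.stream.ts (illustrative sketch)
import { z } from 'zod'

export const config = {
  // name: how handlers access the stream - context.streams.petCreation
  name: 'petCreation',
  // schema: Zod schema for every message pushed to the stream
  schema: z.object({
    message: z.string(),
    status: z.enum(['processing', 'completed', 'failed']).optional(),
  }),
  // baseConfig: storage settings (default uses in-memory storage)
  baseConfig: { storageType: 'default' },
}
```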
Step 2: Initialize Streams from APIs
Now let's update the pet creation API to initialize a stream and return it immediately to the client.
View on GitHub:
How API Stream Initialization Works
The key changes from a regular API:
- Access streams from context - streams is available in the FlowContext
- Create initial stream message - await streams.petCreation.set(traceId, 'message', data)
- Return the stream result - Contains the stream ID and initial message
- Background jobs update the same stream - Using the same traceId
The API returns immediately with a stream ID. Clients can connect to this stream via SSE to receive real-time updates as background jobs process.
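The repository has the full step file; the flow itself can be sketched with a tiny in-memory stand-in for the streams API. The `set(traceId, key, data)` signature mirrors the calls described above; everything else here is illustrative and synchronous for brevity (the real handler is async):

```typescript
// Minimal in-memory stand-in for a Motia stream (illustrative only).
type StreamMessage = { message: string }

class FakeStream {
  private store = new Map<string, StreamMessage>()
  // Mirrors streams.petCreation.set(traceId, 'message', data)
  set(traceId: string, _key: string, data: StreamMessage): StreamMessage & { id: string } {
    this.store.set(traceId, data)
    return { id: traceId, ...data } // the "stream result" the API returns
  }
}

const streams = { petCreation: new FakeStream() }

// Sketch of the API handler: seed the stream, then return immediately.
function createPetHandler(traceId: string) {
  const result = streams.petCreation.set(traceId, 'message', {
    message: 'Pet created, enrichment running in the background',
  })
  // Background jobs keep updating the same traceId after we return.
  return { status: 201, body: result }
}

const response = createPetHandler('trace-123')
```

The essential property: the handler does one `set()` and returns, so the client gets the stream ID without waiting for any background work.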
Step 3: Stream Updates from Background Jobs
Now let's update the feeding reminder job to push real-time updates to the stream as it processes.
View on GitHub:
How Background Job Streaming Works
Background jobs can push multiple updates to a stream:
- Access the stream - streams.petCreation is available in context
- Push updates - await streams.petCreation.set(traceId, 'message', data)
- Use the same traceId - Links updates to the original API request
- Send multiple updates - Each set() call sends immediately to connected clients
The background job processes asynchronously, pushing updates at each stage. Clients connected to the stream receive these updates in real-time via SSE.
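The same stand-in idea shows the job side of the pattern: several `set()` calls against one traceId, each "delivered" as soon as it is made. This is an illustrative sketch, not the repository's handler (which is async and uses context.streams):

```typescript
// Collect every update a job pushes, keyed by traceId (illustrative).
const delivered: Array<{ traceId: string; message: string }> = []

const streams = {
  petCreation: {
    // Mirrors await streams.petCreation.set(traceId, 'message', data);
    // in this sketch, "delivery" is just appending to an array.
    set(traceId: string, _key: string, data: { message: string }) {
      delivered.push({ traceId, message: data.message })
    },
  },
}

// Sketch of the feeding-reminder job: one stream update per stage.
function feedingReminderJob(traceId: string) {
  streams.petCreation.set(traceId, 'message', { message: 'Quarantine entry recorded' })
  streams.petCreation.set(traceId, 'message', { message: 'Health check scheduled' })
  streams.petCreation.set(traceId, 'message', { message: 'Feeding reminder created' })
}

feedingReminderJob('trace-123')
```

Because every call reuses the traceId from the original API request, all three updates land on the stream the client is already subscribed to.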
Step 4: Agentic Step Streaming
Agentic steps can also stream progress updates as they generate content. This provides live feedback during potentially long-running AI operations.
View on GitHub:
How Agentic Step Streaming Works
Agentic steps stream progress as they work:
- Stream start notification - Let users know AI processing has begun
- Progress updates - Stream each stage of generation (bio, breed, temperament, etc.)
- Stream completion - Notify when AI processing is done
- Error streaming - Stream errors gracefully with fallback messages
This transforms a potentially slow AI operation into an engaging real-time experience.
Testing Streaming in Action
The best way to test streams is through Workbench.
Test 1: Create a Pet with Streaming
Open Workbench and navigate to the Endpoints section, then test the Pet Creation endpoint:
Prefer using curl?
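A request along these lines should work; the path and body fields are assumptions based on the pet-creation endpoint from the earlier chapters, so adjust them to match your route and schema:

```shell
curl -X POST http://localhost:3000/pets \
  -H "Content-Type: application/json" \
  -d '{"name": "Luna", "species": "dog"}'
```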
You'll get an immediate response with the stream result. The API returns right away while background jobs process asynchronously.
Test 2: Monitor Stream Updates in Workbench
After creating a pet, check the Tracing view in Workbench:
- Workbench switches to the Tracing tab automatically so you can see the stream updates in real-time
- Click on the most recent trace
- Watch the timeline as steps execute
- See stream updates appear in real-time in the timeline

You'll observe:
- Pet creation completes immediately
- Feeding reminder job streams quarantine updates
- AI enrichment streams progress updates
- All updates visible in the trace timeline
Test 3: Create Pet with Symptoms
Test the conditional streaming logic by creating a pet with symptoms:
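For example (the `symptoms` field name and values are assumptions from the earlier chapters; adjust to your schema):

```shell
curl -X POST http://localhost:3000/pets \
  -H "Content-Type: application/json" \
  -d '{"name": "Max", "species": "cat", "symptoms": ["sneezing", "lethargy"]}'
```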
Watch the logs to see different stream messages based on the symptoms detected.
Test 4: Create Pet Without Symptoms
Compare the streaming behavior with a healthy pet:
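For example, the same request without the symptoms field (field names assumed, as above):

```shell
curl -X POST http://localhost:3000/pets \
  -H "Content-Type: application/json" \
  -d '{"name": "Bella", "species": "dog"}'
```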
The stream will show health check passed messages instead of treatment needed messages.
Observing Stream Updates
Watch the Workbench console logs to see the real-time stream updates as they're pushed:
Each emoji-prefixed log corresponds to a stream update being pushed to connected clients.
🎉 Congratulations! You've built a complete real-time streaming system with Motia. Your pet management system now provides live feedback to users while complex workflows execute in the background.
What's Next?
You've now mastered the complete Motia stack:
- API Endpoints - Build RESTful APIs with validation
- Background Jobs - Process async tasks efficiently
- Workflows - Orchestrate complex business logic
- Agentic Workflows - Make intelligent decisions with AI
- Real-Time Streaming - Provide live updates using streams in any step handler
This is the complete progression from simple APIs to intelligent, real-time systems!
Key Takeaway: Streams are just another tool in your step handler's context - use them wherever you need real-time updates!
Here are some ideas to extend your streaming implementation:
- Add stream analytics - Track how many clients are connected, message delivery rates
- Implement stream persistence - Use Redis adapter for stream storage across restarts
- Create stream multiplexing - Multiple streams per workflow for different update types
- Build progress bars - Use structured progress data (0-100%) instead of just messages
- Add stream authentication - Ensure only authorized clients can access streams
Explore more examples in the Motia Examples Repository.
