React AI Agent Chat SDK
A React library for building AI-powered chat interfaces with tool execution, configurable timeouts, retry logic, and custom renderers.

Caution: this library is partially vibe coded; I'll iterate on it to make it more reasonable over time.
Quick Start
1. Install the Package
```bash
npm install react-ai-agent-chat-sdk
# or
pnpm add react-ai-agent-chat-sdk
```

Peer Dependencies:

```bash
npm install react react-dom zod
```

AI Provider (choose one):

```bash
# For Anthropic Claude models
npm install @ai-sdk/anthropic

# For OpenAI models
npm install @ai-sdk/openai
```
2. Define Your Tools
Create tools with Zod schemas for type-safe input validation:
```typescript
import { promises as fs } from 'fs';
import { z } from 'zod';
import { createTool } from 'react-ai-agent-chat-sdk/config-server';

const readFileSchema = z.object({
  file_path: z.string().describe('The path to the file to read'),
});

const tools = {
  read_file: createTool({
    description: 'Read the contents of a file',
    display_name: 'Reading file',
    inputSchema: readFileSchema,
    execute: async ({ file_path }) => {
      const content = await fs.readFile(file_path, 'utf-8');
      return { file_path, content };
    },
  }),
};
```
3. Define Server Configuration
Create server configuration for API routes and tool execution:
```typescript
import { makeAgentChatRouteConfig } from 'react-ai-agent-chat-sdk/config-server';
import { anthropic } from '@ai-sdk/anthropic';
import { MemoryStorage } from 'react-ai-agent-chat-sdk/storage';

const agentChatRouteConfig = makeAgentChatRouteConfig({
  system_prompt: `You are a helpful assistant with access to file management tools.`,
  tools,
  auth_func: async () => true,  // Replace with your auth logic
  storage: new MemoryStorage(), // Use your preferred storage
  modelConfig: {
    model: anthropic('claude-sonnet-4-20250514'),
    temperature: 0.3,
  },
});
```
4. Define Client Configuration
Create client configuration for the React UI:
```typescript
import { makeAgentChatClientConfig } from 'react-ai-agent-chat-sdk/config-client';
import { CustomFileRenderer } from './renderers'; // defined in the Customization section below

const agentChatClientConfig = makeAgentChatClientConfig({
  tools: {
    read_file: { display_name: 'Reading file' },
  },
  route: '/api/chat', // Your chat API endpoint
  // Optional: custom tool renderers
  toolRenderers: {
    read_file: CustomFileRenderer,
  },
});
```
5. Add Chat and History Routes
Create API routes for chat and history:
For Next.js App Router (app/api/chat/route.ts):
```typescript
import { chatRoute } from 'react-ai-agent-chat-sdk/api';
import { agentChatRouteConfig } from '@/lib/agent-config';

export async function POST(req: Request) {
  return chatRoute(agentChatRouteConfig, req);
}
```
History Route (app/api/chat/history/route.ts):
```typescript
import { chatHistoryRoute } from 'react-ai-agent-chat-sdk/api';
import { agentChatRouteConfig } from '@/lib/agent-config';

export async function GET(req: Request) {
  return chatHistoryRoute(agentChatRouteConfig, req);
}
```
For Express.js (server.js):
```javascript
import express from 'express';
import { AgentChatRoute } from 'react-ai-agent-chat-sdk/api';
import { agentChatRouteConfig } from './lib/agent-config.js';

const app = express();

// Chat endpoint
app.post('/api/chat', AgentChatRoute(agentChatRouteConfig));

// History endpoint
app.get('/api/chat/history', AgentChatRoute({ ...agentChatRouteConfig, method: 'GET' }));
```
6. Add AgentChat UI Element
Use the chat component in your React app:
```tsx
'use client';

import { useEffect, useState } from 'react';
import { AgentChat } from 'react-ai-agent-chat-sdk';
import 'react-ai-agent-chat-sdk/agent-chat.css';
import { agentChatClientConfig } from '@/lib/agent-chat-client-config';

export default function ChatPage() {
  const [conversationId, setConversationId] = useState<string>('');

  useEffect(() => {
    // Load or create a conversation ID for persistence
    let id = localStorage.getItem('current-conversation-id');
    if (!id) {
      id = `conv_${crypto.randomUUID()}`;
      localStorage.setItem('current-conversation-id', id);
    }
    setConversationId(id);
  }, []);

  if (!conversationId) {
    return <div>Loading...</div>;
  }

  return (
    <AgentChat
      config={agentChatClientConfig}
      conversationId={conversationId}
    />
  );
}
```
Architecture Overview
The SDK now uses a frontend/backend separation architecture:
- Backend Configuration (`config-server`): Handles tool execution, AI model configuration, authentication, and storage. Used in API routes.
- Frontend Configuration (`config-client`): Handles UI rendering, tool display names, and custom renderers. Used in React components.
- Shared Types: Both configurations share the same tool definitions and conversation structure, as illustrated in the sketch below.
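The tool key is the contract between the two halves: the server config owns the schema and the `execute` function, while the client config only carries display metadata under the same key. A minimal sketch using the factories from the quick start (the `search_docs` tool is purely illustrative, not part of the SDK):

```typescript
// lib/agent-config.ts (server side)
import { z } from 'zod';
import { createTool, makeAgentChatRouteConfig } from 'react-ai-agent-chat-sdk/config-server';
import { MemoryStorage } from 'react-ai-agent-chat-sdk/storage';
import { anthropic } from '@ai-sdk/anthropic';

export const agentChatRouteConfig = makeAgentChatRouteConfig({
  system_prompt: 'You are a helpful assistant.',
  tools: {
    // "search_docs" is an illustrative tool name
    search_docs: createTool({
      description: 'Search the documentation',
      display_name: 'Searching docs',
      inputSchema: z.object({ query: z.string() }),
      execute: async ({ query }) => ({ query, results: [] }), // runs only on the server
    }),
  },
  auth_func: async () => true,
  storage: new MemoryStorage(),
  modelConfig: { model: anthropic('claude-sonnet-4-20250514') },
});
```

```typescript
// lib/agent-chat-client-config.ts (client side): same "search_docs" key, display-only
import { makeAgentChatClientConfig } from 'react-ai-agent-chat-sdk/config-client';

export const agentChatClientConfig = makeAgentChatClientConfig({
  tools: {
    search_docs: { display_name: 'Searching docs' }, // no execute function on the client
  },
  route: '/api/chat',
});
```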
Customization
Tool Renderers
Create custom renderers for specific tools:
```tsx
import { ToolCall, ToolResult } from 'react-ai-agent-chat-sdk/config';

export function CustomFileRenderer({
  toolCall,
  toolResult,
}: {
  toolCall: ToolCall;
  toolResult?: ToolResult;
}) {
  const hasError = toolResult?.output?.__toolError;
  const isTimeout = hasError && toolResult?.output?.__errorType === 'ToolTimeoutError';

  const getStatusText = () => {
    if (isTimeout) return 'Timed out';
    if (hasError) return 'Error';
    if (toolResult?.output) return 'Completed';
    return 'Running';
  };

  return (
    <div className={`custom-renderer ${hasError ? 'error' : ''}`}>
      <div>📁 {toolCall.toolName} - {getStatusText()}</div>
      {toolResult?.output && (
        <pre>{JSON.stringify(toolResult.output, null, 2)}</pre>
      )}
    </div>
  );
}
```
Add renderers to your configuration:
```typescript
// lib/agent-chat-client-config.ts
import { makeAgentChatClientConfig } from 'react-ai-agent-chat-sdk/config-client';
import { CustomFileRenderer } from './renderers';

export const agentChatClientConfig = makeAgentChatClientConfig({
  tools: {
    read_file: { display_name: 'Reading file' },
  },
  route: '/api/chat',
  toolRenderers: {
    read_file: CustomFileRenderer,
  },
});
```
Route Parameters
Customize API endpoints to fit your application structure:
```typescript
// Server config
const agentChatRouteConfig = makeAgentChatRouteConfig({
  system_prompt: 'You are a helpful assistant.',
  tools,
  auth_func: async () => true,
  storage: new MemoryStorage(),
});

// Client config
const agentChatClientConfig = makeAgentChatClientConfig({
  tools: {
    // Tool definitions for display
  },
  route: '/api/v1/chat',           // Custom chat route
  historyRoute: '/api/v1/history', // Custom history route (optional)
});
```
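If you use the Next.js App Router setup from step 5, the custom paths need matching route files; a sketch reusing the same handlers, with file locations following Next.js conventions for the routes chosen above:

```typescript
// app/api/v1/chat/route.ts
import { chatRoute } from 'react-ai-agent-chat-sdk/api';
import { agentChatRouteConfig } from '@/lib/agent-config';

export async function POST(req: Request) {
  return chatRoute(agentChatRouteConfig, req);
}
```

```typescript
// app/api/v1/history/route.ts
import { chatHistoryRoute } from 'react-ai-agent-chat-sdk/api';
import { agentChatRouteConfig } from '@/lib/agent-config';

export async function GET(req: Request) {
  return chatHistoryRoute(agentChatRouteConfig, req);
}
```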
Retry Configurations
Configure timeouts and retries globally and per-tool:
Global Configuration:
```typescript
const agentChatRouteConfig = makeAgentChatRouteConfig({
  system_prompt: 'You are a helpful assistant.',
  tools,
  auth_func: async () => true,
  storage: new MemoryStorage(),
  toolExecutionConfig: {
    timeoutMs: 30000,   // 30 seconds default
    retries: 3,
    retryDelayMs: 1000, // 1 second initial delay
  },
});
```
Per-Tool Configuration:
```typescript
const tools = {
  slow_operation: createTool({
    description: 'A slow operation that needs a longer timeout',
    display_name: 'Processing data',
    inputSchema: z.object({}),
    execute: async () => {
      // Long-running operation
    },
    executionConfig: {
      timeoutMs: 60000,   // 1 minute timeout
      retries: 1,         // Only 1 retry
      retryDelayMs: 5000, // 5 second delay
    },
  }),
};
```
Storage Configuration:
```typescript
import { MemoryStorage } from 'react-ai-agent-chat-sdk/storage';

// For development
const storage = new MemoryStorage();

// For production, implement the ChatStorage interface
class MyStorage implements ChatStorage {
  async saveMessage(conversationId: string, message: ChatMessage): Promise<void> {
    // Save to your database
  }

  async getConversation(conversationId: string): Promise<Conversation | null> {
    // Retrieve from your database
    return null;
  }
}

const agentChatRouteConfig = makeAgentChatRouteConfig({
  system_prompt: 'You are a helpful assistant.',
  tools,
  auth_func: async () => true,
  storage, // Add storage for conversation persistence
});
```
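As an illustration, here is a minimal in-memory `ChatStorage` sketch. It assumes the interface has exactly the two methods shown above, that the types are exported from `react-ai-agent-chat-sdk/storage`, and that a `Conversation` is roughly `{ id, messages }`; none of that is confirmed here, so check the package's type definitions before adapting it:

```typescript
import type { ChatStorage, ChatMessage, Conversation } from 'react-ai-agent-chat-sdk/storage'; // assumed export path

// Sketch only: the exact Conversation shape is an assumption, not taken from the docs.
export class InMemoryMapStorage implements ChatStorage {
  private conversations = new Map<string, Conversation>();

  async saveMessage(conversationId: string, message: ChatMessage): Promise<void> {
    const existing = this.conversations.get(conversationId);
    if (existing) {
      existing.messages.push(message);
    } else {
      this.conversations.set(conversationId, {
        id: conversationId,
        messages: [message],
      } as Conversation);
    }
  }

  async getConversation(conversationId: string): Promise<Conversation | null> {
    return this.conversations.get(conversationId) ?? null;
  }
}
```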
Model Configuration:
```typescript
import { openai } from '@ai-sdk/openai';
import { messageCountIs } from 'ai';

const agentChatRouteConfig = makeAgentChatRouteConfig({
  system_prompt: 'You are a helpful assistant.',
  tools,
  auth_func: async () => true,
  storage: new MemoryStorage(),
  modelConfig: {
    model: openai('gpt-4o'),      // Use different AI models
    temperature: 0.7,
    stopWhen: messageCountIs(10), // Stop after 10 messages
    onStepFinish: (step) => {
      console.log('Step finished:', step.finishReason);
    },
  },
});
```