Backend-agnostic AI streaming for React
One hook. Any provider. Drop-in UI or bring your own. Works with Anthropic, OpenAI, Groq, or any streaming endpoint.
```sh
npm install @react-ai-stream/react @react-ai-stream/ui
```
Live demo: three models streaming in parallel via Groq
As simple as this
```tsx
import { useAIChat } from '@react-ai-stream/react'
import { Chat } from '@react-ai-stream/ui'
import '@react-ai-stream/ui/styles'

export default function Page() {
  const { messages, sendMessage, loading, stop } = useAIChat({
    endpoint: '/api/chat',
  })

  return (
    <Chat
      messages={messages}
      onSend={sendMessage}
      onStop={stop}
      loading={loading}
    />
  )
}
```

What you get
✓ useAIChat hook: messages, loading, error, stop, all managed
✓ Any backend: Anthropic, OpenAI, Groq, or a custom endpoint
✓ Event hooks: onToken, onComplete, onError for side effects
✓ Drop-in UI: <Chat /> with Markdown + syntax highlighting
✓ Full TypeScript: strict types, ESM + CJS, 34 passing tests
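A backend-agnostic hook like this ultimately just reads a text stream and surfaces tokens to callbacks. Here is a minimal sketch of that pattern; `readTextStream`, its signature, and the callback shape are illustrative, not the library's actual internals:

```typescript
// Illustrative sketch only: consume any streaming text response and
// emit tokens to callbacks. Not the library's real implementation.
type StreamCallbacks = {
  onToken?: (token: string) => void
  onComplete?: (fullText: string) => void
  onError?: (err: unknown) => void
}

async function readTextStream(
  stream: ReadableStream<Uint8Array>,
  callbacks: StreamCallbacks = {},
): Promise<string> {
  const decoder = new TextDecoder()
  const reader = stream.getReader()
  let fullText = ''
  try {
    while (true) {
      const { done, value } = await reader.read()
      if (done) break
      // stream: true keeps multi-byte characters split across chunks intact
      const token = decoder.decode(value, { stream: true })
      fullText += token
      callbacks.onToken?.(token)
    }
    callbacks.onComplete?.(fullText)
    return fullText
  } catch (err) {
    callbacks.onError?.(err)
    throw err
  }
}
```

Any provider that returns a plain text stream, whether Anthropic, OpenAI, Groq, or a custom route, can be consumed this way; provider-specific framing such as SSE would be parsed before tokens reach `onToken`.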