react-ai-stream v0.1.3

Backend-agnostic AI streaming for React

One hook. Any provider. Drop-in UI or bring your own. Works with Anthropic, OpenAI, Groq, or any streaming endpoint.

$ npm install @react-ai-stream/react @react-ai-stream/ui
Live demo — three models streaming in parallel via Groq: Llama 3.3 70B (Meta), Llama 3.1 8B (Meta, fast), and Llama 4 Scout (Meta, new).

As simple as this

import { useAIChat } from '@react-ai-stream/react'
import { Chat } from '@react-ai-stream/ui'
import '@react-ai-stream/ui/styles'

export default function Page() {
  const { messages, sendMessage, loading, stop } = useAIChat({
    endpoint: '/api/chat',
  })

  return (
    <Chat
      messages={messages}
      onSend={sendMessage}
      onStop={stop}
      loading={loading}
    />
  )
}

What you get

- useAIChat hook: messages, loading, error, stop — all managed
- Any backend: Anthropic, OpenAI, Groq, or any custom endpoint
- Event hooks: onToken, onComplete, onError for side-effects
- Drop-in UI: <Chat /> with Markdown + syntax highlighting
- Full TypeScript: strict types, ESM + CJS, 34 passing tests
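The event hooks are the hooks-in for side-effects like logging or analytics. Here is a minimal sketch of how they might be wired — the `ChatOptions` type, the callback signatures, and the token buffer below are illustrative assumptions, not the library's published types; check the docs for the real option shapes:

```typescript
// Illustrative option types for the onToken / onComplete / onError hooks.
// The callback names come from the feature list above; their exact
// signatures are assumptions made for this sketch.
type ChatOptions = {
  endpoint: string
  onToken?: (token: string) => void
  onComplete?: () => void
  onError?: (err: unknown) => void
}

// A small side-effect target: accumulate streamed tokens into one string.
function createTokenBuffer() {
  let buffer = ''
  return {
    push(token: string) { buffer += token },
    value() { return buffer },
    reset() { buffer = '' },
  }
}

const buf = createTokenBuffer()

// The options object you would pass to useAIChat({ ... }).
const chatOptions: ChatOptions = {
  endpoint: '/api/chat',
  onToken: (token) => buf.push(token),   // fires for each streamed chunk
  onComplete: () => {                    // fires once the reply finishes
    console.log('full reply:', buf.value())
    buf.reset()
  },
  onError: (err) => console.error('stream failed:', err),
}
```

Because the hooks fire outside React's render cycle, they are a good place for logging or persistence that shouldn't trigger re-renders.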
Read the docs →