
Build an AI Chat App with React Native and WireAI

Malik Chohra

April 22, 2026 · 5 min read

Full tutorial: install wireai-rn, set up WireAIProvider with Ollama, render native chat components, register custom cards, and handle the action feedback loop. From zero to a working app in under 30 minutes.

Building an AI chat app in React Native with WireAI takes under 30 minutes. Install wireai-rn and zod, wrap your root with WireAIProvider, use the useWireAIThread hook in a screen, and map messages to WireAIMessageRenderer. Your agent then responds with interactive native cards instead of plain text.

Most React Native AI chat tutorials show you how to stream text; the result looks like a mini ChatGPT. WireAI goes further: the agent responds with tappable cards, step lists, confirmation dialogs, and mood selectors, all native React Native components that users interact with directly.

How do you install the required dependencies?

npm install wireai-rn zod

# Expo users
npx expo install wireai-rn zod

How do you wrap your app in the WireAIProvider?

The provider needs an adapter (which LLM to use) and a components array (what the agent can render). For local development, use the OllamaAdapter. Cloud LLM adapters (OpenAI, Anthropic, Gemini) ship in the upcoming @wireai/cloud package.

import { WireAIProvider, OllamaAdapter } from 'wireai-rn';
import { builtInComponents } from 'wireai-rn/components';

export default function RootLayout({ children }) {
  return (
    <WireAIProvider
      adapter={new OllamaAdapter({
        baseUrl: 'http://localhost:11434',
        model: 'llama3',
      })}
      components={builtInComponents}
    >
      {children}
    </WireAIProvider>
  );
}

How do you build a chat screen with useWireAIThread?

The useWireAIThread hook manages all conversation state: message history, loading flag, and the action feedback loop. Call sendMessage with the user's text and WireAI handles the full cycle: prompt → LLM → JSON parse → Zod validate → render.

import React from 'react';
import { useWireAIThread } from 'wireai-rn';
import { WireAIMessageRenderer } from 'wireai-rn/components';
import { View, TextInput, FlatList, Text } from 'react-native';

export function ChatScreen() {
  const { messages, sendMessage, isThinking } = useWireAIThread({
    systemPromptSuffix:
      'You are a helpful assistant. Always respond using the most relevant UI component.',
  });
  const [input, setInput] = React.useState('');

  return (
    <View style={{ flex: 1 }}>
      <FlatList
        data={messages}
        keyExtractor={m => m.id}
        renderItem={({ item }) => (
          <WireAIMessageRenderer message={item} />
        )}
      />
      {isThinking && <Text>Thinking...</Text>}
      <TextInput
        value={input}
        onChangeText={setInput}
        placeholder="Message..."
        onSubmitEditing={() => { sendMessage(input); setInput(''); }}
      />
    </View>
  );
}
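The cycle behind sendMessage (prompt → LLM → JSON parse → Zod validate → render) can be sketched in plain TypeScript. This is an illustrative stand-in, not the library's internals: the LLM is mocked, the validator is a hand-rolled substitute for the Zod step, and rendering is left to React.

```typescript
// Illustrative sketch of the sendMessage cycle (hypothetical names, not
// wireai-rn internals). The LLM is mocked and validation is simplified.
type RenderedMessage = { component: string; props: Record<string, unknown> };

// Mocked LLM: returns a JSON string naming a component plus its props.
function mockLLM(prompt: string): string {
  return JSON.stringify({
    component: 'TextCard',
    props: { text: `Echo: ${prompt}` },
  });
}

// Stand-in for the Zod validation step: rejects unexpected shapes.
function validate(raw: unknown): RenderedMessage {
  const msg = raw as RenderedMessage;
  if (typeof msg.component !== 'string' || typeof msg.props !== 'object') {
    throw new Error('Validation failed: unexpected message shape');
  }
  return msg;
}

function runCycle(userText: string): RenderedMessage {
  const reply = mockLLM(userText); // prompt → LLM
  const parsed = JSON.parse(reply); // JSON parse
  return validate(parsed); // Zod validate (render happens in React)
}

console.log(runCycle('hello')); // { component: 'TextCard', props: { text: 'Echo: hello' } }
```

In the real library this cycle is internal to useWireAIThread; the sketch only shows why a malformed LLM reply surfaces as a validation error rather than a render crash.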

How do you register a custom component?

The 11 built-in components cover the most common interaction patterns. For domain-specific UI, register your own component with a Zod schema and a plain-English description that tells the LLM when to use it.

import React from 'react';
import { View, Text } from 'react-native';
import { registerComponent } from 'wireai-rn';
import { z } from 'zod';

const WeatherCard = registerComponent({
  name: "WeatherCard",
  description: "Use when the user asks about weather. Shows city, temperature and conditions.",
  schema: z.object({
    city: z.string(),
    tempCelsius: z.number(),
    condition: z.enum(["sunny", "cloudy", "rainy", "snowy"]),
  }),
  render: ({ props }) => (
    <View>
      <Text>{props.city}: {props.tempCelsius}°C, {props.condition}</Text>
    </View>
  ),
});
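For a sense of what the schema constrains, here is the kind of props payload the model would need to emit for WeatherCard, with a minimal hand-rolled check standing in for the Zod parse. The envelope (`component` plus `props`) is illustrative; the exact wire format is the library's concern.

```typescript
// Example payload the model might emit for WeatherCard (illustrative envelope).
const payload = {
  component: 'WeatherCard',
  props: { city: 'Berlin', tempCelsius: 18, condition: 'cloudy' },
};

// Minimal stand-in for the Zod schema check in registerComponent:
const validConditions = ['sunny', 'cloudy', 'rainy', 'snowy'];
const ok =
  typeof payload.props.city === 'string' &&
  typeof payload.props.tempCelsius === 'number' &&
  validConditions.includes(payload.props.condition);

console.log(ok); // true
```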

How the action feedback loop works

When a user taps a button inside a rendered component, it calls onSubmit(value). WireAI automatically sends the action result back to the LLM as a follow-up message. The agent sees what the user selected and responds with the next appropriate component: a fully agentic conversation loop with no extra code required.
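The loop can be sketched in plain TypeScript. Everything here is hypothetical (mocked agent, simplified message shape); it only shows how a tapped action becomes a follow-up message that the agent answers.

```typescript
// Illustrative sketch of the action feedback loop (hypothetical names,
// not wireai-rn internals): a tap becomes an 'action' message the agent sees.
type Message = { role: 'user' | 'assistant' | 'action'; content: string };

const thread: Message[] = [];

// Mocked agent: reacts to the latest message in the thread.
function mockAgent(history: Message[]): Message {
  const last = history[history.length - 1];
  const content =
    last.role === 'action'
      ? `Next component after: ${last.content}` // agent saw the tap result
      : 'ConfirmationDialog'; // agent's first reply is a component
  return { role: 'assistant', content };
}

// 1. User sends a message; agent replies with a component.
thread.push({ role: 'user', content: 'Book a table' });
thread.push(mockAgent(thread));

// 2. User taps a button inside the rendered component → onSubmit(value)
//    is forwarded to the agent as an 'action' message automatically.
function onSubmit(value: string) {
  thread.push({ role: 'action', content: value });
  thread.push(mockAgent(thread));
}

onSubmit('confirmed');
console.log(thread.map(m => m.role)); // [ 'user', 'assistant', 'action', 'assistant' ]
```

The point of the sketch: your component code only calls onSubmit; the round trip back to the LLM is the library's job.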

Common setup errors

  • Network request failed: Ollama isn't running or the base URL is wrong. Android emulators reach the host machine at 10.0.2.2, not localhost. See the Ollama setup guide.
  • Component not found: the LLM returned a component name that isn't registered. The name in your registerComponent call must match exactly what appears in the system prompt.
  • Zod validation error: dev mode logs the exact prop that failed. Add .describe() hints on schema fields to guide the model toward valid values.
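The first bullet trips up most Android users. A small helper (hypothetical, not part of wireai-rn) can pick the right base URL; in the app you would pass Platform.OS from react-native instead of a literal string:

```typescript
// Hypothetical helper: choose the Ollama base URL per platform.
// Android emulators reach the host machine at 10.0.2.2; iOS simulators
// and dev machines can use localhost directly.
function ollamaBaseUrl(os: 'android' | 'ios' | 'web'): string {
  const host = os === 'android' ? '10.0.2.2' : 'localhost';
  return `http://${host}:11434`;
}

console.log(ollamaBaseUrl('android')); // http://10.0.2.2:11434
console.log(ollamaBaseUrl('ios')); // http://localhost:11434
```

Wire its result into the OllamaAdapter's baseUrl option from the provider setup above.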

Build your first AI chat app. Run npm install wireai-rn zod and follow the steps above.