wireai-rn
Wire your AI agent to native mobile UI. Open-source React Native SDK for generative UI: render interactive native components from LLM responses.
What is wireai-rn?
Most AI-powered mobile apps render plain text in a chat bubble. wireai-rn changes that: your LLM returns structured JSON, and the SDK renders it as fully interactive native components such as selection cards, step lists, number inputs, and confirmation prompts.
You register which components your app can render. The SDK automatically builds a system prompt, validates every response against Zod schemas, and renders the right component with injected callbacks that feed interactions back to the LLM.
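The auto-generated system prompt is essentially a catalog of the components you registered. A minimal sketch of how such a prompt could be assembled (the ComponentSpec shape and the prompt wording are illustrative, not the SDK's actual internals; the real SDK derives this information from each component's Zod schema):

```typescript
// Illustrative shape for a registered component.
interface ComponentSpec {
  name: string;
  description: string;
  props: Record<string, string>; // prop name -> plain-text type description
}

// Assemble a system prompt that tells the LLM which components it may render.
function buildSystemPrompt(specs: ComponentSpec[]): string {
  const catalog = specs
    .map((s) => `- ${s.name}: ${s.description}\n  props: ${JSON.stringify(s.props)}`)
    .join("\n");
  return [
    "You are a UI-driving assistant.",
    'Respond with JSON: {"action":"render","component":<name>,"props":{...}}.',
    "Available components:",
    catalog,
  ].join("\n");
}

const prompt = buildSystemPrompt([
  {
    name: "SelectionCard",
    description: "Presents options the user can tap",
    props: { title: "string", options: "string[]" },
  },
]);
```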
How it works
1. Your app calls sendMessage("Start my check-in").
2. The SDK prepends the auto-generated system prompt and trims history to maxContextMessages / maxContextChars.
3. The request goes to the configured provider (Ollama, LM Studio, OpenAI, Webhook, or A2A) through the same pluggable transport interface.
4. The LLM replies with structured JSON, e.g. {"action":"render","component":"SelectionCard","props":{...}}.
5. The SDK extracts the JSON from the text, normalizes it, and validates it against the component's Zod schema.
6. message.response now contains the typed WireAIResponse object.
7. ComponentRenderer looks up "SelectionCard" in the registry, validates its props, and mounts it with injected callbacks.
8. A user tap fires onSelect("I feel overwhelmed") → sendMessage("I selected: I feel overwhelmed") → next LLM turn.
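The extract-and-validate step (5) can be pictured with a hand-rolled check. This is a simplified stand-in for illustration only: the real SDK validates with Zod schemas, and extractJson plus the exact WireAIResponse fields shown here are assumptions:

```typescript
interface WireAIResponse {
  action: "render";
  component: string;
  props: Record<string, unknown>;
}

// Pull the first JSON object out of a possibly chatty LLM reply.
function extractJson(text: string): unknown {
  const start = text.indexOf("{");
  const end = text.lastIndexOf("}");
  if (start === -1 || end <= start) throw new Error("no JSON found in reply");
  return JSON.parse(text.slice(start, end + 1));
}

// Stand-in for the Zod validation pass: reject anything that does not
// match the expected response shape before it reaches the renderer.
function validate(raw: unknown): WireAIResponse {
  const obj = raw as Partial<WireAIResponse>;
  if (
    obj?.action !== "render" ||
    typeof obj.component !== "string" ||
    typeof obj.props !== "object" ||
    obj.props === null
  ) {
    throw new Error("response does not match the WireAIResponse shape");
  }
  return obj as WireAIResponse;
}

const reply =
  'Sure! {"action":"render","component":"SelectionCard","props":{"title":"How do you feel?"}}';
const response = validate(extractJson(reply));
```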
Key features
5 LLM providers
Ollama, LM Studio, OpenAI, Webhook, and A2A Protocol. Swap without changing your UI code.
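Swapping providers means changing one config object. The Ollama config mirrors the minimal example later in this README; the OpenAI and webhook field names are illustrative assumptions, so check the SDK docs for the real option names:

```typescript
// Local Ollama, as in the minimal example below.
const ollama = {
  provider: "ollama",
  baseUrl: "http://localhost:11434",
  model: "llama3",
};

// OpenAI: the apiKey field is an assumption based on typical SDK configs.
const openai = {
  provider: "openai",
  apiKey: process.env.OPENAI_API_KEY,
  model: "gpt-4o-mini",
};

// Hypothetical webhook endpoint that proxies requests to your own backend.
const webhook = { provider: "webhook", url: "https://example.com/llm" };
```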
11 built-in components
SelectionCard, ChipSelectCard, StepList, InfoList, StatusCard, and more.
Zod-validated props
Every prop is validated against the component's schema before rendering. No runtime surprises.
Custom components
Register any React Native component in one file. The LLM learns it automatically.
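A registry entry pairs a render function with a description and a props check the SDK can surface to the LLM. The shape below is a sketch, not the SDK's real registration API, and "MoodDial" is a hypothetical component invented for this example:

```typescript
// Placeholder render function type; in the app this would be a
// React Native component.
type RenderFn = (props: Record<string, unknown>) => unknown;

interface RegistryEntry {
  component: RenderFn;
  description: string;
  // Stand-in for a Zod schema: returns true when props are acceptable.
  validateProps: (props: Record<string, unknown>) => boolean;
}

const registry = new Map<string, RegistryEntry>();

// Register the hypothetical "MoodDial" component.
registry.set("MoodDial", {
  component: (props) => `<MoodDial value=${props.value}>`, // placeholder render
  description: "A dial the user turns to report mood from 0-10",
  validateProps: (p) =>
    typeof p.value === "number" && p.value >= 0 && p.value <= 10,
});

const entry = registry.get("MoodDial")!;
const ok = entry.validateProps({ value: 7 });
```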
A2A Protocol
First-class support for Google's Agent-to-Agent protocol. Connect to any A2A-compatible agent.
Design tokens
Full design system exported: colors, spacing, radii, and typography for consistent custom components.
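Tokens can be consumed directly in custom component styles. The token names below (colors.primary, spacing.md, radii.lg) and their values are illustrative placeholders, not the SDK's guaranteed exports:

```typescript
// Illustrative token shapes; in the app these come from "wireai-rn".
const colors = { primary: "#4F46E5", surface: "#FFFFFF" };
const spacing = { sm: 8, md: 16 };
const radii = { lg: 12 };

// A style object built from tokens, as you might pass to a RN View.
const cardStyle = {
  backgroundColor: colors.surface,
  borderColor: colors.primary,
  padding: spacing.md,
  borderRadius: radii.lg,
};
```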
Install
$ yarn add wireai-rn zod

pod install or a native rebuild is required.

Minimal example
import { View, FlatList, TextInput } from "react-native";
import {
  WireAIProvider, useWireAIThread, useWireAIInput,
  useWireAIAction, ComponentRenderer, defaultComponents,
} from "wireai-rn";

const llm = { provider: "ollama", baseUrl: "http://localhost:11434", model: "llama3" };

export default function App() {
  return (
    <WireAIProvider llm={llm} components={defaultComponents}>
      <ChatScreen />
    </WireAIProvider>
  );
}

function ChatScreen() {
  const { messages, isLoading, sendMessage } = useWireAIThread();
  const { inputText, setInputText, handleSubmit } = useWireAIInput(sendMessage);
  const createCallbacks = useWireAIAction(sendMessage);

  return (
    <View style={{ flex: 1 }}>
      <FlatList
        data={messages}
        keyExtractor={(m) => m.id}
        renderItem={({ item }) => (
          <ComponentRenderer message={item} callbacks={createCallbacks(item.id)} />
        )}
      />
      <TextInput
        value={inputText}
        onChangeText={setInputText}
        onSubmitEditing={handleSubmit}
        placeholder="Type a message..."
      />
    </View>
  );
}