Hooks

Four hooks cover everything from conversation management to storage persistence. All of them must be called inside a WireAIProvider.

useWireAIThread

The main conversation hook. Manages message history, loading state, and LLM communication.

```ts
const {
  messages,    // Message[]   : full conversation history
  isLoading,   // boolean     : true while waiting for an LLM response
  error,       // string|null : last error message
  sendMessage, // (text: string, options?: SendMessageOptions) => void
  reset,       // () => void  : clears history and re-creates the adapter
  abort,       // () => void  : cancels the in-flight request
} = useWireAIThread();
```

sendMessage(text, options?)

Appends a user message and triggers an LLM call. It is safe to call from button presses; concurrent calls are dropped unless you pass `interruptLoading: true`.

```ts
// Basic send
sendMessage("Show me a trip to Tokyo");

// Interrupt an in-flight request (e.g. a "Stop" button)
sendMessage("Let me rethink that", { interruptLoading: true });
```

reset()

Clears message history, resets loading/error state, aborts any in-flight request, and re-creates the LLM adapter. This is important for A2A: it clears the server-side contextId so the next session starts fresh.

Background behavior

When the app moves to the background (AppState → "background"), any in-flight LLM request is automatically aborted and isLoading is set to false. This prevents dangling network calls.

Message shape

```ts
type Message = {
  id: string;                  // unique ID
  role: "user" | "assistant";
  content: string;             // raw LLM response string (for assistant messages)
  response?: WireAIResponse;   // parsed + validated response (assistant messages only)
  timestamp: number;
};

type WireAIResponse =
  | { action: "render"; component: string; props: Record<string, unknown>; message?: string }
  | { action: "ask"; message: string };
```
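Because `WireAIResponse` is a discriminated union on `action`, TypeScript narrows it cleanly. A minimal sketch of a helper that picks the display text for an assistant message (the helper name and the `[component]` fallback are illustrative, not part of the library):

```typescript
type WireAIResponse =
  | { action: "render"; component: string; props: Record<string, unknown>; message?: string }
  | { action: "ask"; message: string };

// Pick the text an assistant bubble should show, narrowing on `action`.
function displayText(response: WireAIResponse): string {
  switch (response.action) {
    case "render":
      // A caption may optionally accompany the rendered component.
      return response.message ?? `[${response.component}]`;
    case "ask":
      return response.message;
  }
}
```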

useWireAIInput

Lightweight input state manager. Keeps the text field value in sync and clears it on submit.

```ts
const {
  inputText,    // string: current input value
  setInputText, // (text: string) => void
  handleSubmit, // () => void: calls sendMessage(inputText), then clears
} = useWireAIInput(sendMessage);
```

Usage

```tsx
const { inputText, setInputText, handleSubmit } = useWireAIInput(sendMessage);

<TextInput
  value={inputText}
  onChangeText={setInputText}
  onSubmitEditing={handleSubmit}
  returnKeyType="send"
/>
<Button title="Send" onPress={handleSubmit} />
```

useWireAIAction

Factory hook that returns a function for creating callback objects. Pass the callbacks to ComponentRenderer; they translate component interactions into natural-language messages sent back to the LLM.

```ts
const createCallbacks = useWireAIAction(sendMessage);

// Per message: pass to ComponentRenderer
const callbacks = createCallbacks(message.id);

// callbacks contains:
// onConfirm(payload?) → sends "I selected: <payload>" or "Yes, confirmed."
// onDeny()            → sends "No, cancel that."
// onSubmit(value)     → sends "My answer is: <value>"
// onSelect(value)     → sends "I selected: <value>"
// onPress(label)      → sends "I tapped: <label>"
// onContinue()        → sends "continue."
// onCancel()          → sends "cancel."
```
How callbacks reach components
ComponentRenderer merges these callbacks into the component's props before rendering. Your components receive them as InjectedProps. Use onSubmit for form values, onSelect for selections, onContinue for CTAs.
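For intuition, the factory can be thought of as closing over `sendMessage` and mapping each interaction to the messages listed above. This is a simplified sketch for illustration, not the library source; the real hook also receives the originating message id via `createCallbacks(message.id)`:

```typescript
type SendMessage = (text: string) => void;

// Simplified sketch: each callback turns a component interaction into
// the natural-language message the LLM will see next.
function makeCreateCallbacks(sendMessage: SendMessage) {
  return (_messageId: string) => ({
    onConfirm: (payload?: string) =>
      sendMessage(payload ? `I selected: ${payload}` : "Yes, confirmed."),
    onDeny: () => sendMessage("No, cancel that."),
    onSubmit: (value: string) => sendMessage(`My answer is: ${value}`),
    onSelect: (value: string) => sendMessage(`I selected: ${value}`),
    onPress: (label: string) => sendMessage(`I tapped: ${label}`),
    onContinue: () => sendMessage("continue."),
    onCancel: () => sendMessage("cancel."),
  });
}
```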

useLLMConfigStorage

Persists LLM configuration to any async storage backend. Returns the loaded config, a loading flag, and save/clear helpers.

```ts
const {
  config,      // LocalLLMConfig: current (or default) config
  isLoaded,    // boolean: false until storage has been read
  saveConfig,  // (config: LocalLLMConfig) => Promise<void>
  clearConfig, // () => Promise<void>
} = useLLMConfigStorage(storageBackend, defaultConfig);
```

StorageBackend interface

```ts
interface StorageBackend {
  getItem(key: string): Promise<string | null>;
  setItem(key: string, value: string): Promise<void>;
  deleteItem(key: string): Promise<void>;
}
```
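Any object with these three methods works. For unit tests or quick prototypes, an in-memory backend is a handy sketch (the factory name is illustrative):

```typescript
interface StorageBackend {
  getItem(key: string): Promise<string | null>;
  setItem(key: string, value: string): Promise<void>;
  deleteItem(key: string): Promise<void>;
}

// In-memory backend: nothing survives an app restart,
// which is exactly what you want in tests.
function createMemoryBackend(): StorageBackend {
  const store = new Map<string, string>();
  return {
    getItem: async (key) => store.get(key) ?? null,
    setItem: async (key, value) => {
      store.set(key, value);
    },
    deleteItem: async (key) => {
      store.delete(key);
    },
  };
}
```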

Example backends

```ts
import { MMKV } from "react-native-mmkv";

const mmkv = new MMKV();

export const mmkvBackend: StorageBackend = {
  getItem: async (key) => mmkv.getString(key) ?? null,
  setItem: async (key, value) => mmkv.set(key, value),
  deleteItem: async (key) => mmkv.delete(key),
};
```

```ts
import AsyncStorage from "@react-native-async-storage/async-storage";

export const asyncStorageBackend: StorageBackend = {
  getItem: (key) => AsyncStorage.getItem(key),
  setItem: (key, value) => AsyncStorage.setItem(key, value),
  deleteItem: (key) => AsyncStorage.removeItem(key),
};
```

```ts
import * as SecureStore from "expo-secure-store";

export const secureBackend: StorageBackend = {
  getItem: (key) => SecureStore.getItemAsync(key),
  setItem: (key, value) => SecureStore.setItemAsync(key, value),
  deleteItem: (key) => SecureStore.deleteItemAsync(key),
};
```

Usage with WireAIProvider

```tsx
function App() {
  const { config, isLoaded, saveConfig } = useLLMConfigStorage(
    secureBackend,
    { provider: "openai", baseUrl: "https://api.openai.com", model: "gpt-4o-mini" }
  );

  // Wait for storage to load before rendering
  if (!isLoaded) return null;

  return (
    <WireAIProvider
      key={`${config.provider}:${config.baseUrl}:${config.model}`}
      llm={config}
      components={components}
    >
      <ChatScreen onSettingsChange={saveConfig} />
    </WireAIProvider>
  );
}
```
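The `key` prop forces a full remount of WireAIProvider (and thus a fresh adapter) whenever any part of the config changes. A tiny helper makes that derivation explicit; the helper name and the `LocalLLMConfig` field list here are assumptions based on the example above:

```typescript
// Assumed shape, matching the default config in the example above.
type LocalLLMConfig = { provider: string; baseUrl: string; model: string };

// Derive a stable remount key from the fields that require a new
// adapter when they change. Any field change produces a new key,
// so React unmounts the old provider and mounts a fresh one.
function configKey(config: LocalLLMConfig): string {
  return `${config.provider}:${config.baseUrl}:${config.model}`;
}
```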