
Expo SDK 55: A Solid Foundation for Agentic UI

Malik Chohra

April 5, 2026 · 5 min read

Why the latest Expo release is a strong architectural stack for building generative AI mobile apps.

Expo SDK 55 provides the performance foundation that agentic UI apps demand. Bridgeless mode eliminates the serialization overhead of passing LLM JSON responses across the React Native bridge, the New Architecture’s Fabric renderer mounts native components with lower frame-time variance, and the updated Expo Go sandbox runs WireAI’s OllamaAdapter without a development build.

Earlier Expo SDK versions could run WireAI, but high-frequency agent interactions (rapid component swaps, streaming state updates, nested animations) produced frame drops on mid-range Android devices. Expo SDK 55 changes that: the New Architecture is now the default rather than an opt-in experiment, and the performance difference on generative UI workloads is measurable.

What changed in Expo SDK 55

SDK 55 ships several changes that directly affect AI-native apps:

  • New Architecture on by default: Fabric renderer and TurboModules are enabled for all new projects. The old bridge is still available as a fallback but is no longer the default path.
  • Bridgeless mode stable: The JavaScript-to-native communication layer now uses JSI (JavaScript Interface) directly, removing the async serialization bottleneck that caused stuttering when mounting large component trees.
  • Synchronous SQLite reads: expo-sqlite now supports synchronous reads from the JS thread via JSI. This is useful for WireAI thread persistence: conversation history can be read synchronously at startup without an async waterfall.
  • Updated Expo Go: SDK 55’s Expo Go supports the full WireAI OSS SDK including the OllamaAdapter. No development build required for local LLM prototyping.
  • React Native 0.76: Ships with the latest Hermes 2.0, improved TypeScript support for JSX transforms, and better sourcemap generation for debugging agent response parsing errors.

Why Bridgeless mode matters for WireAI specifically

Every WireAI component render involves a JSON parse, a Zod validation run, and a React re-render mounting new native views. On the old bridge architecture, each of these steps crossed an asynchronous serialization boundary. Under load (say, an agent rapidly rendering 3–4 components in quick succession during an onboarding flow), this produced measurable jank: 100–200ms frame delays that showed up as stutter on Android devices.

Bridgeless mode (JSI) removes the serialization step. Native module calls happen synchronously on the same thread, so the JSON parse, Zod validation, and component mount all complete within the same JavaScript frame budget. On a mid-range Android device, the difference is roughly 60ms versus 15ms for a single WireAI component render cycle, which puts it below the threshold where users perceive lag.
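As a rough illustration, the JS-side work of one render cycle (parse the agent’s JSON, validate it, decide what to mount) can be timed in isolation. The AgentResponse shape and validateResponse helper below are hypothetical stand-ins for WireAI’s internal parse-and-validate step, not the library’s API:

```typescript
// Illustrative sketch: timing the JS-side steps of one render cycle.
// AgentResponse and validateResponse are hypothetical stand-ins for
// WireAI's internal JSON parse + schema validation step.
type AgentResponse = { component: string; props: Record<string, unknown> };

function validateResponse(value: unknown): AgentResponse | null {
  if (typeof value !== "object" || value === null) return null;
  const v = value as Record<string, unknown>;
  if (typeof v.component !== "string") return null;
  if (typeof v.props !== "object" || v.props === null) return null;
  return { component: v.component, props: v.props as Record<string, unknown> };
}

// A typical agent reply: which component to mount, with what props.
const raw =
  '{"component":"MoodCheckIn","props":{"question":"How are you feeling?","options":["Good","Okay","Rough"]}}';

const t0 = performance.now();
const response = validateResponse(JSON.parse(raw));
const elapsedMs = performance.now() - t0;

console.log(response?.component, `${elapsedMs.toFixed(2)}ms`);
```

On the old bridge, this same work also paid an async serialization hop for every native call it triggered; under JSI it all stays inside one frame budget.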

Installing WireAI in an Expo SDK 55 project

# Create a new Expo SDK 55 project
npx create-expo-app@latest my-ai-app --template blank-typescript

cd my-ai-app

# Install WireAI and its peer dependency
npx expo install wireai-rn zod

# Start with Expo Go (no development build needed for local LLMs)
npx expo start

WireAI’s core package uses only standard React Native APIs (fetch, View, Text, useState, useEffect) and has no native module dependencies. This is why it works in Expo Go without ejecting. Features that require native modules (MMKV for thread persistence, on-device inference) will require an Expo development build, but basic chat plus generative UI works out of the box.

Expo Router and WireAI: the agent-aware navigation pattern

Expo SDK 55 ships Expo Router v4, which uses file-system based routing with React Native’s native stack navigator under the hood. The cleanest WireAI integration puts the WireAIProvider in the root layout so conversation state persists across tab navigation:

// app/_layout.tsx: root layout with the WireAI provider
import { WireAIProvider, OllamaAdapter } from 'wireai-rn';
import { appComponents } from '@/components/wireai';
import { Stack } from 'expo-router';

export default function RootLayout() {
  return (
    <WireAIProvider
      adapter={new OllamaAdapter({ baseUrl: 'http://localhost:11434', model: 'llama3' })}
      components={appComponents}
    >
      <Stack>
        <Stack.Screen name="(tabs)" options={{ headerShown: false }} />
      </Stack>
    </WireAIProvider>
  );
}

With the provider at the root, the useWireAIThread hook maintains conversation context as the user navigates between tabs. The agent remembers what happened in the onboarding tab when the user moves to the dashboard tab; no manual state lifting required.

Thread persistence with expo-sqlite

Expo SDK 55’s synchronous SQLite reads unlock a clean thread persistence pattern: load the conversation history synchronously at app start, then write new messages synchronously after each turn. No flicker, no loading state before the first render. The schema is simple: one table for threads, one for messages, with the component name and props stored as JSON:

import * as SQLite from 'expo-sqlite';

const db = SQLite.openDatabaseSync('wireai.db');

// Initialize schema once
db.execSync(`
  CREATE TABLE IF NOT EXISTS messages (
    id TEXT PRIMARY KEY,
    thread_id TEXT,
    role TEXT,
    content TEXT,
    component TEXT,
    props TEXT,
    created_at INTEGER
  )
`);

// Read conversation history synchronously, no loading spinner
function loadThread(threadId: string) {
  return db.getAllSync(
    'SELECT * FROM messages WHERE thread_id = ? ORDER BY created_at ASC',
    [threadId]
  );
}
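The write half of the pattern can be sketched the same way. serializeMessage below is a hypothetical helper that flattens one conversation turn, including the generative component name and its props, into the column order of the messages table; in the app it would feed expo-sqlite’s runSync with an INSERT statement:

```typescript
// Hypothetical helper: flatten one conversation turn into the column
// order of the messages table (props stored as a JSON string).
type WireMessage = {
  id: string;
  threadId: string;
  role: "user" | "assistant";
  content: string;
  component?: string; // e.g. "MoodCheckIn" when the turn rendered UI
  props?: Record<string, unknown>;
};

function serializeMessage(m: WireMessage): (string | number | null)[] {
  return [
    m.id,
    m.threadId,
    m.role,
    m.content,
    m.component ?? null,
    m.props ? JSON.stringify(m.props) : null,
    Date.now(),
  ];
}

// In the app, after each turn:
// db.runSync(
//   'INSERT INTO messages VALUES (?, ?, ?, ?, ?, ?, ?)',
//   serializeMessage(message)
// );
```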

Sharing Zod schemas between Expo and Next.js

Because WireAI component schemas are plain Zod objects with no React Native imports, they work identically in a Next.js web app. In a monorepo, keep your component schemas in a shared packages/ai-schemas package. Your Expo app and your Next.js admin dashboard both import from the same source, so the agent’s component selection is validated against the same schemas on both platforms.

// packages/ai-schemas/src/mood.ts, shared between mobile and web
import { z } from 'zod';

export const MoodCheckInSchema = z.object({
  question: z.string(),
  options: z.array(z.string()).min(2).max(5),
});

// Expo app: render as native component
// Next.js dashboard: validate incoming webhook data
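For readers who want to see exactly what that schema enforces, here are the same constraints spelled out as a plain TypeScript type guard. This is purely illustrative; in both apps you would call MoodCheckInSchema.safeParse rather than hand-roll the check:

```typescript
// Illustrative: the constraints MoodCheckInSchema encodes, written as
// a plain type guard (a string question plus 2 to 5 string options).
type MoodCheckIn = { question: string; options: string[] };

function isMoodCheckIn(v: unknown): v is MoodCheckIn {
  if (typeof v !== "object" || v === null) return false;
  const o = v as { question?: unknown; options?: unknown };
  return (
    typeof o.question === "string" &&
    Array.isArray(o.options) &&
    o.options.length >= 2 &&
    o.options.length <= 5 &&
    o.options.every((opt) => typeof opt === "string")
  );
}
```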

Build on the best Expo release yet. Run npx create-expo-app@latest and npx expo install wireai-rn zod.