A collection of on-device AI primitives for React Native with first-class Vercel AI SDK support. Run AI models directly on users' devices for privacy-preserving, low-latency inference without server costs.
The AI SDK Profiler plugin captures OpenTelemetry spans from Vercel AI SDK requests and surfaces them in Rozenite DevTools. The DevTools are runtime-agnostic, so they work with both on-device and remote runtimes.
Learn more in the DevTools documentation.
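As a rough sketch of how spans reach the profiler: the Vercel AI SDK emits OpenTelemetry spans for a request when telemetry is enabled on the call, and the plugin surfaces them in Rozenite DevTools. The snippet below uses the Apple provider described later on this page; the plugin and DevTools setup itself is covered in the DevTools documentation.

```typescript
import { generateText } from 'ai';
import { apple } from '@react-native-ai/apple';

// Enabling AI SDK telemetry makes this request emit OpenTelemetry spans,
// which the AI SDK Profiler plugin can pick up and display in Rozenite DevTools.
export async function profiledCompletion(prompt: string): Promise<string> {
  const { text } = await generateText({
    model: apple(),
    prompt,
    experimental_telemetry: { isEnabled: true },
  });
  return text;
}
```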
Native integration with Apple's on-device AI capabilities through @react-native-ai/apple:
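A minimal sketch of calling the Apple provider through the Vercel AI SDK; it assumes the package exposes an AI SDK-compatible `apple` provider, so check the package documentation for the exact model options:

```typescript
import { generateText } from 'ai';
import { apple } from '@react-native-ai/apple';

// Generate a completion entirely on-device using Apple's models.
// `apple()` is assumed to return an AI SDK-compatible language model.
export async function summarizeNotes(notes: string): Promise<string> {
  const { text } = await generateText({
    model: apple(),
    prompt: `Summarize the following notes in one sentence:\n${notes}`,
  });
  return text;
}
```

Because the model ships with the OS, there is no download step before the first call.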
Production-ready with instant availability on supported iOS devices.
Run any GGUF model from Hugging Face locally using llama.rn through @react-native-ai/llama:
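A hedged sketch of wiring a local GGUF model into the Vercel AI SDK. The `llama(...)` provider call and the model path below are illustrative assumptions; the actual model download and initialization flow is described in the package documentation.

```typescript
import { generateText } from 'ai';
// The export name `llama` is assumed here; see the package docs for the real API.
import { llama } from '@react-native-ai/llama';

// Ask a locally stored GGUF model (downloaded ahead of time) for a completion.
export async function askLocalModel(prompt: string): Promise<string> {
  const { text } = await generateText({
    // The model path argument is illustrative; llama.rn loads the GGUF file
    // from the device's file system.
    model: llama('file:///path/to/model.gguf'),
    prompt,
  });
  return text;
}
```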
Cross-platform parity with optimized performance on both iOS and Android.
Run any open-source LLM locally using MLC's optimized runtime through @react-native-ai/mlc:
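A sketch of streaming tokens from a model running in the MLC runtime. The `mlc(...)` provider call and the model identifier are illustrative assumptions; MLC models typically have to be downloaded and prepared before first use, so consult the package documentation for the real setup.

```typescript
import { streamText } from 'ai';
// The export name `mlc` is assumed here; see the package docs for the real API.
import { mlc } from '@react-native-ai/mlc';

// Stream a reply token-by-token from an open-source model in MLC's runtime.
export async function streamReply(prompt: string): Promise<string> {
  const result = streamText({
    // The model identifier is illustrative; MLC publishes its own list of prebuilt models.
    model: mlc('Llama-3.2-3B-Instruct'),
    prompt,
  });

  let reply = '';
  for await (const chunk of result.textStream) {
    reply += chunk; // e.g. append to React state to render the stream live
  }
  return reply;
}
```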
MLC support is experimental and not recommended for production use yet.
Support for Google's on-device models is planned for future releases.
Get started by choosing the approach that fits your needs!