Introduction

react-native-ai brings on-device AI to React Native apps, allowing you to run large language models directly on users' devices.

Why On-Device AI?

  • Privacy: All processing happens locally—no data leaves the device
  • Speed: Instant responses without network latency
  • Offline: Works anywhere, even without internet

What's Supported

MLC LLM Engine: Run popular models like Mistral, Llama, and Phi with our MLC-powered package. Compatible with Vercel AI SDK for easy integration.
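As a rough illustration of what that AI SDK integration could look like, here is a minimal sketch using the SDK's generateText call; note that the mlc provider import and the model identifier below are assumptions for illustration, not the package's confirmed API:

```ts
// Minimal sketch: the provider import and model id are assumptions.
import { generateText } from 'ai';
// Assumed entry point for the MLC-powered provider; check the package
// docs for the actual export name.
import { mlc } from 'react-native-ai';

export async function askLocalModel(prompt: string): Promise<string> {
  // generateText is the standard Vercel AI SDK call; the model argument
  // would come from the on-device MLC provider once the model has been
  // downloaded to the device.
  const { text } = await generateText({
    model: mlc('Mistral-7B-Instruct'), // assumed model identifier
    prompt,
  });
  return text;
}
```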

Apple Intelligence: Native support for Apple's on-device foundation models with @react-native-ai/apple, featuring purpose-built mobile architecture and dynamic adapters.
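For the Apple package, a similarly hedged sketch, assuming it exposes an apple provider that plugs into the same AI SDK calls (the export name and its call signature are assumptions based on the package name):

```ts
// Minimal sketch: the `apple` export and its signature are assumptions.
import { generateText } from 'ai';
import { apple } from '@react-native-ai/apple';

const { text } = await generateText({
  model: apple(), // assumed: selects the default on-device foundation model
  prompt: 'Summarize this note in one sentence.',
});
console.log(text);
```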

Both packages support streaming through the same unified API; the MLC engine runs on both iOS and Android, while the Apple package targets Apple platforms.
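Streaming with either provider would go through the AI SDK's streamText. The sketch below reuses the assumed mlc export from above:

```ts
// Streaming sketch: tokens arrive incrementally, so the UI can render
// partial output instead of waiting for the full response.
import { streamText } from 'ai';
import { mlc } from 'react-native-ai'; // assumed export, as above

const { textStream } = await streamText({
  model: mlc('Mistral-7B-Instruct'), // assumed model identifier
  prompt: 'Write a haiku about offline AI.',
});

for await (const chunk of textStream) {
  // Append each streamed chunk to component state as it arrives.
  console.log(chunk);
}
```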

Get started by choosing the approach that fits your needs!
