A collection of on-device AI primitives for React Native with first-class Vercel AI SDK support. Run AI models directly on users' devices for privacy-preserving, low-latency inference without server costs.
- **Apple**: Native integration with Apple's on-device AI capabilities through `@react-native-ai/apple`. Production-ready, with instant availability on supported iOS devices.
- **MLC**: Run any open-source LLM locally using MLC's optimized runtime through `@react-native-ai/mlc`. MLC support is experimental and not yet recommended for production use.
- **Google**: Support for Google's on-device models is planned for future releases.
Get started by choosing the approach that fits your needs!