React Native Quick Start¶
Add liveness detection to your React Native app using the @moveris/react-native SDK.
In plain terms
Install the package, add camera permissions (iOS/Android), wrap your app in MoverisProvider, and render LivenessView. The component handles camera access, frame capture, and API calls. Route requests through your backend to keep the API key secure.
Installation¶
```shell
npm install @moveris/react-native @moveris/shared react-native-vision-camera react-native-reanimated
```
For iOS, also install the native CocoaPods dependencies:
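The usual step for a bare React Native workflow (Expo managed projects handle this automatically):

```shell
cd ios && pod install
```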
Platform Setup¶
iOS (Info.plist)¶
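Declare why the app needs the camera; iOS shows this string in the permission prompt, and the wording is yours to choose:

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to verify that you are a live person.</string>
```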
Android (AndroidManifest.xml)¶
```xml
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" android:required="true" />
<uses-feature android:name="android.hardware.camera.front" android:required="true" />
```
Reanimated Setup¶
Add the Reanimated Babel plugin to your babel.config.js; it must be the last entry in the plugins array:

```javascript
module.exports = {
  presets: ['module:@react-native/babel-preset'],
  // react-native-reanimated/plugin must be listed last.
  plugins: ['react-native-reanimated/plugin'],
};
```
Minimal Example¶
```jsx
import { MoverisProvider, LivenessView } from '@moveris/react-native';

function App() {
  return (
    <MoverisProvider
      config={{
        apiKey: 'sk-your-api-key', // development only; use a backend proxy in production
        baseUrl: 'https://api.moveris.com',
      }}
    >
      <LivenessCheck />
    </MoverisProvider>
  );
}

function LivenessCheck() {
  return (
    <LivenessView
      onResult={(result) => {
        console.log('Verdict:', result.verdict);
        console.log('Score:', result.realScore);
      }}
      onError={(error) => {
        console.error('Error:', error.message);
      }}
    />
  );
}
```
API Key Security
Never include your API key in mobile app code. Always route requests through your backend server.
Using the Hook¶
For a custom UI, use the useLiveness hook:
```jsx
import { View, Text, Pressable, StyleSheet } from 'react-native';
import { MoverisProvider, useLiveness } from '@moveris/react-native';

function LivenessCheck() {
  const {
    status,
    result,
    error,
    framesReceived,
    framesRequired,
    feedbackMessage,
    start,
    reset,
  } = useLiveness();

  return (
    <View style={styles.container}>
      <Text style={styles.status}>Status: {status}</Text>
      <Text style={styles.progress}>
        Frames: {framesReceived}/{framesRequired}
      </Text>
      {feedbackMessage && (
        <Text style={styles.feedback}>{feedbackMessage}</Text>
      )}
      {status === 'idle' && (
        <Pressable style={styles.button} onPress={start}>
          <Text style={styles.buttonText}>Start Verification</Text>
        </Pressable>
      )}
      {result && (
        <View style={[
          styles.result,
          result.verdict === 'live' ? styles.live : styles.fake,
        ]}>
          <Text style={styles.verdict}>{result.verdict.toUpperCase()}</Text>
          <Text style={styles.score}>
            Score: {result.realScore.toFixed(0)}
          </Text>
        </View>
      )}
      {error && <Text style={styles.error}>{error.message}</Text>}
      {(result || error) && (
        <Pressable style={styles.button} onPress={reset}>
          <Text style={styles.buttonText}>Try Again</Text>
        </Pressable>
      )}
    </View>
  );
}

const styles = StyleSheet.create({
  container: { flex: 1, justifyContent: 'center', alignItems: 'center' },
  status: { fontSize: 16, color: '#666', marginBottom: 8 },
  progress: { fontSize: 14, color: '#999' },
  feedback: { fontSize: 16, color: '#3b82f6', marginVertical: 12, textAlign: 'center' },
  button: { backgroundColor: '#3b82f6', paddingHorizontal: 32, paddingVertical: 16, borderRadius: 8, marginTop: 16 },
  buttonText: { color: '#fff', fontSize: 18, fontWeight: '600' },
  result: { padding: 20, borderRadius: 12, marginTop: 16, alignItems: 'center', width: '80%' },
  live: { backgroundColor: '#22c55e' },
  fake: { backgroundColor: '#ef4444' },
  verdict: { fontSize: 28, fontWeight: 'bold', color: '#fff' },
  score: { fontSize: 16, color: '#fff', marginTop: 4 },
  error: { color: '#ef4444', marginTop: 12 },
});
```
With Face Detection¶
Enable smart frame capture using ML Kit (Android/iOS):
```jsx
import {
  LivenessView,
  createMLKitAdapter,
} from '@moveris/react-native';

function SmartLivenessCheck() {
  return (
    <LivenessView
      enableFaceDetection
      faceDetectorAdapter={createMLKitAdapter()}
      onResult={(result) => console.log(result)}
      onError={(error) => console.error(error)}
    />
  );
}
```
Or with Expo Face Detector:
```jsx
import {
  LivenessView,
  createExpoFaceDetectorAdapter,
} from '@moveris/react-native';

function ExpoLivenessCheck() {
  return (
    <LivenessView
      enableFaceDetection
      faceDetectorAdapter={createExpoFaceDetectorAdapter()}
      onResult={(result) => console.log(result)}
    />
  );
}
```
Proxy Setup¶
In production, route API calls through your backend:
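On the device, point the SDK at your own server and leave the key out entirely. A minimal sketch, assuming MoverisProvider accepts a config without `apiKey`; the proxy URL is illustrative:

```javascript
// Client-side config sketch: no apiKey ever ships in the app bundle.
// The backend at this (illustrative) URL attaches the real key.
const config = {
  baseUrl: 'https://your-backend.example.com/api/liveness',
};
```

Pass this object to MoverisProvider in place of the development config shown earlier.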
Your backend forwards requests to Moveris with the API key:
```javascript
// Express.js example. Requires the JSON body parser: app.use(express.json());
app.post('/api/liveness/*', async (req, res) => {
  const response = await fetch(`https://api.moveris.com${req.path}`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-API-Key': process.env.MOVERIS_API_KEY, // key stays on the server
    },
    body: JSON.stringify(req.body),
  });
  res.status(response.status).json(await response.json());
});
```
Next Steps¶
- Components -- All React Native components
- Hooks -- All React Native hooks
- LivenessClient Reference -- Direct API client usage