React Native Examples

Mobile integration examples for React Native apps using Expo Camera and React Native Vision Camera.

In plain terms

These examples show how to add liveness detection to mobile apps (iOS and Android) using the device camera. Use Expo Camera for simpler setups or React Native Vision Camera for more control. Always proxy API calls through your backend to protect the API key.

Moveris API (v2)

These examples use Moveris API (v2) endpoints. The source field is required and should be set to "live" for real-time camera capture.

Backend Proxy Required

Never include your API key in mobile app code. Always route requests through your backend server to keep credentials secure.
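A minimal proxy sketch using Node's built-in http module. The upstream URL, the Bearer auth scheme, and the MOVERIS_API_KEY environment variable name are assumptions for illustration; adapt them to your actual deployment:

```typescript
import http from 'http';

// Hypothetical upstream endpoint and key source; replace with your real values.
const UPSTREAM_URL = 'https://api.example.com/v2/liveness';
const API_KEY = process.env.MOVERIS_API_KEY ?? '';

// Pure helper so the forwarding logic is easy to test: the mobile client
// never sends the key; the proxy attaches it server-side.
export function buildUpstreamRequest(
  clientBody: string,
  clientHeaders: Record<string, string | undefined>,
) {
  const headers: Record<string, string> = {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${API_KEY}`, // assumed auth scheme
  };
  // Pass through the optional v2 model-selection header if the client sent it
  // (Node lowercases incoming header names).
  const modelVersion = clientHeaders['x-model-version'];
  if (modelVersion) headers['X-Model-Version'] = modelVersion;
  return { url: UPSTREAM_URL, method: 'POST' as const, headers, body: clientBody };
}

const server = http.createServer((req, res) => {
  if (req.method !== 'POST' || req.url !== '/api/liveness') {
    res.writeHead(404).end();
    return;
  }
  let body = '';
  req.on('data', chunk => (body += chunk));
  req.on('end', async () => {
    const upstream = buildUpstreamRequest(body, req.headers as Record<string, string>);
    const resp = await fetch(upstream.url, {
      method: upstream.method,
      headers: upstream.headers,
      body: upstream.body,
    });
    res.writeHead(resp.status, { 'Content-Type': 'application/json' });
    res.end(await resp.text());
  });
});

// Start the proxy:
// server.listen(3000, () => console.log('proxy listening on :3000'));
```

In production you would also add rate limiting and request validation here, so a compromised client cannot burn through your API quota.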

Model selection (v1 and v2)

Both flows remain supported, depending on your integration. Send model in the request body for v1 compatibility, or send the X-Model-Version: latest header together with frame_count for v2 alias-based resolution.
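The two flows differ only in how the model is selected. A sketch of the request construction, using the field names from the examples below (buildLivenessRequest is a hypothetical helper, not part of any SDK):

```typescript
interface Frame {
  index: number;
  timestamp_ms: number;
  pixels: string; // base64-encoded JPEG
}

type ModelFlow =
  | { kind: 'v1'; model: string }       // model name sent in the body
  | { kind: 'v2'; frameCount: number }; // alias resolved via header

// Build headers and body for either flow.
function buildLivenessRequest(frames: Frame[], flow: ModelFlow) {
  const headers: Record<string, string> = { 'Content-Type': 'application/json' };
  const body: Record<string, unknown> = {
    session_id: `session-${Date.now()}`,
    source: 'live', // required: real-time camera capture
    frames,
  };
  if (flow.kind === 'v1') {
    body.model = flow.model;               // e.g. 'mixed-10-v2'
  } else {
    headers['X-Model-Version'] = 'latest'; // v2 alias-based resolution
    body.frame_count = flow.frameCount;    // e.g. 10
  }
  return { headers, body };
}
```

Pick one flow per app version rather than mixing them per request; it keeps backend logging and model auditing simpler.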

Expo Camera

The easiest way to add liveness detection to Expo projects. Expo Camera provides built-in base64 encoding, making frame capture straightforward.

import { CameraView, useCameraPermissions } from 'expo-camera';
import { useRef, useState } from 'react';
import { Button, View, Text, StyleSheet, ActivityIndicator } from 'react-native';

interface Frame {
  index: number;
  timestamp_ms: number;
  pixels: string;
}

interface LivenessResult {
  verdict: 'live' | 'fake';
  real_score: number;
  score: number;
  session_id: string;
}

export default function LivenessScreen() {
  const [permission, requestPermission] = useCameraPermissions();
  const [isChecking, setIsChecking] = useState(false);
  const [result, setResult] = useState<LivenessResult | null>(null);
  const [error, setError] = useState<string | null>(null);
  const cameraRef = useRef<CameraView>(null);

  const captureFrames = async (): Promise<Frame[]> => {
    const frames: Frame[] = [];

    for (let i = 0; i < 10; i++) {
      const photo = await cameraRef.current?.takePictureAsync({
        base64: true,
        quality: 0.7,
        skipProcessing: true,
      });

      if (photo?.base64) {
        frames.push({
          index: i,
          timestamp_ms: i * 100,
          pixels: photo.base64,
        });
      }

      await new Promise(resolve => setTimeout(resolve, 100));
    }

    return frames;
  };

  const checkLiveness = async () => {
    setIsChecking(true);
    setError(null);
    setResult(null);

    try {
      const frames = await captureFrames();

      const requiredFrames = 10;  // mixed-10-v2
      if (frames.length < requiredFrames) {
        throw new Error(`Only captured ${frames.length} frames (need ${requiredFrames})`);
      }

      // Replace with your backend URL
      const response = await fetch('YOUR_BACKEND_URL/api/liveness', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          // v2 flow (optional): 'X-Model-Version': 'latest',
        },
        body: JSON.stringify({
          session_id: `session-${Date.now()}`,
          source: 'live',
          // v1 flow:
          model: 'mixed-10-v2',
          // v2 flow (alternative):
          // frame_count: 10,
          frames,
        }),
      });

      const envelope = await response.json();
      if (!envelope.success) {
        throw new Error(envelope.message || 'Verification failed');
      }

      setResult(envelope.data);
    } catch (err) {
      setError(err instanceof Error ? err.message : 'Unknown error');
    } finally {
      setIsChecking(false);
    }
  };

  if (!permission) {
    return <View style={styles.container}><ActivityIndicator /></View>;
  }

  if (!permission.granted) {
    return (
      <View style={styles.container}>
        <Text style={styles.text}>Camera permission required</Text>
        <Button title="Grant Permission" onPress={requestPermission} />
      </View>
    );
  }

  return (
    <View style={styles.container}>
      <CameraView
        ref={cameraRef}
        style={styles.camera}
        facing="front"
      >
        <View style={styles.faceGuide} />
      </CameraView>

      <Button
        title={isChecking ? 'Verifying...' : 'Verify Liveness'}
        onPress={checkLiveness}
        disabled={isChecking}
      />

      {result && (
        <View style={[
          styles.resultCard,
          result.verdict === 'live' ? styles.live : styles.fake
        ]}>
          <Text style={styles.verdict}>{result.verdict.toUpperCase()}</Text>
          <Text style={styles.score}>Score: {(result.score * 100).toFixed(1)}%</Text>
        </View>
      )}

      {error && <Text style={styles.error}>{error}</Text>}
    </View>
  );
}

const styles = StyleSheet.create({
  container: { flex: 1, backgroundColor: '#000' },
  camera: { flex: 1 },
  faceGuide: {
    position: 'absolute',
    top: '20%',
    left: '15%',
    width: '70%',
    height: '45%',
    borderWidth: 2,
    borderColor: 'rgba(255, 255, 255, 0.5)',
    borderRadius: 150,
  },
  text: { color: '#fff', textAlign: 'center', marginBottom: 16 },
  resultCard: { padding: 16, margin: 16, borderRadius: 8, alignItems: 'center' },
  live: { backgroundColor: '#22c55e' },
  fake: { backgroundColor: '#ef4444' },
  verdict: { fontSize: 24, fontWeight: 'bold', color: '#fff' },
  score: { fontSize: 16, color: '#fff', marginTop: 4 },
  error: { color: '#ef4444', textAlign: 'center', margin: 16 },
});

Installation

# For Expo projects
npx expo install expo-camera

Vision Camera (High Performance)

Use this for bare React Native projects or when you need maximum performance; React Native Vision Camera offers faster capture and native frame processing.

import { useRef, useState, useCallback } from 'react';
import { View, Text, StyleSheet, Pressable } from 'react-native';
import {
  Camera,
  useCameraDevice,
  useCameraPermission,
  PhotoFile
} from 'react-native-vision-camera';
import RNFS from 'react-native-fs';

interface Frame {
  index: number;
  timestamp_ms: number;
  pixels: string;
}

interface LivenessResult {
  verdict: 'live' | 'fake';
  real_score: number;
  score: number;
  session_id: string;
}

export default function LivenessScreen() {
  const device = useCameraDevice('front');
  const { hasPermission, requestPermission } = useCameraPermission();
  const cameraRef = useRef<Camera>(null);
  const [isChecking, setIsChecking] = useState(false);
  const [result, setResult] = useState<LivenessResult | null>(null);

  const captureFrames = useCallback(async (): Promise<Frame[]> => {
    const frames: Frame[] = [];

    for (let i = 0; i < 10; i++) {
      if (!cameraRef.current) {
        throw new Error('Camera not ready');
      }
      const photo: PhotoFile = await cameraRef.current.takePhoto({
        qualityPrioritization: 'speed',
        flash: 'off',
      });

      // Read file as base64
      const base64 = await RNFS.readFile(photo.path, 'base64');

      frames.push({
        index: i,
        timestamp_ms: i * 100,
        pixels: base64,
      });

      // Cleanup temp file
      await RNFS.unlink(photo.path);

      await new Promise(r => setTimeout(r, 100));
    }

    return frames;
  }, []);

  const checkLiveness = useCallback(async () => {
    setIsChecking(true);

    try {
      const frames = await captureFrames();

      const response = await fetch('YOUR_BACKEND_URL/api/liveness', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          // v2 flow (optional): 'X-Model-Version': 'latest',
        },
        body: JSON.stringify({
          session_id: `session-${Date.now()}`,
          source: 'live',
          // v1 flow:
          model: 'mixed-10-v2',
          // v2 flow (alternative):
          // frame_count: 10,
          frames,
        }),
      });

      const envelope = await response.json();
      if (!envelope.success) {
        throw new Error(envelope.message || 'Verification failed');
      }
      setResult(envelope.data);
    } catch (error) {
      console.error('Liveness check failed:', error);
    } finally {
      setIsChecking(false);
    }
  }, [captureFrames]);

  if (!hasPermission) {
    return (
      <View style={styles.container}>
        <Pressable style={styles.button} onPress={requestPermission}>
          <Text style={styles.buttonText}>Grant Camera Permission</Text>
        </Pressable>
      </View>
    );
  }

  if (!device) {
    return (
      <View style={styles.container}>
        <Text style={styles.text}>No front camera found</Text>
      </View>
    );
  }

  return (
    <View style={styles.container}>
      <Camera
        ref={cameraRef}
        style={styles.camera}
        device={device}
        isActive={true}
        photo={true}
      />

      <View style={styles.overlay}>
        <View style={styles.faceGuide} />
      </View>

      <Pressable
        style={[styles.button, isChecking && styles.buttonDisabled]}
        onPress={checkLiveness}
        disabled={isChecking}
      >
        <Text style={styles.buttonText}>
          {isChecking ? 'Verifying...' : 'Verify Liveness'}
        </Text>
      </Pressable>

      {result && (
        <View style={[
          styles.result,
          result.verdict === 'live' ? styles.live : styles.fake
        ]}>
          <Text style={styles.verdict}>{result.verdict}</Text>
        </View>
      )}
    </View>
  );
}

const styles = StyleSheet.create({
  container: { flex: 1, backgroundColor: '#000' },
  camera: { flex: 1 },
  overlay: {
    ...StyleSheet.absoluteFillObject,
    justifyContent: 'center',
    alignItems: 'center',
  },
  faceGuide: {
    width: 250,
    height: 320,
    borderWidth: 2,
    borderColor: 'rgba(255, 255, 255, 0.6)',
    borderRadius: 125,
  },
  button: {
    backgroundColor: '#3b82f6',
    padding: 16,
    margin: 16,
    borderRadius: 8,
    alignItems: 'center',
  },
  buttonDisabled: { opacity: 0.5 },
  buttonText: { color: '#fff', fontSize: 18, fontWeight: '600' },
  text: { color: '#fff', textAlign: 'center' },
  result: { padding: 16, margin: 16, borderRadius: 8, alignItems: 'center' },
  live: { backgroundColor: '#22c55e' },
  fake: { backgroundColor: '#ef4444' },
  verdict: { color: '#fff', fontSize: 20, fontWeight: 'bold', textTransform: 'uppercase' },
});

Installation

# For bare React Native projects
npm install react-native-vision-camera react-native-fs

# iOS
cd ios && pod install

Platform Configuration

iOS (Info.plist)

<key>NSCameraUsageDescription</key>
<string>We need camera access to verify your identity</string>

Android (AndroidManifest.xml)

<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" android:required="true" />
<uses-feature android:name="android.hardware.camera.front" android:required="true" />

Mobile Best Practices

  • Permission handling: Always check and request camera permissions before attempting to use the camera. Provide clear messaging about why permission is needed.
  • Face guide overlay: Display an oval or rectangle guide to help users position their face correctly within the camera frame.
  • Lighting guidance: Detect low-light conditions and prompt users to move to a better-lit area for more accurate results.
  • Memory management: Clean up temporary photo files after encoding to base64 to prevent storage bloat.
  • Front camera: Always use the front-facing camera (facing="front") for liveness verification.
  • Frame timing: Capture frames at ~100ms intervals to ensure natural movement is captured for accurate liveness detection.
  • Error recovery: Handle camera initialization failures gracefully, especially on devices with limited camera access.
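On frame timing: the capture loops above sleep a fixed 100 ms after each shot, so real spacing drifts by however long the capture itself takes. Scheduling against fixed deadlines keeps frames evenly spaced. A sketch, where captureOne is a placeholder for takePictureAsync or takePhoto:

```typescript
// Promise-based sleep helper.
const sleep = (ms: number) => new Promise<void>(resolve => setTimeout(resolve, ms));

// Capture `count` frames at `intervalMs` spacing, compensating for the time
// each capture itself takes so the frames stay evenly spaced.
async function captureAtInterval<T>(
  captureOne: () => Promise<T>, // placeholder for takePictureAsync / takePhoto
  count: number,
  intervalMs: number,
): Promise<{ frame: T; timestamp_ms: number }[]> {
  const start = Date.now();
  const frames: { frame: T; timestamp_ms: number }[] = [];
  for (let i = 0; i < count; i++) {
    const frame = await captureOne();
    // Record the real elapsed time rather than the nominal i * intervalMs.
    frames.push({ frame, timestamp_ms: Date.now() - start });
    // Wait only until the next deadline, not a fixed interval after capture ends.
    const nextDeadline = start + (i + 1) * intervalMs;
    const wait = nextDeadline - Date.now();
    if (wait > 0) await sleep(wait);
  }
  return frames;
}
```

Reporting real elapsed timestamps instead of nominal ones also gives the liveness model more honest motion data when a device's capture pipeline is slow.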