# JavaScript Examples

Complete JavaScript examples for integrating the Moveris API (v2).
!!! note "In plain terms"

    Copy these code snippets to add liveness detection to your web app. They show how to capture frames from a video element, send them to the API, and display the result. The frame count depends on the model (e.g. 10 for `mixed-10-v2`, 30 for `mixed-30-v2`). Replace the placeholder API key with your own (preferably via a backend proxy).
!!! info "Moveris API (v2)"

    These examples use the Moveris API (v2) at `https://api.moveris.com`.
!!! info "Model selection"

    These examples use the v1 flow (`model` in the request body). For v2 model resolution, send the `X-Model-Version: latest` header with `frame_count: 10|30|60|90|120` in the body. See Model Versioning & Frames.
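The v2 flow can be sketched as a small request builder. The header, endpoint, and body fields come from the note above; the helper name `buildV2Request` is ours, not part of the API:

```javascript
// Build a v2-style fast-check request: the model is resolved server-side
// from the X-Model-Version header plus frame_count, instead of a `model`
// field in the body. `buildV2Request` is an illustrative helper name.
function buildV2Request(apiKey, sessionId, frames) {
  return {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-API-Key': apiKey,
      'X-Model-Version': 'latest'
    },
    body: JSON.stringify({
      session_id: sessionId,
      source: 'live',
      frame_count: frames.length, // must be one of 10|30|60|90|120
      frames: frames
    })
  };
}

// Usage:
// fetch('https://api.moveris.com/api/v1/fast-check',
//       buildV2Request('sk-your-api-key', crypto.randomUUID(), frames));
```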
## REST API Examples
```javascript
async function checkLiveness(videoElement) {
  const frames = [];

  // Capture frames at ~10 FPS (count = model min_frames, e.g. 10 for mixed-10-v2)
  const frameCount = 10; // Use getModels() for dynamic model selection

  for (let i = 0; i < frameCount; i++) {
    const canvas = document.createElement('canvas');
    canvas.width = 640;
    canvas.height = 480;
    const ctx = canvas.getContext('2d');
    ctx.drawImage(videoElement, 0, 0, 640, 480);

    frames.push({
      index: i,
      timestamp_ms: performance.now(),
      pixels: canvas.toDataURL('image/png').split(',')[1]
    });

    // Wait ~100ms between frames (10 FPS)
    await new Promise(r => setTimeout(r, 100));
  }

  const response = await fetch('https://api.moveris.com/api/v1/fast-check', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-API-Key': 'sk-your-api-key'
    },
    body: JSON.stringify({
      session_id: crypto.randomUUID(),
      source: 'live',
      model: 'mixed-10-v2',
      frames: frames
    })
  });

  const body = await response.json();
  if (!body.success) {
    throw new Error(body.message ?? 'Request failed');
  }
  return body.data;
}
```
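The `getModels()` helper mentioned in the comment above is not defined in these examples. A plausible sketch follows; note that the `/api/v1/models` endpoint and response shape are assumptions on our part (check the API reference for the actual listing endpoint), while `pickModel` is a pure helper of our own for choosing a model under a capture-time budget:

```javascript
// Sketch of a getModels() helper for dynamic model selection.
// ASSUMPTION: the /api/v1/models endpoint and its response shape are
// guesses; consult the Moveris API reference for the real endpoint.
async function getModels(apiKey) {
  const response = await fetch('https://api.moveris.com/api/v1/models', {
    headers: { 'X-API-Key': apiKey }
  });
  const body = await response.json();
  if (!body.success) throw new Error(body.message ?? 'Request failed');
  return body.data; // e.g. [{ name: 'mixed-10-v2', min_frames: 10 }, ...]
}

// Pick the largest model whose frames can be captured within `budgetMs`
// at ~10 FPS (100 ms per frame). Pure function, so it is easy to test.
function pickModel(models, budgetMs) {
  const affordable = models.filter(m => m.min_frames * 100 <= budgetMs);
  affordable.sort((a, b) => b.min_frames - a.min_frames);
  return affordable[0] ?? null;
}
```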
The same request with TypeScript types:

```typescript
interface Frame {
  index: number;
  timestamp_ms: number;
  pixels: string; // Base64
}

interface LivenessResult {
  verdict: "live" | "fake";
  real_score: number;
  score: number;
  session_id: string;
  processing_ms: number;
}

async function checkLiveness(
  frames: Frame[],
  sessionId: string
): Promise<LivenessResult> {
  const response = await fetch(
    'https://api.moveris.com/api/v1/fast-check',
    {
      method: "POST",
      headers: {
        "X-API-Key": "sk-your-api-key",
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        session_id: sessionId,
        source: "live",
        // v1 flow:
        model: "mixed-10-v2",
        // v2 flow (alternative):
        // frame_count: 10,
        frames,
      }),
    }
  );

  const body = await response.json();
  if (!body.success) {
    throw new Error(body.message ?? 'Request failed');
  }
  return body.data as LivenessResult;
}

// Usage
const result = await checkLiveness(frames, crypto.randomUUID());
console.log(`Verdict: ${result.verdict}, Score: ${result.score}`);
```
## With Crops (Faster)
!!! info "Dependencies"

    This example requires the `@mediapipe/tasks-vision` package for face detection.
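The `@mediapipe/tasks-vision` package is available from npm:

```shell
npm install @mediapipe/tasks-vision
```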
```javascript
import { FaceDetector, FilesetResolver } from '@mediapipe/tasks-vision';

async function checkLivenessWithCrops(videoElement) {
  // Initialize MediaPipe Face Detector
  const vision = await FilesetResolver.forVisionTasks(
    'https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm'
  );
  const faceDetector = await FaceDetector.createFromOptions(vision, {
    baseOptions: {
      modelAssetPath: 'https://storage.googleapis.com/mediapipe-models/face_detector/blaze_face_short_range/float16/1/blaze_face_short_range.tflite'
    },
    runningMode: 'VIDEO'
  });

  const crops = [];

  // Capture frames (count = model min_frames, e.g. 10 for mixed-10-v2)
  const frameCount = 10;

  for (let i = 0; i < frameCount; i++) {
    // Capture frame
    const canvas = document.createElement('canvas');
    canvas.width = videoElement.videoWidth;
    canvas.height = videoElement.videoHeight;
    const ctx = canvas.getContext('2d');
    ctx.drawImage(videoElement, 0, 0);

    // Detect face
    const detections = faceDetector.detectForVideo(canvas, performance.now());
    if (detections.detections.length > 0) {
      const face = detections.detections[0].boundingBox;

      // Expand to 3x face size
      const centerX = face.originX + face.width / 2;
      const centerY = face.originY + face.height / 2;
      const size = Math.max(face.width, face.height) * 3;

      // Crop and resize to 224x224
      const cropCanvas = document.createElement('canvas');
      cropCanvas.width = 224;
      cropCanvas.height = 224;
      const cropCtx = cropCanvas.getContext('2d');
      cropCtx.drawImage(
        canvas,
        centerX - size / 2, centerY - size / 2, size, size,
        0, 0, 224, 224
      );

      crops.push({
        index: i,
        pixels: cropCanvas.toDataURL('image/png').split(',')[1]
      });
    }

    await new Promise(r => setTimeout(r, 100));
  }

  // Note: frames with no detected face are skipped above, so check that
  // crops.length still matches the model's required frame count before sending.
  const response = await fetch('https://api.moveris.com/api/v1/fast-check-crops', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-API-Key': 'sk-your-api-key'
    },
    body: JSON.stringify({
      session_id: crypto.randomUUID(),
      source: 'live',
      model: 'mixed-10-v2',
      crops: crops
    })
  });

  const body = await response.json();
  if (!body.success) {
    throw new Error(body.message ?? 'Request failed');
  }
  return body.data;
}
```
## Tips
- Use `crypto.randomUUID()` to generate unique session IDs
- PNG encoding is recommended for better accuracy
- Frame count must match the model (e.g. 10 for `mixed-10-v2`, 30 for `mixed-30-v2`). Capture at ~10 FPS.
- Always make API calls from your server to protect your API key
- Set `source: "live"` when capturing from a live camera