Types & Constants¶
The @moveris/shared package exports all TypeScript types, constants, and utility functions used across the SDK. This page covers the most important exports.
In plain terms
These are the data shapes your code uses when talking to the API: frame format, response structure, verdict values, etc. Use them for type safety and autocomplete in TypeScript.
Optional parameters
Properties followed by `?` (for example, `model?`) are optional: you can omit them when creating objects, and the API applies its defaults. For instance, `model` and `frame_count` in `FastCheckRequest` can be left out.
Import¶
import type {
FrameData,
CropData,
CapturedFrame,
LivenessResult,
FastCheckRequest,
FastCheckResponse,
// ... etc
} from '@moveris/shared';
Request & Response Types¶
FrameData¶
A single video frame for the API.
interface FrameData {
index: number; // Frame index (0-based)
timestamp_ms: number; // Timestamp in milliseconds
pixels: string; // Base64-encoded PNG image data
}
CropData¶
A pre-cropped 224x224 face image.
interface CropData {
index: number; // Frame index (0-based)
timestamp_ms: number; // Timestamp in milliseconds
crop: string; // Base64-encoded 224x224 PNG crop
}
CapturedFrame¶
Internal representation of a captured frame (used by hooks before converting to FrameData). The SDK uses timestampMs (camelCase); the API payload uses timestamp_ms (snake_case). The toFrameData() helper converts automatically.
interface CapturedFrame {
index: number;
timestampMs: number; // API uses timestamp_ms
pixels: string;
width: number;
height: number;
}
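The SDK ships a `toFrameData()` helper for this conversion. The sketch below reconstructs the mapping from the two shapes documented on this page (it is not the actual implementation): rename `timestampMs` to `timestamp_ms` and drop the dimensions, which the API payload does not carry.

```typescript
// Local copies of the shapes above, so this sketch is self-contained.
interface CapturedFrame {
  index: number;
  timestampMs: number;
  pixels: string;
  width: number;
  height: number;
}

interface FrameData {
  index: number;
  timestamp_ms: number;
  pixels: string;
}

// Sketch of the toFrameData() mapping: camelCase -> snake_case,
// width/height omitted.
function toFrameData(frame: CapturedFrame): FrameData {
  return {
    index: frame.index,
    timestamp_ms: frame.timestampMs,
    pixels: frame.pixels,
  };
}
```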
FastCheckRequest¶
interface FastCheckRequest {
session_id: string;
source: FrameSource;
model?: FastCheckModel;
frame_count?: number; // Optional. For v2 resolution with X-Model-Version header (10, 30, 60, 90, or 120). API 2.0+.
frames: FrameData[];
warnings?: string[]; // Optional. Capture warnings from frontend (API echoes in response)
}
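A minimal request omits the optional fields, letting the API fall back to its defaults. The example below inlines local type copies so it is self-contained; the session id and pixel payload are placeholders.

```typescript
// Local type copies of the shapes above.
type FrameSource = 'live' | 'media';
interface FrameData { index: number; timestamp_ms: number; pixels: string }
interface FastCheckRequest {
  session_id: string;
  source: FrameSource;
  model?: string;
  frame_count?: number;
  frames: FrameData[];
  warnings?: string[];
}

// model and frame_count omitted: the API applies its defaults.
const request: FastCheckRequest = {
  session_id: 'sess_123', // hypothetical session id
  source: 'live',
  frames: [{ index: 0, timestamp_ms: 0, pixels: '<base64 png>' }],
};
```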
FastCheckResponse¶
The confidence field is reserved for future use and is functionally identical to real_score. Use real_score for decision-making.
interface FastCheckResponse {
session_id: string;
verdict: 'live' | 'fake';
confidence: number; // Reserved for future use; use real_score for decision-making
score: number;
real_score: number; // Use this for decision-making
processing_ms: number;
frames_processed: number;
model: string;
warning?: string; // Single warning message
warnings?: string[]; // Aggregated capture warnings from frontend (API 1.11+)
}
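One way to consume a response, following the guidance above: use `verdict` for the accept/reject decision, report `real_score` as the supporting signal, and surface any aggregated capture warnings. The trimmed local interface and the formatting are illustrative, not part of the SDK.

```typescript
// Response shape as documented above, trimmed to the fields used here.
interface FastCheckResponse {
  session_id: string;
  verdict: 'live' | 'fake';
  real_score: number;
  warnings?: string[];
}

// Summarize a response for logging: verdict plus real_score, with a
// warning count when the frontend reported capture warnings.
function describe(res: FastCheckResponse): string {
  const warn = res.warnings?.length ? ` (${res.warnings.length} warning(s))` : '';
  return `${res.session_id}: ${res.verdict} [real_score=${res.real_score.toFixed(2)}]${warn}`;
}
```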
FastCheckStreamRequest¶
interface FastCheckStreamRequest {
session_id: string;
source: FrameSource;
model?: FastCheckModel;
frame_count?: number; // Optional. For v2 resolution with X-Model-Version header. Must be consistent across all requests. API 2.0+.
frame: FrameData; // Single frame per request
warnings?: string[]; // Optional. Per-frame capture warnings (API 1.11+)
}
FastCheckStreamResponse¶
interface FastCheckStreamResponse {
status: 'buffering' | 'complete';
session_id: string;
frames_received: number;
frames_required: number;
ttl_seconds: number;
warnings?: string[]; // Per-frame (buffering) or aggregated (complete)
// Present when status is 'complete':
verdict?: 'live' | 'fake';
confidence?: number; // Reserved for future use; use real_score for decision-making
real_score?: number; // Use this for decision-making
score?: number;
model?: string;
processing_ms?: number;
frames_processed?: number;
available?: boolean;
warning?: string;
error?: string;
}
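The streaming flow implied by this shape: keep sending frames while `status` is `'buffering'`, and read the result fields only once `status` is `'complete'`. A minimal per-response handler, with the interface trimmed locally to the fields used:

```typescript
// Trimmed local copy of the shape above.
interface FastCheckStreamResponse {
  status: 'buffering' | 'complete';
  frames_received: number;
  frames_required: number;
  verdict?: 'live' | 'fake';
  error?: string;
}

// While buffering, report progress; on completion, check error before
// reading the verdict, since result fields are optional.
function handleStreamResponse(res: FastCheckStreamResponse): string {
  if (res.status === 'buffering') {
    return `buffering ${res.frames_received}/${res.frames_required}`;
  }
  if (res.error) {
    return `error: ${res.error}`;
  }
  return `verdict: ${res.verdict}`;
}
```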
LivenessResult¶
Unified result type used by SDK hooks. Use realScore (or real_score from the API) for decision-making; confidence is reserved for future use.
interface LivenessResult {
verdict: 'live' | 'fake';
confidence: number; // Reserved for future use; use realScore for decision-making
score: number;
realScore: number; // Use this for decision-making
sessionId: string;
processingMs: number;
framesProcessed: number;
model: string;
warning?: string;
warnings?: string[]; // Aggregated capture warnings (API 1.11+)
}
HybridCheckRequest¶
interface HybridCheckRequest {
session_id: string;
source: FrameSource;
frames: HybridFrameData[];
fps?: number;
model?: string;
}
HybridCheckResponse¶
interface HybridCheckResponse {
session_id: string;
verdict: 'live' | 'fake';
confidence: number; // Reserved for future use; use real_score for decision-making
score: number;
real_score: number; // Use this for decision-making
visual_score?: number;
physio_extracted?: boolean;
processing_ms: number;
frames_processed: number;
model: string;
}
Other Response Types¶
interface HealthResponse {
status: string;
version: string;
models_available: string[];
}
interface JobStatusResponse {
job_id: string;
status: 'pending' | 'processing' | 'completed' | 'failed';
result?: FastCheckResponse;
}
interface QueueStatsResponse {
pending: number;
processing: number;
completed: number;
}
interface ErrorResponse {
error: string;
message: string;
required?: number;
received?: number;
required_scope?: string; // Present when error is insufficient_scope (API 1.10+)
}
Common error codes
invalid_key (401), insufficient_credits or account_suspended (402), insufficient_scope (403), insufficient_frames (400). See Errors.
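One way to branch on the documented error codes is sketched below. Treating `insufficient_frames` as retryable (capture more frames and resend) while the auth and billing errors are terminal is an application-level choice, not something the API mandates.

```typescript
// ErrorResponse shape from above, trimmed.
interface ErrorResponse {
  error: string;
  message: string;
  required?: number;
  received?: number;
  required_scope?: string;
}

// Classify documented error codes; unknown codes default to non-retryable.
function isRetryable(err: ErrorResponse): boolean {
  switch (err.error) {
    case 'insufficient_frames':  // 400: capture more frames, resend
      return true;
    case 'invalid_key':          // 401
    case 'insufficient_credits': // 402
    case 'account_suspended':    // 402
    case 'insufficient_scope':   // 403
      return false;
    default:
      return false;
  }
}
```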
Enums & Literals¶
FrameSource¶
"live"-- Captured from a live camera feed"media"-- From a recorded video or uploaded file
FastCheckModel¶
type FastCheckModel = '10' | 'mixed-10-v2' | 'mixed-30-v2' | 'mixed-60-v2' | 'mixed-90-v2' | 'mixed-120-v2' | string;
The frame count must be at least the model's min_frames. For predictable latency, send exactly that many. Use getModels() or useModels to fetch the available models dynamically.
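A client-side pre-check against the minimum can fail fast before capture. The `ModelInfo` shape below is hypothetical (the real `getModels()` response may differ), assumed only to expose `min_frames` per model as referenced above.

```typescript
// Hypothetical model entry; assumed shape, not the documented API response.
interface ModelInfo { id: string; min_frames: number }

// A planned capture is valid when it meets the model's minimum frame count.
function meetsMinFrames(model: ModelInfo, frameCount: number): boolean {
  return frameCount >= model.min_frames;
}
```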
Constants¶
API Endpoints¶
import { API_ENDPOINTS } from '@moveris/shared';
API_ENDPOINTS.production // 'https://api.moveris.com'
API_ENDPOINTS.staging // 'https://staging.api.moveris.com'
API_ENDPOINTS.development // 'http://localhost:8001'
API Paths¶
import { API_PATHS } from '@moveris/shared';
API_PATHS.HEALTH // '/health'
API_PATHS.FAST_CHECK // '/api/v1/fast-check'
API_PATHS.FAST_CHECK_CROPS // '/api/v1/fast-check-crops'
API_PATHS.FAST_CHECK_STREAM // '/api/v1/fast-check-stream'
API_PATHS.VERIFY // '/api/v1/verify'
API_PATHS.HYBRID_CHECK // '/api/v1/hybrid-check'
API_PATHS.HYBRID_50 // '/api/v1/hybrid-50'
API_PATHS.HYBRID_150 // '/api/v1/hybrid-150'
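A full request URL is simply a base endpoint joined with a path. The constants are inlined below (with the values listed above) so the example is self-contained; in real code, import `API_ENDPOINTS` and `API_PATHS` from `@moveris/shared`.

```typescript
// Inlined constant values from the listings above.
const API_ENDPOINTS = { production: 'https://api.moveris.com' } as const;
const API_PATHS = { FAST_CHECK: '/api/v1/fast-check' } as const;

// Full request URL = base endpoint + path.
const url = `${API_ENDPOINTS.production}${API_PATHS.FAST_CHECK}`;
// 'https://api.moveris.com/api/v1/fast-check'
```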
Frame Configuration¶
import { FRAME_CONFIG } from '@moveris/shared';
FRAME_CONFIG.FAST_CHECK_FRAMES // 10
FRAME_CONFIG.DEFAULT_FPS // 10
FRAME_CONFIG.DEFAULT_WIDTH // 640
FRAME_CONFIG.DEFAULT_HEIGHT // 480
FRAME_CONFIG.CROP_SIZE // 224
Retry Configuration¶
import { RETRY_CONFIG } from '@moveris/shared';
RETRY_CONFIG.MAX_ATTEMPTS // 3
RETRY_CONFIG.INITIAL_DELAY // 1000 (ms)
RETRY_CONFIG.MAX_DELAY // 10000 (ms)
Frame Buffer Configuration¶
import { FRAME_BUFFER_CONFIG } from '@moveris/shared';
FRAME_BUFFER_CONFIG.MAX_FRAMES // Maximum frames to buffer
FRAME_BUFFER_CONFIG.FLUSH_INTERVAL // Flush interval in ms
Feedback System¶
The SDK includes a built-in feedback message system for guiding users during face capture.
Feedback Messages¶
import {
FEEDBACK_MESSAGES,
DEFAULT_STATUS_MESSAGES,
getFeedbackMessage,
getStatusMessage,
} from '@moveris/shared';
getFeedbackMessage(key, locale?)¶
Returns a localized feedback message for the user:
getFeedbackMessage('face_not_detected');
// "No face detected. Please position your face in the oval guide."
getFeedbackMessage('face_too_far', 'es');
// Spanish translation
Common feedback keys:
| Key | Description |
|---|---|
| `face_not_detected` | No face found in frame |
| `face_too_far` | Face is too far from camera |
| `face_too_close` | Face is too close to camera |
| `face_not_centered` | Face is not within the oval guide |
| `poor_lighting` | Lighting conditions are insufficient |
| `too_much_blur` | Frame is too blurry |
| `hold_still` | User needs to hold still |
| `capturing` | Frames are being captured |
| `processing` | Frames are being processed |
Locales¶
- `DEFAULT_LOCALE` -- English messages
- `ES_LOCALE` -- Spanish messages
Oval Guide Helpers¶
For rendering face guide overlays:
import {
OVAL_GUIDE_COLORS,
getOvalGuideState,
getCaptureQualityFeedback,
canCaptureFrame,
} from '@moveris/shared';
- `getOvalGuideState(detectionResult)` -- Returns the visual state of the oval guide (`"searching"`, `"aligning"`, `"ready"`, `"capturing"`)
- `getCaptureQualityFeedback(detectionResult)` -- Returns quality feedback for the current frame
- `canCaptureFrame(detectionResult)` -- Returns `true` if the current frame meets quality thresholds
Detection Types¶
Types used by the face detection system:
interface DetectionResult {
faceDetected: boolean;
boundingBox?: BoundingBox;
landmarks?: FaceLandmarks;
headPose?: HeadPose;
quality?: FrameQuality;
}
interface DetectionSummary {
totalFrames: number;
facesDetected: number;
averageQuality: number;
}
interface HeadPose {
pitch: number; // Up/down rotation
yaw: number; // Left/right rotation
roll: number; // Tilt
}
interface GazeThresholds {
maxPitch: number;
maxYaw: number;
maxRoll: number;
}
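The pairing between `HeadPose` and `GazeThresholds` suggested by the field names is that each rotation axis must stay within plus or minus its maximum. A sketch under that assumption (the SDK's internal check may differ):

```typescript
// Shapes from above.
interface HeadPose { pitch: number; yaw: number; roll: number }
interface GazeThresholds { maxPitch: number; maxYaw: number; maxRoll: number }

// Pose passes when every axis is within +/- its threshold.
function isFacingCamera(pose: HeadPose, limits: GazeThresholds): boolean {
  return (
    Math.abs(pose.pitch) <= limits.maxPitch &&
    Math.abs(pose.yaw) <= limits.maxYaw &&
    Math.abs(pose.roll) <= limits.maxRoll
  );
}
```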
Utility Functions¶
Validators¶
Encoders¶
Retry¶
import { retryWithBackoff } from '@moveris/shared';
// Retry any async function with exponential backoff
const result = await retryWithBackoff(
() => someAsyncOperation(),
{ maxAttempts: 3, initialDelay: 1000, maxDelay: 10000 }
);
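For intuition, a typical exponential-backoff delay schedule matching the `RETRY_CONFIG` defaults above doubles the delay each attempt, starting at `initialDelay` and capped at `maxDelay`. This is an illustrative sketch; the SDK's exact schedule (e.g. whether it adds jitter) may differ.

```typescript
// Delay before retry attempt n (0-based): initialDelay * 2^n, capped at maxDelay.
function backoffDelay(attempt: number, initialDelay = 1000, maxDelay = 10000): number {
  return Math.min(initialDelay * 2 ** attempt, maxDelay);
}

// attempt 0 -> 1000 ms, attempt 1 -> 2000 ms, attempt 2 -> 4000 ms,
// attempt 4 and beyond -> capped at 10000 ms
```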
Frame Analysis¶
import {
isFaceFullyVisible,
isFaceInOval,
calculateAdaptiveCropMultiplier,
calculateFaceCropRegion,
validateFaceLandmarks,
} from '@moveris/shared';
These functions are used internally by the SDK hooks but can be used directly for custom implementations.