Zero-model glasses detection for the browser
Detect whether a person is wearing glasses using webcam video and facial landmarks. No AI models, no server, no dependencies — pure math.
npm install glassesjs
Enable your webcam to see real-time glasses detection with a confidence score and a per-method breakdown.
GlassesJS combines six independent detection methods, each returning a score from 0 to 100. The final confidence is a weighted average of the six scores.
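As a sketch of how six 0–100 scores could fold into one confidence value: the method names and weights below are hypothetical, chosen purely for illustration, and are not GlassesJS's actual internals.

```typescript
// Illustrative only: a weighted average over per-method scores.
// Method names and weights are hypothetical, not GlassesJS internals.
type MethodScores = Record<string, number>; // each score in 0..100

const weights: MethodScores = {
  bridgeEdges: 0.25,
  templeEdges: 0.2,
  irisVariance: 0.15,
  depthPlane: 0.15,
  contrast: 0.15,
  colorShift: 0.1,
};

function weightedConfidence(scores: MethodScores): number {
  let sum = 0;
  let totalWeight = 0;
  for (const [method, score] of Object.entries(scores)) {
    const w = weights[method] ?? 0; // unknown methods contribute nothing
    sum += w * score;
    totalWeight += w;
  }
  // Normalize by the weights actually present, so a missing method
  // does not drag the confidence toward zero.
  return totalWeight > 0 ? sum / totalWeight : 0;
}
```

Normalizing by the weights actually present means a method that could not run (e.g. iris tracking lost) simply drops out instead of counting as a zero.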
1. Horizontal Sobel edge detection on the nose-bridge area. Glasses frames create strong horizontal edges where they sit on the nose.
2. Vertical edge analysis at both temples. Glasses arms create symmetric vertical edge patterns on both sides of the face.
3. Iris position variance tracked over multiple frames. Lenses refract light, raising the variance of the detected iris position.
4. Z-coordinate discontinuities across eye landmarks. Glasses create a false plane in front of the face.
5. Pixel contrast in the eye region compared with the cheeks. Lenses alter local contrast through reflections and tinting.
6. Color sampling across the eye region compared with a skin baseline. Coated lenses shift the color temperature.
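To make the first method concrete, here is how a horizontal-edge score over a grayscale patch might be computed. The kernel and the magnitude averaging are standard Sobel machinery; the function itself is a sketch, not the library's implementation.

```typescript
// Sketch of the bridge-edge idea: convolve a Sobel kernel that responds
// to horizontal edges over a grayscale patch and average the magnitudes.
// Illustrative only; not GlassesJS's actual implementation.
const SOBEL_Y = [
  [-1, -2, -1],
  [ 0,  0,  0],
  [ 1,  2,  1],
];

// patch: row-major 2-D array of grayscale values (0..255),
// e.g. sampled from the nose-bridge region of a canvas
function horizontalEdgeStrength(patch: number[][]): number {
  const h = patch.length;
  const w = patch[0].length;
  let total = 0;
  let count = 0;
  for (let y = 1; y < h - 1; y++) {
    for (let x = 1; x < w - 1; x++) {
      let gy = 0;
      for (let ky = -1; ky <= 1; ky++) {
        for (let kx = -1; kx <= 1; kx++) {
          gy += SOBEL_Y[ky + 1][kx + 1] * patch[y + ky][x + kx];
        }
      }
      total += Math.abs(gy); // strong |gy| = strong horizontal edge
      count++;
    }
  }
  return count > 0 ? total / count : 0;
}
```

A bare face gives a low score here; a frame resting on the nose produces a sharp dark-to-light transition and a high one.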
You already have MediaPipe landmarks — just pass them in.
import { GlassesDetector } from 'glassesjs';
const detector = new GlassesDetector();
const result = detector.detect(canvas, faceLandmarks);
console.log(result.hasGlasses); // true / false
console.log(result.confidence); // 0–100
console.log(result.methods); // per-method breakdown
const detector = new GlassesDetector({
  frameBuffer: 30,
  confidenceThreshold: 70,
});
// In your detection loop, every frame:
detector.addFrame(canvas, faceLandmarks);
// After 10+ frames:
const result = detector.getResult();
// When user changes:
detector.reset();
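When the confidence hovers near the threshold, the boolean result can flicker from frame to frame. A small hysteresis wrapper keeps the reported state stable; this is a hypothetical helper sketched here, not part of GlassesJS.

```typescript
// Hypothetical helper (not part of GlassesJS): the reported state only
// flips on when confidence rises past onAt, and only flips off again
// when it falls below offAt, suppressing flicker near the boundary.
class HysteresisGate {
  private state = false;

  constructor(private onAt = 75, private offAt = 60) {}

  update(confidence: number): boolean {
    if (!this.state && confidence >= this.onAt) this.state = true;
    else if (this.state && confidence <= this.offAt) this.state = false;
    return this.state;
  }
}
```

Feeding `result.confidence` from each `getResult()` call through `update()` yields a glasses-on/off signal that only changes on a decisive move past the band.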
The library handles MediaPipe internally; just provide a video element.
import { StandaloneDetector } from 'glassesjs/standalone';
const detector = await StandaloneDetector.create({
  video: myVideoElement,
  framesForResult: 30,
  confidenceThreshold: 70,
});
// One-shot — accumulates 30 frames, returns result:
const result = await detector.detectOnce();
console.log(result.hasGlasses, result.confidence);
const detector = await StandaloneDetector.create({
  video: myVideoElement,
  interval: 5000, // evaluate every 5 seconds
});
// Start — callback fires every 5 seconds:
detector.start((result) => {
  console.log(result.hasGlasses, result.confidence);
});
// Stop when done:
detector.stop();
// Cleanup:
detector.destroy();