
VisionEmoji

Real-time AI-powered emoji overlay for iOS 🙈

Point your camera at the world and watch objects transform into emojis instantly. Powered by YOLO26 + CoreML with on-device inference.

iOS 26 Swift SwiftUI CoreML YOLO26 Vision
Download on the App Store

See It In Action

Real screenshots from VisionEmoji running on iPhone

Pet Detection — Shih-Tzu classified with emoji overlay
Room Scan — Dog, couch, and person detected
Kitchen — Knives, bottle, and bowl identified
Workspace — 9 objects tracked simultaneously
Debug Mode — Bounding boxes with confidence scores

Features

Everything runs on-device with zero cloud dependency

🚀

Real-Time Detection

YOLO26m processes frames at up to 60 FPS with Neural Engine acceleration on Apple Silicon.
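A detection pass like this is typically set up once through Vision with the Neural Engine enabled. The sketch below shows that wiring under stated assumptions: `yolo26m` stands in for the app's generated CoreML model class, whose actual name is not given here.

```swift
import CoreML
import Vision

// Sketch: build a reusable Vision request backed by a CoreML detector.
// `yolo26m` is an assumed name for the Xcode-generated model class.
func makeDetectionRequest() throws -> VNCoreMLRequest {
    let config = MLModelConfiguration()
    config.computeUnits = .all  // allow CPU, GPU, and Neural Engine

    let model = try yolo26m(configuration: config).model
    let vnModel = try VNCoreMLModel(for: model)

    let request = VNCoreMLRequest(model: vnModel) { request, _ in
        guard let results = request.results as? [VNRecognizedObjectObservation] else { return }
        // Each observation carries a normalized bounding box plus ranked class labels.
        for object in results {
            print(object.labels.first?.identifier ?? "?", object.boundingBox)
        }
    }
    request.imageCropAndScaleOption = .scaleFill
    return request
}
```

Per-frame, the same request would then be run against each camera buffer via `VNImageRequestHandler`.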

🎯

Dual-Model Pipeline

YOLO26m detection + per-object crop classification blends COCO and ImageNet labels.
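One plausible blending rule (an assumption for illustration, not the app's documented logic) is to prefer the fine-grained ImageNet label from the per-crop classifier when it is confident, and fall back to the coarser COCO label otherwise:

```swift
// Hypothetical label-blending heuristic; names and threshold are illustrative.
struct Labeled {
    let label: String
    let confidence: Float
}

func blendLabels(coco: Labeled, imageNet: Labeled?, threshold: Float = 0.5) -> String {
    if let fine = imageNet, fine.confidence >= threshold {
        return fine.label  // e.g. "Shih-Tzu" instead of COCO's generic "dog"
    }
    return coco.label
}
```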

🔒

100% On-Device

All inference runs locally via CoreML. No data ever leaves your iPhone.

⚙️

Configurable

Adjust FPS, emoji scale, confidence thresholds, Kalman filter parameters, and label priority.
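Those knobs map naturally onto a single settings model. The field names and defaults below are illustrative assumptions, not the app's actual values:

```swift
// Sketch of a settings model for the adjustable parameters above (assumed names/defaults).
struct DetectionSettings {
    var targetFPS: Int = 30                   // inference frame-rate cap
    var emojiScale: Double = 1.0              // overlay size multiplier
    var confidenceThreshold: Float = 0.4      // drop detections scoring below this
    var kalmanProcessNoise: Double = 1e-2     // higher = overlays react faster
    var kalmanMeasurementNoise: Double = 1e-1 // higher = overlays smooth harder
    var preferImageNetLabels: Bool = true     // label-priority toggle
}
```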

📏

Kalman Filtering

Smooth position tracking with configurable process and measurement noise for stable overlays.
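A minimal 1-D version of such a filter is sketched below: one filter per coordinate, with the process noise (Q) and measurement noise (R) as the two tunables mentioned above. The noise defaults are illustrative; a production tracker would likely use a 2-D state with velocity.

```swift
// Minimal 1-D Kalman filter sketch for smoothing one overlay coordinate.
struct KalmanFilter1D {
    var estimate: Double          // current smoothed position
    var errorCovariance: Double   // uncertainty of the estimate
    let processNoise: Double      // Q: expected motion between frames
    let measurementNoise: Double  // R: noise of each raw detection

    init(initial: Double, processNoise: Double = 1e-2, measurementNoise: Double = 1e-1) {
        self.estimate = initial
        self.errorCovariance = 1.0
        self.processNoise = processNoise
        self.measurementNoise = measurementNoise
    }

    mutating func update(measurement: Double) -> Double {
        // Predict: position assumed constant; uncertainty grows by Q.
        errorCovariance += processNoise
        // Update: blend prediction and measurement by the Kalman gain.
        let gain = errorCovariance / (errorCovariance + measurementNoise)
        estimate += gain * (measurement - estimate)
        errorCovariance *= (1 - gain)
        return estimate
    }
}
```

Raising Q makes overlays track fast motion more closely; raising R damps detection jitter at the cost of lag.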

🍎

Native Apple Emojis

Uses Apple's built-in emoji set rendered via NSAttributedString with NSCache optimization.
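A caching renderer along those lines might look like this sketch, assuming the app's described approach of drawing the system emoji font through NSAttributedString; the class and key format are illustrative:

```swift
import UIKit

// Sketch: render an emoji glyph to a UIImage once, then serve it from NSCache.
final class EmojiRenderer {
    private let cache = NSCache<NSString, UIImage>()

    func image(for emoji: String, pointSize: CGFloat) -> UIImage {
        let key = "\(emoji)-\(pointSize)" as NSString
        if let cached = cache.object(forKey: key) { return cached }

        let attributed = NSAttributedString(
            string: emoji,
            attributes: [.font: UIFont.systemFont(ofSize: pointSize)]
        )
        let size = attributed.size()
        let image = UIGraphicsImageRenderer(size: size).image { _ in
            attributed.draw(at: .zero)
        }
        cache.setObject(image, forKey: key)
        return image
    }
}
```

NSCache evicts entries automatically under memory pressure, so repeated overlays stay cheap without manual bookkeeping.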

Tech Stack

Swift SwiftUI CoreML Vision AVFoundation Combine YOLO26 coremltools