DeepCamera yolo-detection-2026-coral-tpu-macos

Google Coral Edge TPU — real-time object detection natively (macOS / Linux)

install
source · Clone the upstream repo
git clone https://github.com/SharpAI/DeepCamera
Claude Code · Install into ~/.claude/skills/
T=$(mktemp -d) && git clone --depth=1 https://github.com/SharpAI/DeepCamera "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/detection/yolo-detection-2026-coral-tpu-macos" ~/.claude/skills/sharpai-deepcamera-yolo-detection-2026-coral-tpu-macos && rm -rf "$T"
manifest: skills/detection/yolo-detection-2026-coral-tpu-macos/SKILL.md
source content

Coral TPU Object Detection

Real-time object detection running natively on the Google Coral Edge TPU accelerator attached to your local hardware. Detects 80 COCO classes (person, car, dog, cat, etc.) with ~4ms inference on 320x320 input.

Requirements

  • Python 3.9–3.13

How It Works

┌─────────────────────────────────────────────────────┐
│ Host (Aegis-AI)                                     │
│   frame.jpg → /tmp/aegis_detection/                 │
│   stdin  ──→ ┌──────────────────────────────┐       │
│              │ Native Python Environment    │       │
│              │   detect.py                  │       │
│              │   ├─ loads _edgetpu.tflite   │       │
│              │   ├─ reads frame from disk   │       │
│              │   └─ runs inference on TPU   │       │
│   stdout ←── │   → JSONL detections         │       │
│              └──────────────────────────────┘       │
│   USB ──→ Native System USB / edgetpu drivers       │
└─────────────────────────────────────────────────────┘
  1. Aegis writes the camera frame JPEG to the shared /tmp/aegis_detection/ workspace
  2. Sends a frame event via stdin JSONL to the local Python instance
  3. detect.py invokes PyCoral and runs inference natively on the mapped USB Edge TPU
  4. Returns a detections event via stdout JSONL

Platform Setup

Linux

# Uses the official apt-get google-coral packages natively
./deploy.sh

macOS

# Downloads and installs the libedgetpu runtime framework inline
./deploy.sh

Important Deployment Notice: The updated deploy.sh script will halt execution and prompt for your sudo password in order to register the USB drivers (libedgetpu) system-wide. If you decline the prompt, it prints the exact terminal instructions for configuring them manually.

Performance

Input Size | Inference | On-chip | Notes
-----------|-----------|---------|--------------------------------------
320x320    | ~4ms      | 100%    | Fully on TPU, best for real-time
640x640    | ~20ms     | Partial | Some layers on CPU (model segmented)
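The latencies in the table imply an inference-only throughput ceiling: ~4ms per 320x320 frame is roughly 250 FPS, and ~20ms per 640x640 frame roughly 50 FPS, before JPEG decode and USB transfer overhead. A trivial helper (name `max_fps` is mine) makes the arithmetic explicit:

```python
def max_fps(inference_ms: float) -> float:
    """Upper-bound throughput implied by per-frame inference latency.
    Ignores JPEG decode, USB transfer, and pre/post-processing."""
    return 1000.0 / inference_ms

# 320x320 at ~4 ms  -> ~250 FPS ceiling
# 640x640 at ~20 ms -> ~50 FPS ceiling
```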

Cooling: The USB Accelerator's aluminum case acts as a heatsink. If it becomes too hot to touch during continuous inference, it will thermal-throttle. Consider active cooling or clock_speed: standard.

Protocol

Same JSONL as yolo-detection-2026:

Skill → Aegis (stdout)

{"event": "ready", "model": "yolo26n_edgetpu", "device": "coral", "format": "edgetpu_tflite", "tpu_count": 1, "classes": 80}
{"event": "detections", "frame_id": 42, "camera_id": "front_door", "objects": [{"class": "person", "confidence": 0.85, "bbox": [100, 50, 300, 400]}]}
{"event": "perf_stats", "total_frames": 50, "timings_ms": {"inference": {"avg": 4.1, "p50": 3.9, "p95": 5.2}}}

Bounding Box Format

[x_min, y_min, x_max, y_max] — pixel coordinates (xyxy).
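If a consumer expects the other common convention, [x, y, width, height], the conversion is a one-liner. A small sketch (the helper name xyxy_to_xywh is mine):

```python
def xyxy_to_xywh(bbox):
    """Convert [x_min, y_min, x_max, y_max] pixel coordinates
    to [x, y, width, height]."""
    x_min, y_min, x_max, y_max = bbox
    return [x_min, y_min, x_max - x_min, y_max - y_min]
```

For the sample detection above, [100, 50, 300, 400] becomes [100, 50, 200, 350].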

Installation

Linux / macOS

./deploy.sh

The deployer builds the local Python virtual environment and installs the Edge TPU runtime. No Docker required.