DeepCamera yolo-detection-2026-coral-tpu-macos
Google Coral Edge TPU — real-time object detection natively (macOS / Linux)
install
source · Clone the upstream repo

```shell
git clone https://github.com/SharpAI/DeepCamera
```

Claude Code · Install into ~/.claude/skills/

```shell
T=$(mktemp -d) && git clone --depth=1 https://github.com/SharpAI/DeepCamera "$T" && mkdir -p ~/.claude/skills && cp -r "$T/skills/detection/yolo-detection-2026-coral-tpu-macos" ~/.claude/skills/sharpai-deepcamera-yolo-detection-2026-coral-tpu-macos && rm -rf "$T"
```
manifest: skills/detection/yolo-detection-2026-coral-tpu-macos/SKILL.md
Coral TPU Object Detection
Real-time object detection natively utilizing the Google Coral Edge TPU accelerator on your local hardware. Detects 80 COCO classes (person, car, dog, cat, etc.) with ~4ms inference on 320x320 input.
Requirements
- Python 3.9–3.13
How It Works
```
┌─────────────────────────────────────────────────────┐
│ Host (Aegis-AI)                                     │
│ frame.jpg → /tmp/aegis_detection/                   │
│ stdin ──→ ┌──────────────────────────────┐          │
│           │ Native Python Environment    │          │
│           │ detect.py                    │          │
│           │ ├─ loads _edgetpu.tflite     │          │
│           │ ├─ reads frame from disk     │          │
│           │ └─ runs inference on TPU     │          │
│ stdout ←──│ → JSONL detections           │          │
│           └──────────────────────────────┘          │
│ USB ──→ Native System USB / edgetpu drivers         │
└─────────────────────────────────────────────────────┘
```
- Aegis writes the camera frame JPEG to the shared workspace /tmp/aegis_detection/
- Aegis sends a frame event via stdin JSONL to the local Python instance
- detect.py invokes PyCoral and executes natively on the mapped USB Edge TPU
- detect.py returns a detections event via stdout JSONL
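The host side of the first two steps can be sketched as follows. This is a minimal illustration, not the skill's actual implementation: the source only documents the skill's stdout events, so the stdin frame-event field names (`path`, etc.) and the filename pattern here are assumptions.

```python
import json
import pathlib

# Shared workspace documented above.
WORKDIR = pathlib.Path("/tmp/aegis_detection")

def build_frame_event(frame_id, camera_id, jpeg_bytes):
    """Write the JPEG to the shared workspace and build one stdin JSONL line.

    Hypothetical event shape -- the real stdin schema is not shown in this doc.
    """
    WORKDIR.mkdir(parents=True, exist_ok=True)
    frame_path = WORKDIR / f"{camera_id}_{frame_id}.jpg"
    frame_path.write_bytes(jpeg_bytes)
    event = {
        "event": "frame",
        "frame_id": frame_id,
        "camera_id": camera_id,
        "path": str(frame_path),
    }
    return json.dumps(event) + "\n"  # JSONL: one JSON object per line

# Two-byte SOI/EOI stub stands in for a real camera JPEG.
line = build_frame_event(42, "front_door", b"\xff\xd8\xff\xd9")
print(line.strip())
```

The line would then be written to the detector subprocess's stdin, and detections read back from its stdout.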
Platform Setup
Linux
```shell
# Uses the official apt-get google-coral packages natively
./deploy.sh
```
macOS
```shell
# Downloads and installs the libedgetpu OS payload framework inline
./deploy.sh
```
Important Deployment Notice: The updated deploy.sh script will halt execution and prompt you securely for your OS password (sudo) to register the libedgetpu USB drivers system-wide. If you decline the prompt, it prints the exact terminal instructions for configuring them manually.
Performance
| Input Size | Inference | On-chip | Notes |
|---|---|---|---|
| 320x320 | ~4ms | 100% | Fully on TPU, best for real-time |
| 640x640 | ~20ms | Partial | Some layers on CPU (model segmented) |
Cooling: The USB Accelerator's aluminum case acts as a heatsink. If it is too hot to touch during continuous inference, it will thermal-throttle. Consider active cooling or the reduced-frequency runtime (clock_speed: standard).
Protocol
Same JSONL protocol as yolo-detection-2026:
Skill → Aegis (stdout)
```json
{"event": "ready", "model": "yolo26n_edgetpu", "device": "coral", "format": "edgetpu_tflite", "tpu_count": 1, "classes": 80}
{"event": "detections", "frame_id": 42, "camera_id": "front_door", "objects": [{"class": "person", "confidence": 0.85, "bbox": [100, 50, 300, 400]}]}
{"event": "perf_stats", "total_frames": 50, "timings_ms": {"inference": {"avg": 4.1, "p50": 3.9, "p95": 5.2}}}
```
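A consumer of this stream splits stdout on newlines and dispatches on the event field. A minimal sketch, using the sample lines from this section as input (the confidence threshold is an arbitrary illustration):

```python
import json

# Sample stdout lines copied from the protocol examples above.
stdout_lines = [
    '{"event": "ready", "model": "yolo26n_edgetpu", "device": "coral", "format": "edgetpu_tflite", "tpu_count": 1, "classes": 80}',
    '{"event": "detections", "frame_id": 42, "camera_id": "front_door", "objects": [{"class": "person", "confidence": 0.85, "bbox": [100, 50, 300, 400]}]}',
    '{"event": "perf_stats", "total_frames": 50, "timings_ms": {"inference": {"avg": 4.1, "p50": 3.9, "p95": 5.2}}}',
]

people = []
for raw in stdout_lines:
    msg = json.loads(raw)
    if msg["event"] == "detections":
        # Keep only confident person detections; ready/perf_stats pass through.
        people += [o for o in msg["objects"]
                   if o["class"] == "person" and o["confidence"] >= 0.5]

print(people)
```

Because each line is a complete JSON object, the stream can be processed incrementally without buffering the whole output.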
Bounding Box Format
[x_min, y_min, x_max, y_max] — pixel coordinates (xyxy).
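Some drawing and tracking APIs expect [x, y, width, height] rather than corner coordinates; converting from the xyxy format above is a small helper (the function name is illustrative, not part of the skill):

```python
def xyxy_to_xywh(bbox):
    """Convert [x_min, y_min, x_max, y_max] to [x_min, y_min, width, height]."""
    x_min, y_min, x_max, y_max = bbox
    return [x_min, y_min, x_max - x_min, y_max - y_min]

# Using the bbox from the detections example above.
print(xyxy_to_xywh([100, 50, 300, 400]))  # → [100, 50, 200, 350]
```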
Installation
Linux / macOS
./deploy.sh
The deployer builds the local Python virtual environment and installs the Edge TPU runtime. No Docker required.