Reachy Mini for Home Assistant
Voice · Gestures · Smart Home
v1.0.4

Reachy Mini App

Your robot meets your Home Assistant.

Transform your Reachy Mini Wi-Fi into a voice-controlled smart home hub. Natural conversations, expressive movements, and gesture recognition, all seamlessly connected to Home Assistant.

🎤 Wake Word 👀 Face Tracking 🔄 Body Following 🤚 18 Gestures 🔊 Multi-room Audio ⚡ Zero Config 🃏 Dashboard Card

Before You Start

Requirements

Make sure you have everything ready for a smooth setup.

🤖

Reachy Mini Wi-Fi

This app requires the Wi-Fi version of Reachy Mini; the USB version has not been validated.

🏠

Home Assistant

A running Home Assistant instance.

📶

Same Network

Both Reachy Mini and Home Assistant must be on the same local network.

🎙️

Voice Pipeline

Configure a Voice Assistant pipeline in Home Assistant (STT + TTS + LLM).

Getting Started

Quick Start

Install and connect in under a minute. No configuration needed.

Installation

Up and running in 1 minute

  • 1๏ธโƒฃ Open Reachy Mini Dashboard โ†’ Applications
  • 2๏ธโƒฃ Enable "Show community apps"
  • 3๏ธโƒฃ Install "Reachy Mini for Home Assistant"
  • 4๏ธโƒฃ Home Assistant discovers automatically

How it works

Seamless integration

This Reachy Mini app uses the ESPHome protocol to communicate with Home Assistant; no ESPHome device is needed. Home Assistant discovers it via mDNS and adds the robot entities automatically. Voice commands are processed by your Home Assistant instance: STT, intent recognition, and TTS all happen there.

ESPHome Protocol · mDNS Discovery · Robot Entities · Zero Config
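To make the discovery step concrete, here is a minimal sketch. The service-type constant is the standard ESPHome mDNS service type; the hostname in the comment and the helper function are hypothetical, for illustration only:

```python
# The app advertises itself over mDNS with the standard ESPHome service
# type, so Home Assistant's existing ESPHome discovery picks it up with
# no extra configuration.
ESPHOME_SERVICE = "_esphomelib._tcp.local."

def is_esphome_service(service_type: str) -> bool:
    """True for mDNS records announcing an ESPHome-compatible native API."""
    return service_type == ESPHOME_SERVICE

# With the third-party `zeroconf` package, browsing for the robot could
# look like this (hypothetical hostname "reachy-mini"):
#
#   from zeroconf import Zeroconf, ServiceBrowser
#   class Listener:
#       def add_service(self, zc, type_, name):
#           print("found:", name)  # e.g. "reachy-mini._esphomelib._tcp.local."
#       def update_service(self, zc, type_, name): ...
#       def remove_service(self, zc, type_, name): ...
#   ServiceBrowser(Zeroconf(), ESPHOME_SERVICE, Listener())
```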

Capabilities

Everything you need for smart home control

Zero-configuration robot entities, built-in reactions, and auto-discovery via mDNS.

🎤

Voice Wake

Local wake word detection with MicroWakeWord and OpenWakeWord. Say "Okay Nabu" or "Hey Reachy" to activate.

🏠

Smart Home Control

Full Home Assistant integration. Control lights, switches, climate, media, and anything else in your Home Assistant.

👀

Face Tracking

YOLO-based face detection with body following. Head and body move together naturally to track you during conversations.
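The app's actual controller isn't shown here, but the "head and body move together" behavior can be illustrated with a toy control loop. Everything below is an assumption for illustration (gains, function name, the transfer strategy): the head tracks the face offset quickly, and a fraction of the accumulated head yaw is bled into the body each tick so both end up facing you.

```python
def follow_step(face_offset_deg: float, head_yaw: float, body_yaw: float,
                head_gain: float = 0.5, bleed: float = 0.1):
    """One illustrative control tick: returns updated (head_yaw, body_yaw).

    head_gain and bleed are made-up values, not the app's real parameters.
    """
    head_yaw += head_gain * face_offset_deg  # head reacts fast to the face
    transfer = bleed * head_yaw              # body slowly absorbs head yaw
    return head_yaw - transfer, body_yaw + transfer
```

Iterating this loop keeps the combined head+body yaw converging on the face while the head returns toward a neutral pose, which is one common way to get natural-looking following.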

🤚

Gesture Detection

HaGRID ONNX models recognize hand gestures and publish the detected gesture label and confidence to Home Assistant entities.
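A minimal sketch of what "publish label and confidence" can look like, assuming a confidence threshold and entity ids that are hypothetical, not the app's real ones:

```python
from typing import Optional

CONFIDENCE_THRESHOLD = 0.6  # assumed value, not taken from the app

def to_entity_state(label: str, confidence: float) -> Optional[dict]:
    """Map a raw gesture detection to HA-style sensor states.

    Returns None when the detection is too uncertain to publish.
    Entity ids below are hypothetical examples.
    """
    if confidence < CONFIDENCE_THRESHOLD:
        return None
    return {
        "sensor.reachy_gesture": label,
        "sensor.reachy_gesture_confidence": round(confidence, 2),
    }
```

On the Home Assistant side, an automation could then trigger on the gesture sensor's state changing to a specific label.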

😊

Expressive Motion

Built-in listening, thinking, speaking, timer, and emotion reactions with natural head sway and non-blocking motion during conversations.

📹

Camera Stream

MJPEG video stream exposed as an ESPHome camera entity, for real-time monitoring in the Home Assistant dashboard.
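For intuition about the stream format: MJPEG is just a sequence of JPEG frames, each delimited by the SOI (0xFFD8) and EOI (0xFFD9) markers. A consumer can split the byte stream on those markers. This is a naive sketch (real parsers also handle marker bytes embedded in entropy-coded data):

```python
def extract_frame(buffer: bytes) -> "tuple[bytes | None, bytes]":
    """Return (frame, remaining_buffer) from an MJPEG byte stream.

    frame is None when no complete JPEG is in the buffer yet; the
    unconsumed tail is returned so the caller can keep appending data.
    """
    start = buffer.find(b"\xff\xd8")          # SOI: start of a JPEG frame
    if start == -1:
        return None, b""                      # nothing useful buffered
    end = buffer.find(b"\xff\xd9", start + 2)  # EOI: end of the frame
    if end == -1:
        return None, buffer[start:]           # frame not complete yet
    return buffer[start:end + 2], buffer[end + 2:]
```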

🔊

Multi-room Audio

Sendspin protocol support. Sync audio playback with other speakers throughout your home.

⚡

Zero Configuration

Install and go. mDNS auto-discovery and built-in HA reactions mean the default experience works without extra setup.

🃏

Dashboard Card

Custom Lovelace card for Home Assistant. Real-time 3D visualization of robot pose and status.

🧩

HA Blueprint

Device-first Home Assistant blueprint for presence automations using the current zero-config model: sleep control, idle behavior, and speaker volume.

🚀

Auto Release

Version-driven GitHub release workflow: update the version in pyproject and the changelog, and a release is created automatically.

Updates

Changelog

View older versions