---
title: Echo
emoji: 🤖
colorFrom: indigo
colorTo: purple
sdk: static
pinned: false
license: mit
short_description: A companion robot that remembers you and grows with you
tags:
- reachy_mini
- reachy_mini_python_app
- companion
- memory
- ai
---
# Reachy Echo

A companion robot that remembers you and grows with you.
## What Makes Echo Different

Most robot apps treat the robot as a voice interface with decorative movement. Echo is different:
| Feature | Traditional | Echo |
|---|---|---|
| Memory | Forgets everything | Remembers your name, preferences, conversations |
| Initiative | Waits for commands | Greets you, suggests breaks, celebrates wins |
| Models | Single provider | 18+ models via LiteLLM (swap anytime) |
| Movement | Decorative | Communicates emotion and state |
## Quick Start

### Install

```bash
cd ~/apps/reachy/apps/echo
pip install -e .
```

### Run (Simulation Mode)

```bash
python -m reachy_mini_echo --sim
```

### Run (Real Robot)

With the Reachy daemon running:

- Echo appears in the Reachy Mini dashboard
- Or access it directly at http://localhost:7861
## Features

### Memory System

Echo remembers across sessions:
- Your name and preferences
- Past conversations and topics
- Work patterns and habits
Try saying:
- "My name is Alex"
- "I'm a software engineer"
- "What do you know about me?"
### Proactive Behaviors
| Behavior | What it does | When |
|---|---|---|
| Morning Greeting | Personalized hello | First appearance, 6-11am |
| Work Break Reminder | Suggests a stretch | After 2 hours of work |
| Build Celebration | Excited dance | When your code builds |
| Build Support | Sympathetic response | When builds fail |
| Return Greeting | Welcome back | After 30+ min absence |
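
Each row pairs a trigger (the "When" column) with a behavior (the "What it does" column). As a rough illustration of the trigger side, here is a hedged sketch of a morning-greeting check; the class and method names are assumptions, not the actual API in `proactive/`.

```python
from datetime import datetime, date

class MorningGreetingTrigger:
    """Fires once per day, the first time the user shows up between 6:00 and 11:00."""

    def __init__(self) -> None:
        self.last_fired: date | None = None

    def should_fire(self, now: datetime | None = None) -> bool:
        now = now or datetime.now()
        # Only fire inside the morning window, and at most once per calendar day.
        if 6 <= now.hour < 11 and self.last_fired != now.date():
            self.last_fired = now.date()
            return True
        return False
```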
### Model Selection
Switch between 18+ models instantly:
| Model | Best for |
|---|---|
| llama-3.3-70b-cerebras | Fast local inference (default) |
| claude-opus-4.5-openrouter | Most capable |
| gpt-5.2-openrouter | Latest GPT |
| gemini-3-pro-openrouter | Multimodal |
| qwen-3-235b-cerebras | Large context |
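
Because the models are served through a LiteLLM proxy, swapping one for another is just a different model string against the same OpenAI-compatible endpoint. A minimal sketch (the URL and key fallbacks are placeholders, and this is not necessarily how Echo issues requests internally):

```python
import os
from openai import OpenAI

# LiteLLM's proxy speaks the OpenAI API, so any OpenAI-compatible client works.
client = OpenAI(
    base_url=os.environ.get("LITELLM_URL", "http://localhost:4000"),
    api_key=os.environ.get("LITELLM_API_KEY", "placeholder-key"),
)

# Switching models is a one-string change; the request shape stays identical.
for model in ("llama-3.3-70b-cerebras", "claude-opus-4.5-openrouter"):
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Introduce yourself in one sentence."}],
    )
    print(model, "->", reply.choices[0].message.content)
```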
## Configuration

Environment variables (optional):

```bash
# LiteLLM server (defaults to DGX)
export LITELLM_URL=http://your-litellm-server:4000
export LITELLM_API_KEY=your-key

# Default model
export LITELLM_MODEL=llama-3.3-70b-cerebras
```
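
A sketch of how these variables might be resolved at startup; the fallback values shown are assumptions for illustration, not Echo's actual defaults.

```python
import os
from dataclasses import dataclass, field

def _env(name: str, default: str) -> str:
    return os.environ.get(name, default)

@dataclass(frozen=True)
class EchoConfig:
    """Runtime settings read from the environment, with placeholder fallbacks."""
    litellm_url: str = field(default_factory=lambda: _env("LITELLM_URL", "http://localhost:4000"))
    api_key: str = field(default_factory=lambda: _env("LITELLM_API_KEY", ""))
    model: str = field(default_factory=lambda: _env("LITELLM_MODEL", "llama-3.3-70b-cerebras"))

config = EchoConfig()
print(f"Using {config.model} via {config.litellm_url}")
```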
## UI Overview

```
┌──────────────────────────────────────────────────────┐
│ 🤖 Reachy Echo                                       │
│ A companion that remembers you and grows with you    │
├─────────────────────────────┬────────────────────────┤
│                             │ Status: 🟢 Connected   │
│ Conversation                │                        │
│ ────────────                │ Model: [dropdown]      │
│ User: Hi there!             │                        │
│ Echo: Hello! How are you?   │ Memory:                │
│                             │  Facts: 3 | Sessions: 5│
│                             │                        │
│ [Type a message...] [Send]  │ Proactive Behaviors:   │
│                             │ ☑ Morning Greeting     │
│                             │ ☑ Work Break Reminder  │
│                             │ ☑ Build Celebration    │
│                             │                        │
│                             │ [Clear] [Forget Me]    │
└─────────────────────────────┴────────────────────────┘
```
## Architecture

```
reachy_mini_echo/
├── main.py       # ReachyMiniEcho app class
├── providers/    # LLM backends (LiteLLM)
├── memory/       # SQLite + fact extraction
└── proactive/    # Trigger/behavior engine
```
## Development

See CLAUDE.md for a detailed architecture and development guide.
### Adding Behaviors

1. Create a behavior class in `proactive/behaviors.py`
2. Implement the `execute(echo)` method
3. Create a trigger in `proactive/triggers.py`
4. Register it in `proactive/engine.py` (see the sketch below)
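
A hedged end-to-end sketch of those four steps; apart from `execute(echo)`, which is named above, the class names, method signatures, and `register()` call are assumptions about `proactive/`'s API rather than its documented interface.

```python
# Steps 1 & 2 -- proactive/behaviors.py: a behavior exposing execute(echo).
class StretchReminder:
    def execute(self, echo):
        # `echo.say` is a placeholder for however Echo speaks and moves.
        echo.say("You've been at it a while -- how about a quick stretch?")

# Step 3 -- proactive/triggers.py: decide *when* the behavior should run.
class LongWorkSessionTrigger:
    def __init__(self, threshold_minutes: int = 120):
        self.threshold_minutes = threshold_minutes

    def should_fire(self, state) -> bool:
        return state.minutes_since_break >= self.threshold_minutes

# Step 4 -- proactive/engine.py: pair the trigger with the behavior.
# engine.register(LongWorkSessionTrigger(), StretchReminder())
```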
### Memory Fact Extraction

Echo automatically extracts facts from conversation:

- "My name is X" → stores name
- "I prefer X" → stores preference
- "I work at X" → stores employer
## Privacy

- All data is stored locally in `data/memory.db`
- The "Forget Me" button clears personal data
- No cloud storage of conversations
- Local models available for full privacy
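
With a local SQLite file, "Forget Me" amounts to wiping the tables on disk. A minimal sketch, assuming the `data/memory.db` path from the list above; the clearing logic here is illustrative, not Echo's actual handler.

```python
import sqlite3

def forget_me(db_path: str = "data/memory.db") -> None:
    """Erase locally stored personal data; nothing ever leaves the device."""
    with sqlite3.connect(db_path) as conn:
        tables = [row[0] for row in conn.execute(
            "SELECT name FROM sqlite_master "
            "WHERE type = 'table' AND name NOT LIKE 'sqlite_%'"
        )]
        for table in tables:
            conn.execute(f'DELETE FROM "{table}"')
```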
## License
MIT