Ukrainian controls Home Assistant over LoRa radio when their power grid goes down
Source: Hacker News
Introduction
Hey r/homeassistant,
I live in Ukraine. Russia regularly attacks our power grid — when it goes down, internet and cell towers follow within hours. My Home Assistant keeps running on battery backup, but I can’t reach it from outside. So I built a radio bridge.
How it works
Two Lilygo T‑Echo radios (~$30 each, LoRa 433 MHz, Meshtastic firmware). One is plugged into my Mac mini via USB, the other is portable with me. A Python listener daemon sits between the radio and Home Assistant, routing commands and returning sensor data — all over encrypted LoRa. HA runs on a Home Assistant Green.
What I can do from the radio
Smart home control
- Turn lights on/off
- Check temperature from Aqara sensors (three around the house)
- Check power status — grid on/off, battery levels (EcoFlow, Zendure)
- Check who’s home
Voice messages (the fun part)
- Type SAY: Привіт, я скоро буду вдома (Hey, I’ll be home soon) on the T‑Echo.
- Listener calls tts.google_translate with the Ukrainian language option.
- HA Voice PE speaker reads it aloud at home.
Zero internet. Just radio → Mac mini → HA TTS → speaker.
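The SAY: path could look roughly like this, using HA's tts.speak REST service. The base URL, token, and the concrete media_player entity id are placeholders, not the author's exact values:

```python
import json
import urllib.request

HA_URL = "http://homeassistant.local:8123"  # placeholder: your HA address
HA_TOKEN = "YOUR_LONG_LIVED_TOKEN"          # placeholder: long-lived access token

def build_tts_payload(text: str) -> dict:
    """Strip the SAY: prefix and build the tts.speak service payload."""
    message = text[len("SAY:"):].strip()
    return {
        "entity_id": "tts.google_translate_en_com",
        # hypothetical concrete id for the Voice PE speaker:
        "media_player_entity_id": "media_player.home_assistant_voice_pe",
        "message": message,
        "language": "uk",
    }

def speak(text: str) -> None:
    """POST the payload to HA's tts.speak service (needs LAN access to HA)."""
    req = urllib.request.Request(
        f"{HA_URL}/api/services/tts/speak",
        data=json.dumps(build_tts_payload(text)).encode(),
        headers={"Authorization": f"Bearer {HA_TOKEN}",
                 "Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```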
Camera snapshots
- Ask “what’s outside?” via radio or Discord.
- Listener grabs snapshots from Tapo C120 + C100 (via HA camera proxy API).
- Runs them through a local vision model (gemma3:12b on Ollama).
- Sends a text description, e.g., “5 cars parked, no people, snowy”.
- Hourly automated monitoring logs everything.
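The snapshot-to-description step pairs HA's camera_proxy endpoint with Ollama's /api/generate vision call (vision models accept base64 images). A minimal sketch, where the URLs and token are assumptions and only the model and entity names come from the setup above:

```python
import base64
import json
import urllib.request

HA_URL = "http://homeassistant.local:8123"  # placeholder
OLLAMA_URL = "http://localhost:11434"       # Ollama's default port

def build_vision_request(jpeg_bytes: bytes,
                         prompt: str = "Describe what you see outside.") -> dict:
    """Ollama /api/generate body with the snapshot attached as base64."""
    return {
        "model": "gemma3:12b",
        "prompt": prompt,
        "images": [base64.b64encode(jpeg_bytes).decode()],
        "stream": False,
    }

def describe_camera(token: str,
                    entity: str = "camera.tapo_c120_hd_stream") -> str:
    # 1) grab the snapshot through HA's camera proxy
    req = urllib.request.Request(
        f"{HA_URL}/api/camera_proxy/{entity}",
        headers={"Authorization": f"Bearer {token}"},
    )
    jpeg = urllib.request.urlopen(req).read()
    # 2) hand it to the local vision model and return its text answer
    body = json.dumps(build_vision_request(jpeg)).encode()
    resp = urllib.request.urlopen(f"{OLLAMA_URL}/api/generate", data=body)
    return json.loads(resp.read())["response"]
```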
Proactive alerts
- The AI monitors power status.
- Power goes out → LoRa message to my radio within seconds, plus battery levels and temperature.
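The alerting logic can be sketched with a couple of hypothetical helpers: fire only on the on-to-off transition, keep the message small (LoRa payloads should stay well under ~200 bytes), and drop it as a .msg file for the radio transmit loop to pick up, matching the outbox convention in the architecture diagram.

```python
import time
from pathlib import Path

def detect_outage(prev_state: str, new_state: str) -> bool:
    """Fire only on the on-to-off transition, not on every poll."""
    return prev_state == "on" and new_state == "off"

def format_alert(battery_pct: int, temp_c: float) -> str:
    """Compact single-line alert; small payloads suit LoRa's tiny frames."""
    return f"GRID DOWN | battery {battery_pct}% | indoor {temp_c:.1f}C"

def queue_alert(outbox: Path, battery_pct: int, temp_c: float) -> Path:
    """Write the alert as a .msg file; the LoRa TX loop drains the outbox."""
    path = outbox / f"{int(time.time())}.msg"
    path.write_text(format_alert(battery_pct, temp_c))
    return path
```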
The HA integration
The listener talks to HA through the REST API:
- GET /api/states/{entity_id} — read sensors
- POST /api/services/{domain}/{service} — control devices
- GET /api/camera_proxy/{camera_entity} — grab snapshots
- POST /api/services/tts/speak — voice messages
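Those endpoints can be wrapped in a thin client like this; the class and method names are illustrative, not the author's code:

```python
import json
import urllib.request

class HAClient:
    """Thin wrapper over the Home Assistant REST endpoints the listener uses."""

    def __init__(self, base_url: str, token: str):
        self.base = base_url.rstrip("/")
        self.headers = {"Authorization": f"Bearer {token}",
                        "Content-Type": "application/json"}

    def url_for_state(self, entity_id: str) -> str:
        return f"{self.base}/api/states/{entity_id}"

    def url_for_service(self, domain: str, service: str) -> str:
        return f"{self.base}/api/services/{domain}/{service}"

    def get_state(self, entity_id: str) -> dict:
        """GET /api/states/{entity_id} and decode the JSON state object."""
        req = urllib.request.Request(self.url_for_state(entity_id),
                                     headers=self.headers)
        return json.loads(urllib.request.urlopen(req).read())

    def call_service(self, domain: str, service: str, data: dict) -> None:
        """POST /api/services/{domain}/{service} with a JSON body."""
        req = urllib.request.Request(self.url_for_service(domain, service),
                                     data=json.dumps(data).encode(),
                                     headers=self.headers)
        urllib.request.urlopen(req)
```

Usage is then e.g. `ha.call_service("light", "turn_on", {"entity_id": "light.kitchen"})` for the "turn lights on/off" commands above.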
Incoming radio messages are classified by a local LLM (phi4‑mini) — “is this a smart home command, a question, or a TTS request?” — then routed to the appropriate HA service or to a larger model (gemma3:12b) for general questions.
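The classification step might be sketched like this; the prompt wording and the fallback-to-question rule are my assumptions, not the author's exact logic, but the request shape matches Ollama's /api/generate endpoint:

```python
ROUTES = {"command", "question", "tts"}

def build_classifier_request(message: str) -> dict:
    """Ollama /api/generate body asking phi4-mini for a one-word route."""
    return {
        "model": "phi4-mini",
        "prompt": ("Reply with exactly one word - command, question, or tts - "
                   f"for this radio message:\n{message}"),
        "stream": False,
    }

def parse_route(model_output: str) -> str:
    """Normalize the model's answer; fall back to 'question' if unclear."""
    words = model_output.strip().lower().split()
    if not words:
        return "question"
    word = words[0].strip(".,!")
    return word if word in ROUTES else "question"
```

A "command" route goes to the HA service call, "tts" to the speaker, and "question" on to the larger gemma3:12b model.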
Architecture
T-Echo (portable)
│ LoRa 433 MHz, encrypted
▼
T-Echo (USB) → Mac mini
│
├── SAY: prefix → tts.google_translate → Voice PE speaker
├── Smart home → Home Assistant REST API
├── Camera → camera_proxy → gemma3 vision → description
├── AI questions → phi4-mini → gemma3:12b (local via Ollama)
└── Alerts → outbox .msg files → LoRa TX
Why this matters
HA on battery backup is great, but useless if you can’t reach it. The radio bridge means:
- No dependency on Wi‑Fi, internet, or cell towers
- Encrypted communication (Meshtastic PSK)
- ~1–3 km urban range with stock T‑Echo antenna (extendable with mesh nodes)
- Total cost: ~$60 for two radios
Entities I use
- camera.tapo_c120_hd_stream / camera.tapo_c100_hd_stream — snapshots
- tts.google_translate_en_com (with language: "uk") — Ukrainian TTS
- media_player.home_assistant_voice_* — the speaker
- binary_sensor.tapo_c120_person_detection — triggers
- Aqara temperature sensors
- Power grid status sensor (via Yasno integration and Meross Smart Plug as a sensor)
- EcoFlow battery levels
Stack
- Home Assistant — the heart of it all
- HA Voice PE — TTS output speaker
- Tapo C120 + C100 — security cameras
- Meshtastic on Lilygo T‑Echo (433 MHz)
- Ollama — local AI models
- OpenClaw — AI agent framework
- Mac mini M4 — server on battery backup
Happy to answer questions about the HA setup.