Advent of AI 2025 - Day 5: I Built a Touchless Flight Tracker You Control With Hand Gestures

Published: December 8, 2025 at 12:45 AM EST
3 min read
Source: Dev.to

Overview

AI helped write this post, but I’ve edited it. These are meant to be quick posts related to the Advent of AI; I don’t have time to spend a couple of hours on each daily entry. 😅

For Goose’s Day 5 of Advent of AI, the challenge was to build “The Homecoming Board.” It’s a gesture‑controlled flight‑arrival display where people wearing gloves and mittens can navigate using hand gestures—no touching screens in the freezing cold. The requirements were:

  • At least two distinct gestures for navigation
  • Real‑time flight data
  • Audio feedback for gesture recognition (nice‑to‑have)

Tech Stack

  • TanStack Start – React + TypeScript with SSR
  • MediaPipe – Hand‑tracking and gesture recognition (GitHub)
  • OpenSky Network API – Real‑time flight data (API docs)

I chose TanStack Start because I’d already used it for a significant project (the Pomerium MCP app demo).

Implementation Details

Hand Tracking with MediaPipe

Getting MediaPipe running in the browser was a bit clunky at first. I tried TensorFlow.js, but eventually settled on the MediaPipe WASM runtime because it’s pure client‑side and deploys easily to Netlify.

MediaPipe integration example

// useMediaPipe.ts – custom hook for MediaPipe integration
import { useMemo } from 'react';
import { Hands } from '@mediapipe/hands';

export const useMediaPipe = () => {
  // Memoize so a new Hands instance isn't created on every render
  const hands = useMemo(() => {
    const instance = new Hands({
      // Serve the WASM and model files from our own /mediapipe folder
      locateFile: (file) => `/mediapipe/${file}`,
    });

    instance.setOptions({
      maxNumHands: 2,
      modelComplexity: 1,
      minDetectionConfidence: 0.7,
      minTrackingConfidence: 0.5,
    });

    return instance;
  }, []);

  return hands;
};

The hand tracking runs at 30–60 FPS with landmark visualization. I mirror both the video feed and the landmarks so the interaction feels natural. A quirk I ran into was the green skeleton overlay appearing on my head—MediaPipe was mistakenly detecting facial features as hands. I fixed it by rendering the skeleton only when at least one hand is detected.
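
For reference, here's a minimal sketch of that guard inside an onResults callback, using the drawing helpers from @mediapipe/drawing_utils; the canvas wiring and colors are illustrative assumptions, not the app's exact code.

// Sketch only: canvas handling and styling are illustrative
import { drawConnectors, drawLandmarks } from '@mediapipe/drawing_utils';
import { HAND_CONNECTIONS, Results } from '@mediapipe/hands';

const onResults = (results: Results, canvasCtx: CanvasRenderingContext2D) => {
  canvasCtx.clearRect(0, 0, canvasCtx.canvas.width, canvasCtx.canvas.height);

  // Draw the skeleton only when at least one hand is actually detected;
  // this is what stops the overlay from latching onto faces
  if (!results.multiHandLandmarks?.length) return;

  for (const landmarks of results.multiHandLandmarks) {
    drawConnectors(canvasCtx, landmarks, HAND_CONNECTIONS, { color: '#00FF00' });
    drawLandmarks(canvasCtx, landmarks, { color: '#FF0000', radius: 3 });
  }
};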

Gesture Detection

The challenge asked for at least two gestures; I implemented four:

  • Closed fist
  • Open palm
  • Thumbs up
  • Thumbs down

Both hands are detected independently, so each hand can make a different gesture at the same time.

// Finger curl ratio: distance(tip, wrist) / distance(knuckle, wrist)
const fingerCurl = (finger: FingerLandmarks, wrist: Point) => {
  const tipDist = distance(finger.tip, wrist);
  const knuckleDist = distance(finger.knuckle, wrist);
  return tipDist / knuckleDist;
};

const classifyGesture = (handLandmarks: HandLandmarks) => {
  const curls = handLandmarks.fingers.map(f => fingerCurl(f, handLandmarks.wrist));

  // A ratio above ~0.8 means the finger is extended; below means curled
  const isFist = curls.every(c => c < 0.8);
  const isOpenPalm = curls.every(c => c > 0.8);
  // Thumb extended while the other four fingers are curled
  const thumbOut = curls[0] > 0.8 && curls.slice(1).every(c => c < 0.8);
  // Assumption: distinguish up vs. down by the thumb tip's y relative
  // to the wrist (image y grows downward)
  const isThumbsUp = thumbOut && handLandmarks.fingers[0].tip.y < handLandmarks.wrist.y;
  const isThumbsDown = thumbOut && handLandmarks.fingers[0].tip.y > handLandmarks.wrist.y;

  if (isFist) return 'fist';
  if (isOpenPalm) return 'open-palm';
  if (isThumbsUp) return 'thumbs-up';
  if (isThumbsDown) return 'thumbs-down';
  return 'unknown';
};

The algorithm uses simple thresholding on finger‑curl ratios. I added a small “gesture training” step that lets users calibrate the thresholds to their hand shape, improving robustness across different users and glove types.
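
As a sketch of the idea (the helper names and the midpoint heuristic here are mine, not the app's exact code): sample curl ratios while the user holds a fist and then an open palm, and split the difference.

// Hypothetical calibration sketch: derive a per-user curl threshold
type CalibrationSample = number[]; // curl ratios for the five fingers

const calibrateThreshold = (
  fistSamples: CalibrationSample[],
  palmSamples: CalibrationSample[],
): number => {
  const mean = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;

  // Average curl across all fingers and frames for each pose
  const fistMean = mean(fistSamples.flat());
  const palmMean = mean(palmSamples.flat());

  // Anything below the midpoint counts as "curled" for this user
  return (fistMean + palmMean) / 2;
};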

Flight Data Integration

I proxy requests to the OpenSky API through a tiny serverless function to avoid CORS issues and to cache results with TanStack Query. The UI displays arrival times, gate numbers, and flight status, updating in near‑real time.
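
On the client, that might look roughly like the following; the /api/flights route and the Flight shape are assumptions (the field names mirror OpenSky's arrivals endpoint), not the app's exact code.

// Sketch: query the serverless proxy with TanStack Query caching
import { useQuery } from '@tanstack/react-query';

// Assumed response shape, based on OpenSky's arrivals fields
interface Flight {
  callsign: string;
  estArrivalAirport: string;
  lastSeen: number; // Unix timestamp of the arrival
}

export const useArrivals = (airport: string) =>
  useQuery<Flight[]>({
    queryKey: ['arrivals', airport],
    queryFn: async () => {
      const res = await fetch(`/api/flights?airport=${airport}`);
      if (!res.ok) throw new Error(`Flight API error: ${res.status}`);
      return res.json();
    },
    staleTime: 60_000,       // serve cached data for up to a minute
    refetchInterval: 60_000, // poll for near-real-time updates
  });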

Audio Feedback

Each recognized gesture triggers a short audio cue (with an option to mute). This gives users immediate confirmation, which is especially useful when wearing mittens that block visual cues.
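
A minimal version of such a cue can be built with the Web Audio API; the per-gesture frequencies below are illustrative, not the app's actual sounds.

// Sketch: short oscillator beep per gesture, with a mute toggle
const audioCtx = new AudioContext();
let muted = false;

const playGestureCue = (gesture: string) => {
  if (muted) return;

  // Illustrative pitches; any distinct tone per gesture would do
  const freq: Record<string, number> = {
    'fist': 330,
    'open-palm': 440,
    'thumbs-up': 550,
    'thumbs-down': 220,
  };

  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();
  osc.frequency.value = freq[gesture] ?? 440;
  osc.connect(gain).connect(audioCtx.destination);

  // Quick exponential fade-out avoids audible clicks
  gain.gain.setValueAtTime(0.2, audioCtx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, audioCtx.currentTime + 0.15);
  osc.start();
  osc.stop(audioCtx.currentTime + 0.15);
};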

UI & Theming

  • Light and dark winter themes, WCAG AAA compliant
  • Camera selector for devices with multiple cameras (see the sketch after this list)
  • Mobile‑friendly layout (works well on phones and tablets)
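
The camera selector can be built on the standard MediaDevices API; here's a minimal sketch.

// Sketch: enumerate video inputs and open a specific camera
const listCameras = async (): Promise<MediaDeviceInfo[]> => {
  // Labels are only populated after the user grants camera permission
  await navigator.mediaDevices.getUserMedia({ video: true });
  const devices = await navigator.mediaDevices.enumerateDevices();
  return devices.filter((d) => d.kind === 'videoinput');
};

const startCamera = (deviceId: string): Promise<MediaStream> =>
  navigator.mediaDevices.getUserMedia({
    video: { deviceId: { exact: deviceId } },
  });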

Results & Takeaways

  • Real‑time hand tracking with MediaPipe’s WASM runtime
  • Four reliable gestures, detected independently for each hand
  • Live flight data from OpenSky Network with smart caching
  • Audio feedback for every gesture (toggleable)
  • Optional gesture‑training step for personalized accuracy
  • Fully responsive UI with high‑contrast themes

Building a computer‑vision‑powered app turned out to be more accessible than I expected, thanks to MediaPipe’s client‑side WASM build. The biggest challenge was fine‑tuning the gesture thresholds, but a quick calibration step solved most edge cases. Overall, the Homecoming Board meets (and exceeds) the Advent of AI Day 5 requirements while providing a smooth, touch‑free experience for cold‑weather travelers.
