
Autonomous Drone Navigation System

Custom flight‑controller hardware + CV stack for autonomous flight

Role: Systems & firmware · Status: In flight‑tuning
ESP32 · FreeRTOS · Raspberry Pi · SLAM · PID Control · KiCad

Overview

This past summer I became deeply interested in avionics and embedded software development, and what better way to learn than building a flight controller from scratch? I developed a fully custom flight controller PCB integrating an IMU, GPS, and camera support, communicating with a Raspberry Pi Zero W for vision data. The system combines real‑time embedded control with a companion computer‑vision pipeline to enable reliable autonomous flight on low‑cost hardware. I also designed the quadrotor chassis from scratch in Fusion 360, minimizing 3D‑printed weight and optimizing the center‑of‑mass distribution.

Concept

Split compute: the ESP32 handles deterministic loops (sensing, control, motor PWM) under FreeRTOS, while a Raspberry Pi runs SLAM and obstacle avoidance. The design emphasizes easy tuning and log‑driven iteration.

Design Process

  • Developed the quadrotor chassis in Fusion 360, optimizing the center of gravity and minimizing 3D‑printed weight.
  • Designed a custom controller PCB (power management, IMU, GPS, barometer, motor drivers) in KiCad.
  • Implemented multi‑loop PID for attitude/altitude; prioritized IMU pipeline with task affinities on FreeRTOS.
  • Integrated stereo depth & detection on the Pi; fused estimates for navigation.
  • Bench‑tested PWM transients and added flyback/EMI mitigation; iterated with black‑box logs.
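The multi‑loop PID structure can be sketched as a cascade: an outer angle loop produces a rate setpoint, and an inner rate loop produces the motor command. This is an illustrative Python sketch only; the actual firmware runs in C on the ESP32, and the gains and limits below are placeholder values, not the tuned ones.

```python
class PID:
    """Simple PID with output clamping; dt is supplied per update."""
    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * deriv
        return max(-self.out_limit, min(self.out_limit, out))

# Cascaded attitude control: the outer loop turns angle error into a
# rate setpoint; the inner loop turns rate error into a torque command.
angle_pid = PID(kp=4.0, ki=0.0, kd=0.0, out_limit=200.0)  # deg -> deg/s
rate_pid = PID(kp=0.8, ki=0.2, kd=0.02, out_limit=1.0)    # deg/s -> normalized torque

def attitude_step(target_angle, angle, rate, dt):
    rate_sp = angle_pid.update(target_angle, angle, dt)
    return rate_pid.update(rate_sp, rate, dt)
```

On the real controller each axis (roll, pitch, yaw) gets its own cascade, and the inner loop runs at a higher rate than the outer one.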

Specifications

MCU: ESP32 (FreeRTOS)
Companion: Raspberry Pi
Sensors: IMU (MPU‑6500), GPS (GY‑NEO6MV2), Barometer (BMP280)
Comms: I2C/SPI/UART, PWM ESCs
Vision: Stereo depth + detection
Planning: A*/RRT (roadmap)
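The A* entry on the planning roadmap can be illustrated with a minimal grid search. This is a hypothetical Python sketch (4‑connected occupancy grid, unit step cost, Manhattan heuristic), not the onboard planner itself:

```python
import heapq
from itertools import count

def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (0 = free, 1 = blocked).
    Returns the path as a list of (row, col) cells, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan distance
    tie = count()  # tiebreaker so the heap never compares cells/parents
    open_set = [(h(start), next(tie), start, None)]
    g_cost = {start: 0}
    came_from = {}
    while open_set:
        _, _, cell, parent = heapq.heappop(open_set)
        if cell in came_from:
            continue  # already expanded with a better cost
        came_from[cell] = parent
        if cell == goal:
            path = []
            while cell is not None:  # walk parents back to start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g_cost[cell] + 1
                if ng < g_cost.get(nb, float("inf")):
                    g_cost[nb] = ng
                    heapq.heappush(open_set, (ng + h(nb), next(tie), nb, cell))
    return None
```

In the drone's case the grid would come from the SLAM/obstacle‑avoidance stack on the Pi, with diagonal moves and inflated obstacle costs added on top of this skeleton.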

Results

  • Hardware assembled and validated; closed‑loop stability achieved on bench.
  • Vision stack integrated; flight envelope characterization in progress.
  • Self‑navigation using A* and Pathweaver is currently being implemented.
Figures: custom flight controller PCB; final custom drone build.

Computer Vision: Real‑Time Stereo Depth Estimation

Depth pipeline using StereoSGBM/StereoBM with calibration & rectification

Role: Solo developer · Timeline: May–August 2025
OpenCV · Python · Calibration · 3D Reconstruction · Real‑time

Overview

A real‑time stereo depth pipeline built for robotics perception. Produces stable disparity maps on commodity cameras with a full calibration→rectification→disparity→post‑filtering chain.

Concept

Engineer a reproducible CV stack that trades a bit of peak accuracy for consistent, low‑latency results appropriate for embedded navigation.

Design Process

  • Implemented intrinsic/extrinsic calibration and lens distortion correction.
  • Performed epipolar rectification and StereoSGBM disparity with tunable parameters.
  • Benchmarked against Middlebury; prepared Kalman temporal smoothing hooks.
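The idea behind block matching on rectified pairs can be sketched with a naive sum‑of‑absolute‑differences search. This is a conceptual NumPy sketch, not the OpenCV StereoBM/StereoSGBM implementation; the window size and disparity range are placeholder values:

```python
import numpy as np

def sad_disparity(left, right, max_disp=16, block=5):
    """Naive SAD block matching on rectified grayscale pairs.
    For each pixel, slide a window along the same row of the right
    image and pick the horizontal shift with the lowest SAD cost."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            costs = [
                np.abs(patch - right[y - half:y + half + 1,
                                     x - d - half:x - d + half + 1]).sum()
                for d in range(max_disp)
            ]
            disp[y, x] = int(np.argmin(costs))  # winning shift = disparity
    return disp
```

Real implementations add sub‑pixel interpolation, uniqueness checks, and (for SGBM) smoothness costs aggregated along multiple paths, which is what makes the maps usable on textureless regions.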

Specifications

Language: Python + OpenCV
Modes: SGBM / BM
Artifacts: Rectified pairs, disparity maps
Outputs: Depth estimates & point clouds
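The step from disparity maps to depth estimates follows the standard stereo relation Z = f·B/d (focal length in pixels times baseline, divided by disparity). A small sketch with assumed, illustrative camera values:

```python
import numpy as np

def disparity_to_depth(disp, focal_px, baseline_m, min_disp=0.5):
    """Convert a disparity map (pixels) to metric depth via Z = f * B / d.
    Disparities below min_disp are treated as invalid (depth = inf)."""
    disp = np.asarray(disp, dtype=np.float32)
    depth = np.full_like(disp, np.inf)
    valid = disp >= min_disp
    depth[valid] = (focal_px * baseline_m) / disp[valid]
    return depth

# Example with assumed values: 700 px focal length, 6 cm baseline.
# A 21 px disparity then corresponds to 700 * 0.06 / 21 = 2.0 m.
```

Small disparities correspond to distant points, so depth error grows quadratically with range; that is why the validity threshold matters for navigation use.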

Results

  • Consistent depth maps validated on public datasets and live feeds.
  • Ready for integration into on‑robot navigation nodes.
  • Rough proof of concept, using cheap Amazon webcams mounted together in a 3D‑printed rig.
Figures: Tsukuba stereo input pair; ground‑truth disparity; left and right camera feeds.

Garage Parking Assistant

Ultrasonic‑guided parking with real‑time distance feedback

Role: Embedded developer
Arduino · Ultrasonic Sensors · Embedded

Overview

A garage aid that measures bumper‑to‑wall distance via ultrasonic sensing and provides intuitive feedback cues for consistent, safe parking.

Concept

Low‑cost, reliable sensing with debounced reads and basic smoothing for stable display—no network or app required.

Design Process

  • Implemented sensor polling loop with outlier rejection and moving‑average smoothing.
  • Designed simple visual feedback logic and thresholds for driver cues.
  • Iterated mounting geometry to reduce spurious reflections.
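The outlier rejection and moving‑average smoothing in the polling loop can be sketched as follows. This is illustrative Python (the actual loop runs on the Arduino), and the window size and thresholds are placeholder values:

```python
from collections import deque

class RangeFilter:
    """Moving-average smoother with simple outlier rejection: a new
    reading is discarded if it jumps more than max_step away from the
    current average (once the window has data)."""
    def __init__(self, window=5, max_step=30.0):
        self.buf = deque(maxlen=window)
        self.max_step = max_step

    def update(self, reading_cm):
        if self.buf and abs(reading_cm - self.value()) > self.max_step:
            return self.value()  # reject the spike, keep last estimate
        self.buf.append(reading_cm)
        return self.value()

    def value(self):
        return sum(self.buf) / len(self.buf)

def zone(distance_cm, stop_at=40.0, caution=100.0):
    """Map a smoothed distance onto driver-cue thresholds."""
    if distance_cm <= stop_at:
        return "STOP"
    return "SLOW" if distance_cm <= caution else "GO"
```

Ultrasonic rangefinders occasionally return spurious near or far echoes off angled surfaces, which is exactly what the spike rejection absorbs before the LED thresholds are applied.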

Specifications

MCU: Arduino
Sensor: Ultrasonic rangefinder
Loop: Debounced + smoothed readings
UI: LED/indicator thresholds

Results

  • More consistent parking stops; reduces bumper contact in tight spaces.

Contact

Open to internships/co-ops in robotics, embedded systems, and computer vision. Email me · GitHub