Custom flight‑controller hardware + CV stack for autonomous flight
Role: Systems & firmware · Status: In flight‑tuning
ESP32 · FreeRTOS · Raspberry Pi · SLAM · PID Control · KiCad
Overview
This past summer I became extremely interested in avionics and embedded software development, and what better way to learn than building a flight controller from scratch? I developed a fully custom flight‑controller PCB integrating an IMU, GPS, and cameras, and it communicates with a Raspberry Pi Zero W for vision information. The system combines real‑time embedded control with a companion computer‑vision pipeline to enable reliable autonomous flight on low‑cost hardware. I also designed the quadrotor chassis from scratch in Fusion 360, minimizing 3D‑printed weight and optimizing the center‑of‑mass distribution.
Concept
Split compute: the ESP32 handles deterministic loops (sensing, control, motor PWM) under FreeRTOS, while a Raspberry Pi runs SLAM and obstacle avoidance. The design emphasizes easy tuning and log‑driven iteration.
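To make the split concrete, here is a minimal Python sketch of the Pi‑side downlink, assuming a UART link between the Pi and the ESP32 and a made‑up packet layout (header byte, three float32 pose fields, XOR checksum). The port name and wire format are illustrative assumptions, not the project's actual protocol.

```python
# Hypothetical Pi-side downlink: packs a pose estimate from the vision
# pipeline and ships it to the ESP32 over UART. The packet layout (0xA5
# header, three float32 fields, XOR checksum) is illustrative only.
import struct

import serial  # pyserial

PORT = "/dev/serial0"  # Pi Zero W hardware UART (assumed wiring)
HEADER = 0xA5


def pack_pose(x: float, y: float, z: float) -> bytes:
    payload = struct.pack("<3f", x, y, z)
    checksum = 0
    for b in payload:
        checksum ^= b
    return bytes([HEADER]) + payload + bytes([checksum])


def main():
    link = serial.Serial(PORT, baudrate=115200, timeout=0.01)
    # In the real system this would be fed by the SLAM/obstacle pipeline.
    link.write(pack_pose(0.0, 0.0, 1.2))


if __name__ == "__main__":
    main()
```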
Design Process
Developed the quadrotor chassis in Fusion 360, optimizing for center of gravity and minimizing 3D‑printed weight.
Designed a custom controller PCB (power management, IMU, GPS, barometer, motor drivers) in KiCad.
Implemented multi‑loop PID for attitude/altitude; prioritized the IMU pipeline with task affinities on FreeRTOS (see the control‑loop sketch after this list).
Integrated stereo depth & detection on the Pi; fused estimates for navigation.
Bench‑tested PWM transients and added flyback/EMI mitigation; iterated with black‑box logs.
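To illustrate the multi‑loop structure, below is a minimal Python sketch of one cascaded attitude axis: an outer angle loop commands a body rate, and an inner rate loop commands a throttle correction. The actual firmware runs as C tasks under FreeRTOS; every gain and limit here is a placeholder, not a tuned flight value.

```python
# Minimal cascaded PID sketch for one attitude axis. Gains and output
# limits are placeholders, not the tuned flight values.
class PID:
    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, err, dt):
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * deriv
        return max(-self.out_limit, min(self.out_limit, out))


angle_pid = PID(kp=4.0, ki=0.0, kd=0.0, out_limit=250.0)    # deg -> deg/s
rate_pid = PID(kp=0.08, ki=0.02, kd=0.001, out_limit=1.0)   # deg/s -> throttle delta


def attitude_step(angle_setpoint, angle_meas, rate_meas, dt):
    # Outer loop: angle error produces a body-rate setpoint.
    rate_setpoint = angle_pid.update(angle_setpoint - angle_meas, dt)
    # Inner loop: rate error produces the motor-mix correction.
    return rate_pid.update(rate_setpoint - rate_meas, dt)
```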
Depth pipeline using StereoSGBM/StereoBM with calibration & rectification
Role: Solo developer · Timeline: May–August 2025
OpenCV · Python · Calibration · 3D Reconstruction · Real‑time
Overview
A real‑time stereo depth pipeline built for robotics perception. Produces stable disparity maps on commodity cameras with a full calibration→rectification→disparity→post‑filtering chain.
Concept
Engineer a reproducible CV stack that trades a bit of peak accuracy for consistent, low‑latency results appropriate for embedded navigation.
Design Process
Implemented intrinsic/extrinsic calibration and lens distortion correction.
Performed epipolar rectification and computed StereoSGBM disparity with tunable parameters (see the pipeline sketch after this list).
Benchmarked against Middlebury; prepared Kalman temporal smoothing hooks (a placeholder smoother follows the pipeline sketch).
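A condensed sketch of the rectification→disparity stage is below. It assumes the per‑camera intrinsics and stereo extrinsics were already produced by cv2.calibrateCamera / cv2.stereoCalibrate and saved to a hypothetical stereo_calib.npz; the resolution and SGBM parameters are stand‑ins for the tuned values.

```python
# Condensed rectification -> disparity stage. Assumes per-camera intrinsics
# (K1, D1, K2, D2) and stereo extrinsics (R, T) were already computed by
# cv2.calibrateCamera / cv2.stereoCalibrate and saved to "stereo_calib.npz"
# (a hypothetical filename).
import cv2
import numpy as np

calib = np.load("stereo_calib.npz")
K1, D1, K2, D2 = calib["K1"], calib["D1"], calib["K2"], calib["D2"]
R, T = calib["R"], calib["T"]
size = (640, 480)  # capture resolution, placeholder

# alpha=0 crops to valid pixels only; raising alpha keeps more of the frame
# at the cost of the black borders discussed under Pitfalls.
R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(
    K1, D1, K2, D2, size, R, T, alpha=0)
map1x, map1y = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
map2x, map2y = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)

sgbm = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=128,    # must be divisible by 16
    blockSize=5,
    P1=8 * 1 * 5 ** 2,     # smoothness penalties, OpenCV's usual heuristic
    P2=32 * 1 * 5 ** 2,    # (cn=1 because we match on grayscale below)
    uniquenessRatio=10,
    speckleWindowSize=100,
    speckleRange=2,
)


def depth_from_pair(left_bgr, right_bgr):
    left = cv2.remap(left_bgr, map1x, map1y, cv2.INTER_LINEAR)
    right = cv2.remap(right_bgr, map2x, map2y, cv2.INTER_LINEAR)
    left_g = cv2.cvtColor(left, cv2.COLOR_BGR2GRAY)
    right_g = cv2.cvtColor(right, cv2.COLOR_BGR2GRAY)
    # SGBM returns fixed-point disparity scaled by 16.
    disp = sgbm.compute(left_g, right_g).astype(np.float32) / 16.0
    points = cv2.reprojectImageTo3D(disp, Q)  # metric point cloud
    return disp, points
```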
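The Kalman hooks themselves aren't shown here; as a placeholder for where they would plug in, a simple per‑pixel exponential smoother looks like this (the blend factor is a guess):

```python
# Placeholder temporal filter: per-pixel exponential smoothing standing in
# for where a per-pixel Kalman update would plug in. alpha is a guess.
import numpy as np


class DisparitySmoother:
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None

    def update(self, disp):
        # With minDisparity=0, SGBM marks invalid pixels as negative.
        valid = disp > 0
        if self.state is None:
            self.state = disp.astype(np.float32).copy()
            return self.state
        blended = (1 - self.alpha) * self.state + self.alpha * disp
        self.state = np.where(valid, blended, self.state)
        return self.state
```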
Specifications
Language: Python + OpenCV
Modes: SGBM / BM
Artifacts: Rectified pairs, disparity maps
Outputs: Depth estimates & point clouds
Results
Consistent depth maps validated on public datasets and live feeds.
Ready for integration into on‑robot navigation nodes.
Rough proof of concept, using cheap Amazon webcams mounted together in a 3D‑printed rig.
Pitfalls & Challenges
Undocumented cameras: The Amazon listing lacked sensor/lens details, so I disassembled the webcams to identify the sensor and track down the manufacturer datasheet. I used those specs (sensor size, focal characteristics, etc.) to sanity‑check and improve intrinsic calibration.
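For reference, that sanity check amounts to one line of arithmetic: the focal length in pixels predicted by the datasheet (fx ≈ f_mm · width_px / sensor_width_mm) should land near the calibrated fx. The numbers below are placeholders, not the actual sensor's specs:

```python
# Datasheet sanity check: predicted focal length in pixels vs. calibrated fx.
# Sensor/lens numbers below are placeholders, not the actual webcam's specs.
sensor_width_mm = 3.6   # active sensor width from the datasheet
lens_focal_mm = 2.8     # nominal lens focal length
image_width_px = 640    # capture width

predicted_fx = lens_focal_mm * image_width_px / sensor_width_mm
print(f"expected fx ~= {predicted_fx:.0f} px")  # compare against K1[0, 0]
```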
Mechanical misalignment from the mount: A 3D‑printed stereo rig was difficult to keep consistent due to printer tolerances and warping. I iterated by adding small shims (tape) to adjust pitch/yaw in ~0.5–1° steps until epipolar lines were close.
Software rectification trade‑offs: Because the cameras weren’t perfectly aligned, I relied heavily on software rectification to enforce horizontal alignment. The large black borders in the rectified frames are the cost of cropping/warping to keep epipolar geometry correct and disparity stable.
Baseline & FOV constraints: With cheap webcams, small baseline changes and lens variation noticeably impacted depth noise. I had to tune baseline spacing and SGBM parameters to balance depth range vs. flicker and speckle artifacts.
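The underlying trade‑off follows from the stereo depth relation Z = f·B/d: a disparity error of Δd pixels produces a depth error of roughly ΔZ ≈ Z²·Δd/(f·B), so noise grows quadratically with range and shrinks with baseline. A quick back‑of‑the‑envelope with placeholder numbers:

```python
# Depth noise vs. baseline: Z = f * B / d, so a disparity error of delta_d
# pixels maps to roughly delta_Z = Z**2 * delta_d / (f * B).
# All numbers are placeholders for illustration.
f_px = 500.0      # focal length in pixels
delta_d = 0.5     # subpixel matching error (px)

for baseline_m in (0.06, 0.12):
    for z_m in (1.0, 3.0):
        dz = z_m ** 2 * delta_d / (f_px * baseline_m)
        print(f"B={baseline_m:.2f} m, Z={z_m:.0f} m -> depth noise ~{dz:.3f} m")
```

Doubling the baseline halves the depth noise at a given range, which is why small spacing changes on a cheap rig were so noticeable.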