roll-axis PID tuning · first outdoor flight test · PID timing test

Custom flight-controller hardware + CV stack for autonomous flight
Role: Systems & firmware · Status: In flight-tuning
ESP32 · FreeRTOS · Raspberry Pi · SLAM · PID Control · KiCad

Overview
This past summer I became deeply interested in avionics and embedded software development, and what better way to learn than building a flight controller from scratch? I developed a fully custom flight-controller PCB integrating an IMU, GPS, and cameras, which communicates with a Raspberry Pi Zero W that supplies vision data. The system combines real-time embedded control with a companion computer-vision pipeline to enable reliable autonomous flight on low-cost hardware. I also designed the quadrotor chassis from scratch in Fusion 360, minimizing 3D-printed weight and optimizing the center-of-mass distribution.

Concept
Split compute: the ESP32 handles deterministic loops (sensing, control, motor PWM) under FreeRTOS, while a Raspberry Pi runs SLAM and obstacle avoidance. The design emphasizes easy tuning and log-driven iteration.

Design Process
- Developed the quadrotor chassis in Fusion 360, optimizing for center of gravity and minimizing 3D-printed weight.
- Designed a custom controller PCB (power management, IMU, GPS, barometer, motor drivers) in KiCad.
- Implemented multi-loop PID for attitude/altitude; prioritized the IMU pipeline with task affinities on FreeRTOS.
- Integrated stereo depth & detection on the Pi; fused estimates for navigation.
- Bench-tested PWM transients and added flyback/EMI mitigation; iterated with black-box logs.

Specifications
MCU: ESP32 running FreeRTOS
Companion: Raspberry Pi
Sensors: IMU (MPU-6500), GPS (GY-NEO6MV2), barometer (BMP 280)
Comms: I2C/SPI/UART, PWM ESCs
Vision: Stereo depth + detection
Planning: A*/RRT (roadmap)

Results
Hardware assembled and validated; closed-loop stability achieved on the bench. Vision stack integrated; flight-envelope characterization in progress.
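The multi-loop attitude control mentioned above can be sketched as a cascade: an outer loop turns angle error into a desired angular rate, and an inner loop turns rate error into a motor command. This is an illustrative Python model of that structure, not the actual firmware (which runs in C under FreeRTOS on the ESP32); all gains and limits here are placeholder values.

```python
class PID:
    """Basic PID with output clamping."""
    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.out_limit, min(self.out_limit, out))

# Outer loop: angle error (deg) -> rate setpoint (deg/s).
# Inner loop: rate error (deg/s) -> normalized torque command.
angle_pid = PID(kp=4.0, ki=0.0, kd=0.0, out_limit=250.0)
rate_pid = PID(kp=0.05, ki=0.02, kd=0.001, out_limit=1.0)

def roll_control_step(target_angle, imu_angle, imu_rate, dt):
    """One control tick for the roll axis of the cascade."""
    rate_setpoint = angle_pid.update(target_angle, imu_angle, dt)
    return rate_pid.update(rate_setpoint, imu_rate, dt)
```

In the real firmware each loop would run at its own fixed rate from a FreeRTOS task pinned to a core; here a single call stands in for one tick of both loops.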
Self-navigation is currently being implemented using A* and Pathweaver.

V2 Flight Controller
Beginning work on a second revision that follows the OpenFC architecture for improved modularity and tuning.

GitHub · Short pre-flight demo
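For reference on the in-progress planning work, this is a minimal grid-based A* in Python: a textbook 4-connected search with a Manhattan heuristic, not the project's actual planner or its Pathweaver integration.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid; grid[r][c] == 1 means blocked.
    Returns the path as a list of (row, col) cells, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan-distance heuristic (admissible on a 4-grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), 0, start, None)]  # (f, g, node, parent)
    came_from, g_score = {}, {start: 0}
    while open_set:
        _, g, node, parent = heapq.heappop(open_set)
        if node in came_from:  # already expanded with a better cost
            continue
        came_from[node] = parent
        if node == goal:  # reconstruct path by walking parents back
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_score.get((nr, nc), float("inf")):
                    g_score[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), node))
    return None
```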
Driving test setup

Depth pipeline using StereoSGBM/StereoBM with calibration & rectification
Role: Solo developer · Timeline: May–August 2025
OpenCV · Python · Calibration · 3D Reconstruction · Real-time

Overview
A real-time stereo depth pipeline built for robotics perception. It produces stable disparity maps on commodity cameras with a full calibration → rectification → disparity → post-filtering chain.

Concept
Engineer a reproducible CV stack that trades a bit of peak accuracy for consistent, low-latency results appropriate for embedded navigation.

Design Process
- Implemented intrinsic/extrinsic calibration and lens-distortion correction.
- Performed epipolar rectification and StereoSGBM disparity with tunable parameters.
- Benchmarked against Middlebury; prepared Kalman temporal-smoothing hooks.

Specifications
Language: Python + OpenCV
Modes: SGBM / BM
Artifacts: Rectified pairs, disparity maps
Outputs: Depth estimates & point clouds

Results
Consistent depth maps validated on public datasets and live feeds. Ready for integration into on-robot navigation nodes. Rough proof of concept, using cheap Amazon webcams mounted together in a 3D print.

Pitfalls & Challenges
Undocumented cameras: The Amazon listing lacked sensor/lens details, so I disassembled the webcams to identify the sensor and track down the manufacturer's datasheet. I used those specs (sensor size, focal characteristics, etc.) to sanity-check and improve the intrinsic calibration.
Mechanical misalignment from the mount: A 3D-printed stereo rig was difficult to keep consistent due to printer tolerances and warping. I iterated by adding small shims (tape) to adjust pitch/yaw in ~0.5–1° steps until the epipolar lines were closely aligned.
Software rectification trade-offs: Because the cameras weren't perfectly aligned, I relied heavily on software rectification to enforce horizontal alignment.
The large black borders in the rectified frames are the cost of the cropping/warping needed to keep the epipolar geometry correct and the disparity stable.
Baseline & FOV constraints: With cheap webcams, small baseline changes and lens variation noticeably impacted depth noise. I had to tune the baseline spacing and SGBM parameters to balance depth range against flicker and speckle artifacts.

Source on GitHub · Live demo · Processing video
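To illustrate what the block-matching stage of the pipeline computes, here is a minimal sum-of-absolute-differences matcher in plain NumPy. It is a toy stand-in for OpenCV's `cv2.StereoBM`/`cv2.StereoSGBM` (no prefiltering, uniqueness checks, or sub-pixel refinement), assuming already-rectified grayscale inputs so that matches lie on the same scanline.

```python
import numpy as np

def sad_disparity(left, right, max_disp=16, block=5):
    """Brute-force SAD block matching on rectified grayscale images.
    For each left-image block, searches max_disp positions along the same
    row in the right image and keeps the lowest-cost shift.
    Returns an integer disparity map (0 where no match was evaluated)."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.int32)
            best_cost, best_d = None, 0
            for d in range(max_disp):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(np.int32)
                cost = np.abs(patch - cand).sum()
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

The real pipeline uses OpenCV's optimized matchers; this version exists only to make the cost-volume idea concrete, and it makes clear why poor rectification hurts: the search is strictly horizontal, so any vertical misalignment breaks the matching assumption.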
Ultrasonic-guided parking with real-time distance feedback
Role: Embedded developer
Arduino · Ultrasonic Sensors · Embedded

Overview
A garage aid that measures bumper-to-wall distance via ultrasonic sensing and provides intuitive feedback cues for consistent, safe parking.

Concept
Low-cost, reliable sensing with debounced reads and basic smoothing for a stable display; no network or app required.

Design Process
- Implemented a sensor polling loop with outlier rejection and moving-average smoothing.
- Designed simple visual-feedback logic and thresholds for driver cues.
- Iterated on mounting geometry to reduce spurious reflections.

Specifications
MCU: Arduino
Sensor: Ultrasonic rangefinder
Loop: Debounced + smoothed readings
UI: LED/indicator thresholds

Results
More consistent parking stops; reduced bumper contact in tight spaces.
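The outlier rejection plus moving-average smoothing from the design process can be sketched as follows. This is a Python illustration of the scheme, not the device firmware (which runs as Arduino C++); the window size and jump threshold are assumed values chosen for the example.

```python
from collections import deque

class DistanceFilter:
    """Rejects single-sample outliers, then applies a moving average."""
    def __init__(self, window=5, max_jump_cm=50.0):
        self.window = deque(maxlen=window)
        self.max_jump_cm = max_jump_cm
        self.last_good = None

    def update(self, raw_cm):
        # Outlier rejection: ignore a reading that jumps implausibly far
        # from the last accepted value (echo glitches, cross-talk).
        if self.last_good is not None and abs(raw_cm - self.last_good) > self.max_jump_cm:
            return self.smoothed()
        self.last_good = raw_cm
        self.window.append(raw_cm)
        return self.smoothed()

    def smoothed(self):
        """Moving average over the accepted readings; None until the first one."""
        return sum(self.window) / len(self.window) if self.window else None
```

A single glitched echo (say a 500 cm spike while parked at ~100 cm) is dropped instead of flashing the display, while genuine slow approach changes pass through and are averaged.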