Projects

Go1 DLIOM-Based Autonomous Navigation

Dec 2023 - Aug 2024

Configured a Unitree Go1 with a 3D LiDAR and onboard Direct LiDAR-Inertial Odometry and Mapping (DLIOM) to enable accurate state estimation and autonomous navigation using ROS2 and Nav2 in real-world environments.
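
As a rough illustration of the navigation side, the sketch below sends a single Nav2 goal from Python using nav2_simple_commander, assuming Nav2 is already running and DLIOM is providing the localization transform; the frame name and goal coordinates are placeholders, not values from the project.

```python
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator, TaskResult


def main():
    rclpy.init()
    navigator = BasicNavigator()
    # With DLIOM providing localization instead of AMCL, the localizer argument
    # of waitUntilNav2Active may need to be adjusted for the actual setup.
    navigator.waitUntilNav2Active()

    # Goal pose in the map frame built by the LiDAR-inertial mapping stack.
    goal = PoseStamped()
    goal.header.frame_id = 'map'
    goal.header.stamp = navigator.get_clock().now().to_msg()
    goal.pose.position.x = 2.0   # placeholder coordinates
    goal.pose.position.y = 1.0
    goal.pose.orientation.w = 1.0

    navigator.goToPose(goal)
    while not navigator.isTaskComplete():
        pass  # isTaskComplete() spins the node internally

    if navigator.getResult() == TaskResult.SUCCEEDED:
        print('Goal reached')
    navigator.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```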

Image to Robot Drawing Trajectory Converter

iRobot Intern Hackathon

Jul 2023 - Aug 2023

Created a program that converted input images (uploaded directly by users or captured with a robot’s fisheye camera) to trajectories that the robot would follow to perform its drawing task.
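
A minimal sketch of the image-to-trajectory idea, assuming OpenCV and NumPy are available: edges are extracted from the input image and each contour becomes one pen-down stroke. The scale factor and thresholds below are illustrative values, not the hackathon project's actual parameters.

```python
import cv2
import numpy as np


def image_to_waypoints(image_path, scale_m_per_px=0.001, min_contour_len=20):
    """Convert image edges into a list of (x, y) waypoint paths for a drawing robot."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)
    edges = cv2.Canny(gray, 50, 150)                       # edge map of the input image
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)

    paths = []
    for c in contours:
        if len(c) < min_contour_len:                       # skip tiny, noisy contours
            continue
        pts = c.reshape(-1, 2).astype(np.float64) * scale_m_per_px
        paths.append(pts)                                  # each path is one pen-down stroke
    return paths


if __name__ == '__main__':
    strokes = image_to_waypoints('input.png')
    print(f'{len(strokes)} strokes, {sum(len(s) for s in strokes)} waypoints total')
```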

UUV (Unmanned Underwater Vehicle) Localization

May 2023 - Apr 2024

Designed an underwater localization system for UUVs that fuses IMU-based dead-reckoning pose estimation with acoustic beacon triangulation, accounting for the constraints of underwater communication.
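
For the beacon side, the snippet below shows one common way to turn acoustic range measurements into a position fix via linear least squares; the beacon layout and ranges are made-up test values, and the project's actual formulation may differ.

```python
import numpy as np


def trilaterate(beacons, ranges):
    """Least-squares position fix from known beacon positions and measured ranges.

    Linearizes ||p - b_i||^2 = r_i^2 against the last beacon as a reference,
    giving a linear system A p = b.
    """
    beacons = np.asarray(beacons, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    ref_b = beacons[-1]

    A = 2.0 * (beacons[:-1] - ref_b)
    b = (ranges[-1] ** 2 - ranges[:-1] ** 2
         + np.sum(beacons[:-1] ** 2, axis=1) - np.sum(ref_b ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos


if __name__ == '__main__':
    beacons = [[0, 0, 0], [10, 0, 0], [0, 10, 0], [10, 10, -5]]
    true_pos = np.array([4.0, 6.0, -2.0])
    ranges = [np.linalg.norm(true_pos - np.array(b)) for b in beacons]
    print(trilaterate(beacons, ranges))   # should be close to true_pos
```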

School Tunnel Mapping

Mar 2023 - May 2023

Created a map of the school tunnel network by running multiple SLAM algorithms on sensor data collected with a LiDAR, cameras, and an IMU.
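
As context for the map-building step, the sketch below projects a planar LiDAR scan into the map frame given a SLAM pose estimate, assuming NumPy; it is a simplified illustration, not the pipeline actually used for the tunnel map.

```python
import numpy as np


def scan_to_map_points(ranges, angles, pose_xytheta):
    """Project a planar LiDAR scan into the map frame using an estimated pose."""
    x, y, theta = pose_xytheta
    ranges = np.asarray(ranges, dtype=float)
    angles = np.asarray(angles, dtype=float)
    px = x + ranges * np.cos(theta + angles)
    py = y + ranges * np.sin(theta + angles)
    return np.stack([px, py], axis=1)   # N x 2 array of map-frame points


if __name__ == '__main__':
    angles = np.linspace(-np.pi / 2, np.pi / 2, 5)
    ranges = np.full_like(angles, 2.0)          # a wall 2 m away, for illustration
    print(scan_to_map_points(ranges, angles, (1.0, 0.0, np.pi / 4)))
```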

Autonomous Reconnaissance Robot

Nov 2022 - Dec 2022

Utilized Gmapping SLAM, frontier-based exploration, and tag detection on a TurtleBot3 to autonomously complete a reconnaissance mission in a closed, unknown environment.
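
The frontier-detection step can be sketched as follows on a ROS-style occupancy grid (-1 unknown, 0 free, 100 occupied); this is an illustrative reimplementation, not the package that ran on the TurtleBot3.

```python
import numpy as np


def find_frontiers(grid):
    """Return (row, col) indices of free cells bordering unknown space."""
    free = (grid == 0)
    unknown = (grid == -1)

    # A free cell is a frontier if any 4-connected neighbour is unknown.
    neighbour_unknown = np.zeros_like(unknown)
    neighbour_unknown[1:, :] |= unknown[:-1, :]
    neighbour_unknown[:-1, :] |= unknown[1:, :]
    neighbour_unknown[:, 1:] |= unknown[:, :-1]
    neighbour_unknown[:, :-1] |= unknown[:, 1:]

    return np.argwhere(free & neighbour_unknown)


if __name__ == '__main__':
    grid = np.full((5, 5), -1, dtype=int)
    grid[1:4, 1:4] = 0           # a small explored free patch
    grid[2, 2] = 100             # one obstacle
    print(find_frontiers(grid))  # cells on the boundary of the explored patch
```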

Wearable Fitness Motion Tracker

ROK Armed Forces Hackathon

Aug 2021 - Oct 2021

Designed a wearable fitness tracker using an IMU sensor and complementary filter to estimate body orientation and motion in real time. Implemented posture classification and repetition counting algorithms for exercises (e.g., squats), using calibrated angle and angular velocity thresholds to detect form and timing accuracy.
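
A minimal sketch of the complementary-filter and repetition-counting ideas, assuming fixed-rate IMU samples; the blend factor and squat thresholds are illustrative, not the tracker's tuned values.

```python
import math


class ComplementaryFilter:
    def __init__(self, alpha=0.98):
        self.alpha = alpha        # weight on the gyro-integrated estimate
        self.angle = 0.0          # estimated tilt angle in degrees

    def update(self, gyro_rate_dps, accel_x, accel_z, dt):
        # Gyro: integrate angular rate (smooth, but drifts over time).
        gyro_angle = self.angle + gyro_rate_dps * dt
        # Accelerometer: absolute but noisy tilt from the gravity direction.
        accel_angle = math.degrees(math.atan2(accel_x, accel_z))
        # Blend the two sources.
        self.angle = self.alpha * gyro_angle + (1.0 - self.alpha) * accel_angle
        return self.angle


def count_squats(angles, down_thresh=70.0, up_thresh=20.0):
    """Count repetitions with hysteresis: a rep completes on a down->up transition."""
    reps, going_down = 0, False
    for a in angles:
        if a > down_thresh:
            going_down = True
        elif going_down and a < up_thresh:
            reps += 1
            going_down = False
    return reps


if __name__ == '__main__':
    # Fake angle trace: stand (10 deg), squat down (80 deg), stand back up -> 1 rep.
    print(count_squats([10, 40, 80, 75, 40, 10]))
```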

Flower-Care Monitoring System

ROK Army Startup Competition

Jul 2021 - Sep 2021

Designed a flower-care monitoring system using moisture, light, and temperature sensors, transmitting the readings via Wi-Fi to a Google Cloud server for further analysis.
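
A hedged sketch of the upload loop, assuming the sensor drivers already return numeric readings; the endpoint URL, JSON fields, and reporting interval are placeholders rather than the project's actual Google Cloud setup.

```python
import time

import requests

ENDPOINT = 'https://example-project.appspot.com/api/readings'  # placeholder URL


def read_sensors():
    # Placeholder for the real moisture / light / temperature drivers.
    return {'moisture': 0.42, 'light_lux': 310.0, 'temp_c': 23.5}


def main():
    while True:
        payload = {'timestamp': time.time(), **read_sensors()}
        try:
            requests.post(ENDPOINT, json=payload, timeout=5)  # push over Wi-Fi to the cloud
        except requests.RequestException as exc:
            print(f'upload failed, will retry next cycle: {exc}')
        time.sleep(60)  # one reading per minute


if __name__ == '__main__':
    main()
```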

Wii-mote Controlled Robot Arm

Sep 2019 - Dec 2019

Configured a robotic arm on an FPGA development board using memory-mapped I/O. Programmed control logic in C++ to interpret directional gestures from a Wii-mote’s onboard IMU. Detected up, down, left, and right motions to drive the arm’s servos in real time. Mapped Wii-mote button inputs to control the gripper’s open and close actions. Integrated sensor data processing, embedded control, and wireless communication for gesture-based robotic manipulation.
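
For illustration, the gesture-classification step might look like the Python sketch below (the original logic ran in C++ on the FPGA board); the threshold and axis conventions are placeholder assumptions, not the project's actual values.

```python
def classify_gesture(ax, ay, threshold=0.6):
    """Map the dominant accelerometer axis/sign to a direction command, or None."""
    if abs(ax) < threshold and abs(ay) < threshold:
        return None                      # no deliberate motion
    if abs(ax) >= abs(ay):
        return 'RIGHT' if ax > 0 else 'LEFT'
    return 'UP' if ay > 0 else 'DOWN'


if __name__ == '__main__':
    print(classify_gesture(0.9, 0.1))    # -> 'RIGHT'
    print(classify_gesture(0.1, -0.8))   # -> 'DOWN'
```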