
AI-Driven Navigation Software

Robotics
AI · Robotics · Computer Vision · Embedded Systems

Summary

Engineered the autonomous control software for a custom robotic platform (NVIDIA Jetson TX1). Built computer vision pipelines in Python on Linux to handle localization and mapping while interfacing with low-level controller logic.

Details

This project was presented as a live demo and unfortunately does not have any accompanying images or videos.

This project focused on the development of an autonomous navigation stack for a custom robotic platform powered by the NVIDIA Jetson TX1. The primary goal was to engineer a system capable of interpreting environmental data to perform self-driving tasks, including path planning, localization, and real-time obstacle avoidance.

Computer Vision Pipeline

Using Python, I developed a vision pipeline that allowed the robot to perceive its environment through a camera feed.

  • Feature Detection: The system was trained to identify specific color and shape combinations.
  • Spatial Interpretation: By analyzing the perspective and apparent size of detected objects, the robot could estimate its distance and orientation relative to markers; a minimal sketch of this pipeline follows the list.
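
The original detection code is not reproduced here, but the approach can be illustrated with a short OpenCV sketch: threshold the frame in HSV space for the marker color, take the largest contour, and estimate distance with a pinhole-camera model. The color range, marker width, and focal length below are placeholder assumptions, not values from the project.

    import cv2
    import numpy as np

    # Assumed calibration constants (placeholders, not project values).
    MARKER_WIDTH_M = 0.20     # physical marker width in meters
    FOCAL_LENGTH_PX = 700.0   # camera focal length in pixels

    # Assumed HSV range for the marker color (here, a red-ish hue band).
    HSV_LOW = np.array([0, 120, 70])
    HSV_HIGH = np.array([10, 255, 255])

    def detect_marker(frame_bgr):
        """Return (distance_m, offset_px) for the largest color blob, or None."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, HSV_LOW, HSV_HIGH)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        blob = max(contours, key=cv2.contourArea)  # assume largest blob = marker
        x, y, w, h = cv2.boundingRect(blob)
        # Pinhole model: distance = real width * focal length / pixel width.
        distance_m = MARKER_WIDTH_M * FOCAL_LENGTH_PX / max(w, 1)
        # Horizontal offset of the blob center from the image center (heading cue).
        offset_px = (x + w / 2) - frame_bgr.shape[1] / 2
        return distance_m, offset_px

A shape test (for example, counting vertices with cv2.approxPolyDP on the contour) could be layered on top of this to match the specific color and shape combinations the system looked for.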

Multi-Sensor Fusion

To ensure reliability in varied environments, I implemented a sensor fusion layer that combined inputs from several hardware components:

  • Distance Perception: Integrated Sonar (Ultrasonic) sensors for object detection, ensuring the robot could identify obstacles that might be difficult for the camera to perceive in low-light or high-glare conditions.
  • Inertial Measurement: Utilized an Accelerometer and Gyroscope to track the robot’s physical state, enabling precise turning angles and acceleration control; a complementary-filter sketch of this fusion appears after the list.
  • Kinematics: Directly controlled the motors to translate navigation logic into physical movement, implementing smooth velocity ramping and directional correction (see the ramping sketch below).
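
The fusion code and its tuning are not reproduced here, but the core idea for the inertial sensors can be sketched as a complementary filter: trust the gyroscope over short intervals (low noise, but it drifts) and let the accelerometer's gravity reading correct that drift over long intervals. The blend constant below is an illustrative assumption, not the project's tuned value.

    import math

    class ComplementaryFilter:
        """Fuse a gyro rate and an accelerometer tilt into one angle estimate."""

        def __init__(self, alpha=0.98):
            # alpha near 1.0 favors the short-term gyro signal; the remainder
            # lets the accelerometer slowly correct accumulated drift.
            self.alpha = alpha
            self.angle = 0.0  # estimated angle in degrees

        def update(self, gyro_rate_dps, accel_y, accel_z, dt):
            gyro_angle = self.angle + gyro_rate_dps * dt               # integrate rate
            accel_angle = math.degrees(math.atan2(accel_y, accel_z))  # gravity tilt
            self.angle = self.alpha * gyro_angle + (1 - self.alpha) * accel_angle
            return self.angle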
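The smooth velocity ramping on the motor side can likewise be sketched as a slew-rate limiter: on each control tick the commanded speed moves toward the target by at most a fixed step, avoiding abrupt torque changes. The step size here is a placeholder, not the project's value.

    def ramp_velocity(current, target, max_step=0.05):
        """Move the motor command toward `target` by at most `max_step` per tick."""
        delta = target - current
        if abs(delta) <= max_step:
            return target
        return current + max_step if delta > 0 else current - max_step

Called once per control tick, this produces a linear ramp up to the requested speed instead of an instantaneous jump.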

Localization and Obstacle Avoidance

  • Localization: The robot estimated its position in 3D space by reconciling its internal movement tracking with the physical landmarks it detected.
  • Reactive Navigation: I developed a pathing algorithm that prioritized obstacle avoidance. When the Sonar or Vision systems detected an obstruction, the robot would dynamically recalculate its heading to continue along the designated path without colliding; a sketch of this reactive loop follows.
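
A stripped-down version of that reactive loop might look like the following, reusing the hypothetical detect_marker sketch from the vision section (wrapped here as a no-argument callable that grabs a camera frame and runs detection). The read_sonar_m and set_drive helpers are stand-ins for the project's actual hardware interfaces, and the 0.4 m stop threshold and 20 Hz tick rate are illustrative assumptions.

    import time

    SAFE_DISTANCE_M = 0.4   # assumed sonar stop threshold
    CRUISE_SPEED = 0.5      # assumed nominal wheel speed
    TURN_SPEED = 0.3        # assumed in-place turn speed

    def navigation_loop(read_sonar_m, detect_marker, set_drive):
        """Steer toward the vision target, yielding to sonar-detected obstacles."""
        while True:
            if read_sonar_m() < SAFE_DISTANCE_M:
                # Obstacle too close: rotate in place until the path clears.
                set_drive(left=-TURN_SPEED, right=TURN_SPEED)
            else:
                marker = detect_marker()
                if marker is None:
                    # No target in view: creep forward while searching.
                    set_drive(left=CRUISE_SPEED / 2, right=CRUISE_SPEED / 2)
                else:
                    _, offset_px = marker
                    # Proportional steering: bias wheel speeds by pixel offset.
                    turn = max(-0.2, min(0.2, offset_px / 1000.0))
                    set_drive(left=CRUISE_SPEED + turn, right=CRUISE_SPEED - turn)
            time.sleep(0.05)  # assumed 20 Hz control tick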

Demonstration

  • Autonomous Decision Making: Successfully demonstrated a "Self-Driving" loop where the robot performed localization, detection, and movement without human intervention.
  • Remote Control: Integrated an MVP mobile application for remotely controlling the robot's movements, allowing real-time interaction with and testing of the autonomous system.