Inspiration
Real-world autonomous robotics challenges, such as self-driving cars and search-and-rescue robots, inspire this project. The goal is to develop a robot that intelligently navigates a maze while reacting to its environment using sensor-based decision-making. The challenge of optimizing speed and accuracy while overcoming sensor limitations makes this project both engaging and educational.
What it does
The robot currently performs three different functions, depending on which program is loaded.
1. Color Detection and Flag Placement Algorithm: Pathfinding and Color-Based Navigation
- The robot starts facing away from the center of six concentric rings, each marked with an RGB color.
- The robot moves toward the center by detecting the color of the current ring and ensuring it follows a non-repeating path.
- Once the robot reaches the first ring, it sweeps left until it sees the previous ring's color, then sweeps back right until it crosses from the new color to the previous one again. It measures the time between these two transitions and halves it to estimate the direction of the center.
- The color sensor (TCS3200) reads the RGB values, and the robot adjusts its direction accordingly.
- The robot drops the flag at the target zone once the center is reached.
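The sweep-and-halve step above boils down to a timing calculation. The sketch below is illustrative rather than the project's actual firmware; the function name `centerSweepTimeMs` and the millisecond timestamps are assumptions for the example.

```cpp
#include <cassert>

// Sketch of the center-finding step: the robot records the time at which it
// first saw the previous ring color while sweeping left (leftEdgeMs) and the
// time at which it crossed back from the new color to the old one while
// sweeping right (rightEdgeMs). Turning back for half that interval points
// the robot roughly toward the ring center.
unsigned long centerSweepTimeMs(unsigned long leftEdgeMs, unsigned long rightEdgeMs) {
    return (rightEdgeMs - leftEdgeMs) / 2;  // halve the interval between the two transitions
}
```

On the robot, the return value would be used as the duration of a timed turn back toward the old color.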
2. Maze Navigation Algorithm: Wall Detection & Color-Guided Decision Making
- The robot moves forward by default unless it detects a wall using the ultrasonic sensor (HC-SR04).
- When a wall is detected, the robot reads the color of the tile underneath using the TCS3200 color sensor.
- The robot follows a predefined movement rule based on that color: Black → continue moving forward. Blue → turn left. Green → turn right. Red → perform a U-turn.
- The goal is to complete the maze quickly while avoiding unnecessary movements.
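The movement rules above amount to a small decision table. A minimal sketch (the enum and function names are hypothetical, not taken from the project's code):

```cpp
#include <cassert>

enum Tile   { BLACK, BLUE, GREEN, RED };
enum Action { FORWARD, TURN_LEFT, TURN_RIGHT, U_TURN };

// Maps the current situation to a movement command:
// no wall ahead -> keep driving; otherwise the tile color decides the turn.
Action decide(bool wallAhead, Tile tile) {
    if (!wallAhead) return FORWARD;
    switch (tile) {
        case BLUE:  return TURN_LEFT;
        case GREEN: return TURN_RIGHT;
        case RED:   return U_TURN;
        default:    return FORWARD;  // BLACK: continue moving forward
    }
}
```

Keeping the rules in one pure function like this makes the color-to-action mapping easy to change between challenges.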
3. Color Pattern Detection Algorithm: Sequential Color Recognition & Memory-Based Pathfinding
- The maze contains randomly placed colored tiles, giving the robot a specific color sequence to follow.
- The sequence the robot must detect is: Red → Green → Blue → Green (new location) → Blue (new location).
- When the robot detects the next correct color in the sequence, it blinks an LED to confirm recognition.
- To meet the challenge requirements, the robot ensures that duplicate colors in the sequence are detected at different locations.
- The robot navigates around obstacles using the ultrasonic sensor while following the correct color order.
- The goal is to complete the sequence correctly without missing colors or reusing the same location.
How we built it
Hardware Components:
- L298N motor driver
- Ultrasonic sensor (HC-SR04)
- TCS3200 color sensor
- Arduino Uno
- Chassis with motors
Software & Programming: The robot is programmed using C++ in Arduino. The code follows a modular approach:
- Default forward movement.
- Wall detection using an ultrasonic sensor.
- Color-based decision-making for navigation.
- Motor control functions for turns, movement, and stopping.

The system was fine-tuned through multiple trial-and-error adjustments to improve speed and accuracy.
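The wall-detection module converts the HC-SR04 echo pulse width into a distance. A minimal sketch using the standard speed-of-sound conversion; the 15 cm threshold here is an illustrative default, not the project's tuned value.

```cpp
#include <cassert>
#include <cmath>

// HC-SR04: the echo pulse width in microseconds is the round-trip time of
// sound, so distance_cm = pulse_us * 0.0343 / 2 (sound travels ~0.0343 cm/us).
float echoToCm(unsigned long pulseUs) {
    return pulseUs * 0.0343f / 2.0f;
}

// A wall is "detected" when the measured distance drops below a tuned threshold.
bool wallAhead(unsigned long pulseUs, float thresholdCm = 15.0f) {
    return echoToCm(pulseUs) < thresholdCm;
}
```

On the Arduino, `pulseUs` would come from `pulseIn()` on the sensor's echo pin.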
Challenges we ran into
Sensor Accuracy & Calibration The ultrasonic sensor readings fluctuated in some cases, requiring better distance threshold tuning. The TCS3200 color sensor had lighting-dependent variations, so we had to fine-tune the color recognition algorithm.
Smooth & Accurate Turning Since the robot does not use encoders, timing-based turning required multiple trials to ensure consistent 90-degree turns.
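Without encoders, turn angle is controlled purely by how long the motors run. A sketch of that timing calculation, assuming the full-spin time has been measured through the trial runs described above (the numbers in the test are illustrative):

```cpp
#include <cassert>

// If a full 360-degree spin at the tuned motor speed takes fullSpinMs,
// an arbitrary angle scales linearly: a 90-degree turn runs for a quarter of it.
unsigned long turnDurationMs(float degrees, unsigned long fullSpinMs) {
    return (unsigned long)(fullSpinMs * degrees / 360.0f);
}
```

The linear scaling only holds if battery voltage and surface friction stay roughly constant, which is why repeated trials were needed to keep 90-degree turns consistent.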
Optimizing Decision-Making Speed The robot initially had delays between wall detection and color recognition, which caused slower navigation. Optimized sensor polling improved real-time decision-making.
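Optimized sensor polling of this kind is typically done with a non-blocking millis()-style timer rather than blocking delays. A minimal sketch of such a poller; the `Poller` type is illustrative, and on the Arduino `nowMs` would come from `millis()`.

```cpp
#include <cassert>

// Fires at most once per interval without blocking the main loop, so wall
// detection and color reads can be interleaved instead of waiting on delay().
struct Poller {
    unsigned long intervalMs;
    unsigned long lastMs = 0;

    bool due(unsigned long nowMs) {
        if (nowMs - lastMs >= intervalMs) {
            lastMs = nowMs;
            return true;
        }
        return false;
    }
};
```

Each sensor gets its own `Poller`, and the main loop checks `due()` on every pass, so a slow color read no longer stalls wall detection.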
Accomplishments that we're proud of
- **Efficient Color Recognition**: After calibration, the robot consistently detects colors in various lighting conditions.
- **Optimized Real-Time Decision-Making**: The robot moves smoothly and efficiently without unnecessary stops.
- **Structured & Modular Code**: The well-organized code makes it easy to adjust or expand in future challenges.
What we learned
- **Sensor calibration is critical**: the performance of ultrasonic and color sensors depends heavily on their environment.
- **Efficient code structure improves performance**: optimized logic reduces delays and improves response times.
- **Real-world testing is essential**: theoretical programming requires real-world adjustments for better accuracy.
- **Autonomous navigation is complex**: real-time perception and decision-making require precise tuning and optimized algorithms.
What's next for Maze Robot - closed challenge
Future possibilities include:
- **Turn precision**: add wheel encoders or an IMU for accurate angular control.
- **Speed optimization**: implement adaptive speed control for faster movement while maintaining accuracy.
- **Obstacle avoidance**: use multiple ultrasonic sensors to handle more complex maze environments.
Built With
- arduino
- light-sensor
- ultrasonic-sensor