Inspiration

Alternative mobility modes such as bicycles, skateboards, scooters, powered skates, and hoverboards are much more demanding to use than an automobile. By assisting with intersection safety and rider-relevant information, we hope to decrease automobile use. This project uses sensor-supported AI to distinguish pedestrians from alternative mobility users.

What it does

Using visual recognition, positioning sensors, and dynamic signage, the system optimizes user trips. A trip can also be customized to a person's needs if a profile is associated with it.
Visual impairments such as color blindness and night blindness could be compensated for, and intersections could add audio enhancements for blind pedestrians and cyclists.

How we built it

The project is built around extensible modularity. The first iteration of the sensor was built on an Arduino Mega 2560 with an ultrasonic sensor that can detect objects up to about 13 ft (roughly 4 meters) away. In the demo build, the indicator is green when an object is detected more than 60 inches away, yellow when it is between 12 and 60 inches away, and red when it is closer than a foot. An LCD shows the exact measurement. A second unit without the LCD was built on an Arduino Nano to demonstrate the smaller form factor available.
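A minimal sketch of the demo's threshold logic is below. It assumes an HC-SR04-style ultrasonic sensor, three indicator LEDs, and a 16x2 character LCD driven by the standard LiquidCrystal library; the pin assignments and the specific sensor are illustrative assumptions, not the exact wiring of the hackathon build.

```cpp
#include <LiquidCrystal.h>

// Assumed pin assignments for illustration only.
const int TRIG_PIN   = 9;
const int ECHO_PIN   = 10;
const int GREEN_LED  = 5;
const int YELLOW_LED = 6;
const int RED_LED    = 7;

// 16x2 LCD wired as (RS, E, D4, D5, D6, D7).
LiquidCrystal lcd(12, 11, 4, 3, 2, 8);

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(GREEN_LED, OUTPUT);
  pinMode(YELLOW_LED, OUTPUT);
  pinMode(RED_LED, OUTPUT);
  lcd.begin(16, 2);
}

void loop() {
  // Trigger a 10 us pulse and time the echo.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  long durationUs = pulseIn(ECHO_PIN, HIGH, 30000);  // ~30 ms timeout
  float distanceIn = durationUs / 148.0;              // ~148 us per inch, round trip

  // Green above 60 in, yellow between 12 and 60 in, red below 12 in.
  digitalWrite(GREEN_LED,  distanceIn > 60.0);
  digitalWrite(YELLOW_LED, distanceIn <= 60.0 && distanceIn >= 12.0);
  digitalWrite(RED_LED,    distanceIn < 12.0 && durationUs > 0);

  // Show the exact measurement on the LCD.
  lcd.setCursor(0, 0);
  lcd.print("Dist: ");
  lcd.print(distanceIn, 1);
  lcd.print(" in   ");

  delay(100);
}
```

The Arduino Nano variant would drop the LiquidCrystal section and keep only the LED threshold logic.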

Challenges we ran into

This project integrates a wide spectrum of technologies: sensing, image processing, image recognition, AI training, communication with traffic control systems, aggregation of various emergency and public information systems, signage systems, and unknown unknowns.

Accomplishments that we're proud of

Completed and compiled the Arduino sketches for the green, yellow, and red indicators driven by the ultrasonic sensor.

What we learned

What's next for Alternative Mobility Support System

Built With
