Inspiration

The inspiration behind JARVIS was to create an AI tool that could assist blind individuals in their daily tasks, offering accessibility through voice commands.

What it does

JARVIS offers a range of voice-controlled features designed to assist blind users, including object detection, music playback through Spotify, and spoken information such as the current date.
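A tool like this typically routes each transcribed voice command to a handler by matching keywords. Below is a minimal sketch of that routing, assuming a keyword-based dispatcher; the handler names and reply strings are illustrative, not JARVIS's actual code, and the object-detection and Spotify handlers are stubbed out.

```python
import datetime

def get_date(_command):
    """Return today's date as a spoken-friendly string."""
    return datetime.date.today().strftime("It is %A, %B %d, %Y")

def detect_objects(_command):
    # Placeholder for the OpenCV object-detection pipeline.
    return "Starting object detection"

def play_music(_command):
    # Placeholder for the Spotify playback call.
    return "Playing music on Spotify"

# Map trigger keywords to handlers; the first keyword found in the
# transcribed command wins.
COMMANDS = {
    "date": get_date,
    "detect": detect_objects,
    "play": play_music,
}

def dispatch(command: str) -> str:
    """Route a transcribed voice command to the matching handler."""
    text = command.lower()
    for keyword, handler in COMMANDS.items():
        if keyword in text:
            return handler(text)
    return "Sorry, I did not understand that"
```

The transcription itself would come from a speech-recognition library; the dispatcher only cares about the resulting text, which keeps it easy to test without a microphone.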

How we built it

We built JARVIS using a combination of technologies, including speech recognition, OpenCV for object detection, and the Spotify API for music playback. The frontend was developed with HTML, CSS, and JavaScript, while the backend is powered by Flask, which handles requests from the frontend.
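With Flask in the middle, the frontend can POST the transcribed command and receive the reply as JSON. This is a minimal sketch of such an endpoint, assuming a single `/command` route; the route name, payload shape, and reply text are assumptions for illustration, not the project's actual API.

```python
import datetime

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/command", methods=["POST"])
def handle_command():
    """Receive the transcribed voice command from the frontend and
    return JARVIS's spoken response as JSON."""
    payload = request.get_json(silent=True) or {}
    command = payload.get("command", "").lower()
    if "date" in command:
        reply = datetime.date.today().strftime("Today is %B %d, %Y")
    else:
        reply = "Command not recognized"
    return jsonify({"reply": reply})
```

Keeping the response as plain text in JSON lets the JavaScript frontend hand it straight to a text-to-speech engine.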

Challenges we ran into

One challenge we faced was integrating the Spotify API for music playback, as it required proper authentication and handling of user permissions. Additionally, optimizing the object detection feature for real-time performance presented technical hurdles.
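A common way to tackle the real-time hurdle is to run the expensive detector only every Nth frame and reuse the cached result in between, so the video loop stays responsive. The sketch below shows that frame-skipping pattern in isolation; the class and its interface are illustrative, not the optimization JARVIS actually shipped, and the detector function is injected so the logic can be tested without a camera.

```python
class ThrottledDetector:
    """Run an expensive detection function only every `interval` frames,
    reusing the cached result for the frames in between."""

    def __init__(self, detect_fn, interval=5):
        self.detect_fn = detect_fn    # e.g. a DNN forward pass on a frame
        self.interval = interval
        self.frame_count = 0
        self.last_result = []

    def process(self, frame):
        """Return detections for this frame, recomputing only when due."""
        if self.frame_count % self.interval == 0:
            self.last_result = self.detect_fn(frame)
        self.frame_count += 1
        return self.last_result
```

Downscaling each frame before detection is a second, complementary trick; both trade a little accuracy and latency on object changes for a much smoother frame rate.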

Accomplishments that we're proud of

We're proud to have developed a functional AI tool that genuinely assists blind users in their daily lives. Creating an intuitive interface and implementing features like object detection and music playback demonstrate our commitment to accessibility.

What we learned

Throughout the development process, we learned valuable lessons in integrating various APIs, optimizing performance for real-time tasks, and designing user-friendly interfaces. Additionally, we gained insights into the challenges faced by blind individuals in accessing technology.

What's next for JARVIS

In the future, we plan to further enhance JARVIS by adding more advanced features, improving its accuracy and responsiveness, and expanding its capabilities to cater to a broader range of user needs. Additionally, we aim to conduct user testing and gather feedback for continuous improvement.

Built With

Flask, OpenCV, HTML, CSS, JavaScript, Spotify API