Inspiration
My team and I attended a five-day internship with the National Institute of Speech and Hearing (NISH), where we spoke with people who are physically and mentally challenged. We were all moved by the experience and wanted to build something that could make their lives easier. That was when we came across a visually impaired speaker who described the challenges of handling money in public, and how many times he had been cheated. He said the existing currency-recognition apps had such complicated user interfaces that he was better off without them. Asking strangers for help is also risky during the COVID-19 outbreak, not to mention the risk that the "helper" is part of the cheater's scheme. So we decided to build a mobile application that is easy to use and fast, so that visually impaired people can be safer when handling money in public.
What it does
Noteify is a currency-detection app for the visually impaired with a deliberately simple UI, so it can be used quickly and easily. The app opens straight into the camera with a large button at the bottom centre, making it easy for a blind user to capture a photo. The app then processes the image, classifies the currency note with the help of artificial intelligence, and announces the denomination through computer-generated audio. It also announces the running total of the notes shown to the camera, because the speaker had mentioned that counting a bunch of notes is another difficulty. For the time being, the app works only for Indian currency notes.
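To make the announcement step concrete, here is a minimal Python sketch of the totaling-and-announcement logic. The label names, denomination table, and `announcement` helper are illustrative assumptions, not the app's actual code:

```python
# Sketch of the announcement logic: map each recognized label to its
# value and keep a running total for the text-to-speech engine.
# Label strings are hypothetical; the real classifier defines its own.

# Denominations of Indian bank notes, in rupees.
DENOMINATIONS = {
    "10_rupee": 10, "20_rupee": 20, "50_rupee": 50,
    "100_rupee": 100, "200_rupee": 200, "500_rupee": 500,
    "2000_rupee": 2000,
}

def announcement(recognized_labels):
    """Build the sentence handed to the text-to-speech engine."""
    values = [DENOMINATIONS[label] for label in recognized_labels]
    latest = values[-1]          # the note just shown to the camera
    total = sum(values)          # running total across all notes
    return f"{latest} rupees. Total so far: {total} rupees."

print(announcement(["100_rupee", "50_rupee"]))
# → "50 rupees. Total so far: 150 rupees."
```

In the real app this string would be passed to the device's speech synthesizer rather than printed.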
How we built it
The app was developed with Flutter, a cross-platform mobile application development framework from Google. The backend uses a deep-learning model to classify the currency notes.
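As an illustration of the classification step, here is a Python sketch of how a model's raw per-class scores could be turned into a label, with a confidence threshold so uncertain results are not announced. The class names and threshold are assumptions, not the app's actual backend:

```python
import math

# Sketch of the backend classification step: the deep-learning model
# outputs one raw score per note class; softmax converts the scores
# to probabilities, and the result is announced only when the model
# is confident enough.

CLASSES = ["10_rupee", "50_rupee", "100_rupee", "500_rupee"]  # assumed

def softmax(scores):
    """Convert raw scores to probabilities (numerically stable)."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(scores, threshold=0.7):
    """Return the predicted class, or None if the model is unsure."""
    probs = softmax(scores)
    best = max(range(len(probs)), key=probs.__getitem__)
    if probs[best] < threshold:
        return None  # app would ask the user to retake the photo
    return CLASSES[best]

print(classify([0.1, 0.2, 4.0, 0.3]))  # the third class dominates
```

Rejecting low-confidence predictions matters here: announcing a wrong denomination to a blind user is worse than asking for another photo.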
Challenges we ran into
Building the model for the backend was certainly hard: we needed to reach an acceptable accuracy, so the model had to be trained carefully. The next challenge was capturing an image in the app and getting it processed by the model.
Accomplishments that we are proud of
Our teachers were really pleased with our work and put it forward for NISH approval, and we hope the outcome is favourable. We also feel a sense of joy because we believe our app has the potential to make life easier and safer for the visually impaired.
What we learned
We learnt Flutter in depth and how it can be integrated with machine learning. We also realized that a lot more can be done to make the world a better place for the physically challenged.
What's next for Noteify
- We are working to integrate flash control into the application so that it performs more accurately in low light.
- We plan to publish the app on the Play Store once we can afford the publishing fee.
- We are also working on real-time scanning, so the user will not have to capture an image manually.