While we were brainstorming global healthcare issues, Vignav thought about his uncle, who is rapidly losing his vision and will soon become legally blind. Having seen him struggle with simple everyday actions we all take for granted, Vignav was worried about the impact of his uncle's failing vision on more critical situations, such as reading prescriptions at the doctor's office. We were concerned that if he were to lose his sight entirely, he would be unable to understand his medicine and prescriptions, further jeopardizing his health.

Our curiosity about Vignav's uncle inspired us to research more deeply into medical treatment issues for the visually impaired. What we found was shocking: 285 million people in the world are visually impaired, and 75% of the world's population needs a prescription at some point in their lives. Even so, today's healthcare largely overlooks the visually impaired: only 1% of doctors offer prescriptions in braille or another format accessible to visually impaired people, and 86% of pharmacies don't accommodate the requests of their visually impaired patients. The scope of the problem is much larger than it seems at first glance, yet little to no attention has been given to solving it. Inability to read prescriptions can have devastating consequences: the patient's basic healthcare is jeopardized, which can lead to injury and even death. With this in mind, and knowing that blind people can readily use smartphones, we decided to try to solve this healthcare problem with technology.

What it does

Our app contains several core features. We built the entire app in Xcode, importing and integrating a number of libraries, such as TesseractOCR and Firebase, into our software.

First, we incorporated a user profile. This page lets users enter basic personal information, such as their name, primary physician, and the condition for which they need prescriptions. This information is saved in a personalized database for each user so their prescriptions can be retrieved efficiently later. The UI consists of TextFields; through Apple's accessibility features (VoiceOver), the user is told where each TextField is and can enter their information, which is then sent to our Firebase database using the Firebase libraries in Xcode. A future goal is smart profiling: using the user's profile to find locations where their required prescriptions are available.
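The profile flow described above can be sketched roughly as follows. This is a minimal illustration, not Tickbird's actual source: the outlet names, the `"users"/"demoUser"` database path, and the field keys are all hypothetical, and it assumes the FirebaseDatabase pod is installed.

```swift
import UIKit
import FirebaseDatabase  // assumes the FirebaseDatabase pod is installed

class ProfileViewController: UIViewController {
    @IBOutlet weak var nameField: UITextField!
    @IBOutlet weak var physicianField: UITextField!
    @IBOutlet weak var conditionField: UITextField!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Labels that VoiceOver reads aloud when the user focuses each field.
        nameField.accessibilityLabel = "Your name"
        physicianField.accessibilityLabel = "Primary physician"
        conditionField.accessibilityLabel = "Condition requiring prescriptions"
    }

    // Called when the user taps Save; writes the profile under a per-user key.
    @IBAction func saveProfile(_ sender: UIButton) {
        let profile: [String: Any] = [
            "name": nameField.text ?? "",
            "physician": physicianField.text ?? "",
            "condition": conditionField.text ?? ""
        ]
        // "users/demoUser" is a placeholder path; the real schema may differ.
        Database.database().reference()
            .child("users").child("demoUser")
            .setValue(profile)
    }
}
```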

The main function of our app is the Scan Prescriptions feature. First, the user enters the name of the prescription. Then, guided by Apple's accessibility features, they take a picture of the prescription. Although it may seem unlikely that visually impaired users could perform these steps, the process is actually simple for them: VoiceOver, a system built into iPhones that visually impaired users are accustomed to, lets the person hover the camera over the prescription until the paper is detected. When this happens, the person is notified verbally and can continue. Finally, after taking the photo, they tap the Save Prescription button, which stores the prescription in their personal database so it is accessible later.

The enter-name control is simply a TextField whose contents we save. To build the capture step, we first imported AVKit and AVFoundation, Apple-provided frameworks, to open the camera. While the camera is open, our app verbally reads out what is in front of the user. Once the prescription paper is in view, the user simply presses "Use Photo". We then feed the captured image through TesseractOCR, a library that converts an image to text. Tesseract uses a machine-learning text-recognition engine (an LSTM-based neural network in recent versions) to detect words in images. Finally, when the user taps Save Prescription, we send the name of the prescription and the recognized text to Firebase: the name serves as the key, and the text is stored under it for later use.
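The OCR-and-save step might look something like the sketch below. It is an illustration under assumptions, not the app's real code: it assumes the TesseractOCRiOS pod (whose `G8Tesseract` class exposes `image`, `recognize()`, and `recognizedText`) plus FirebaseDatabase, and the `"prescriptions"` path is a hypothetical name.

```swift
import UIKit
import TesseractOCR       // assumes the TesseractOCRiOS pod
import FirebaseDatabase   // assumes the FirebaseDatabase pod

// Runs Tesseract on the captured photo and returns the recognized text.
func recognizePrescription(in image: UIImage) -> String? {
    guard let tesseract = G8Tesseract(language: "eng") else { return nil }
    tesseract.image = image
    tesseract.recognize()
    return tesseract.recognizedText
}

// Stores the recognized text in Firebase, keyed by the prescription name.
func savePrescription(named name: String, image: UIImage) {
    guard let text = recognizePrescription(in: image) else { return }
    Database.database().reference()
        .child("prescriptions")  // placeholder path; real schema may differ
        .child(name)
        .setValue(text)
}
```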

In the third section of our app, Past Prescriptions, we display all previously stored prescriptions from the personalized database in a table view. Alongside each saved prescription is a button that, when tapped, reads the prescription aloud; the accessibility features help the user locate the button they are looking for. We implemented this with a UITableViewController, creating a cell for each piece of data in Firebase: we import the key and the text, and each cell's button plays the text, which is read aloud using AVSpeechSynthesizer.
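A minimal sketch of that table view, with a row tap standing in for the per-cell play button. The `prescriptions` array, the `"PrescriptionCell"` identifier, and the tap-to-speak shortcut are all assumptions for illustration; only the UITableViewController-plus-AVSpeechSynthesizer approach comes from the description above.

```swift
import UIKit
import AVFoundation

class PastPrescriptionsViewController: UITableViewController {
    // (name, text) pairs loaded from Firebase; populated elsewhere.
    var prescriptions: [(name: String, text: String)] = []
    let synthesizer = AVSpeechSynthesizer()

    override func tableView(_ tableView: UITableView,
                            numberOfRowsInSection section: Int) -> Int {
        return prescriptions.count
    }

    override func tableView(_ tableView: UITableView,
                            cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "PrescriptionCell",
                                                 for: indexPath)
        cell.textLabel?.text = prescriptions[indexPath.row].name
        return cell
    }

    // Selecting a row reads the stored prescription text aloud.
    override func tableView(_ tableView: UITableView,
                            didSelectRowAt indexPath: IndexPath) {
        let utterance = AVSpeechUtterance(string: prescriptions[indexPath.row].text)
        synthesizer.speak(utterance)
    }
}
```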

How we built it

Throughout the entire app, we incorporate two main features. First, each view has a speech synthesizer that reads out what that specific view does. Second, with Apple's accessibility features turned on, tapping a button announces which button is being pressed, so the user can confirm before activating it.
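The first feature, a per-view spoken introduction, can be sketched as below. The class name and the announcement string are hypothetical; the pattern of speaking in `viewDidAppear` is one straightforward way to implement what is described.

```swift
import UIKit
import AVFoundation

class ScanViewController: UIViewController {
    let synthesizer = AVSpeechSynthesizer()

    // Announce the purpose of this screen as soon as it appears.
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let intro = AVSpeechUtterance(
            string: "Scan Prescriptions. Point your camera at the prescription, then tap Use Photo."
        )
        synthesizer.speak(intro)
    }
}
```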

Challenges we ran into

One challenge we faced was finding an OCR library that could convert images into text; fortunately, we found Tesseract, an OCR engine maintained by Google. We faced another hurdle in figuring out how UITableViewControllers work; this, along with importing data from the Firebase database, was the part of our code that took the longest to write. Still, we overcame these obstacles with resources such as YouTube tutorials.

What's next for Tickbird

Our future plans are to add the smart profiling feature as explained above and to make the app accessible in languages other than English and Spanish.

Update: Tickbird is now on the App Store! Go check it out and recommend it to someone you may know! App Store Link: Landing Page:

Fun Fact: We named our app Tickbird because rhinos naturally have poor eyesight, and tickbirds help guide them through the wild. Our app is the tickbird for visually impaired patients!

Built With

Firebase, Swift, TesseractOCR, Xcode
