Inspiration
We spoke about our grandparents' struggles with dementia and the significant toll that the gradual decline of their memory took on their emotional and physical wellbeing. The app is designed for people with early-onset Alzheimer's: a recent diagnosis means the patient is still able to begin habitual brain-stimulating activity. By exercising their associative memory through ForgetMeNot, they can work toward slowing the progression of Alzheimer's and gaining confidence through increased independence.
What it does
ForgetMeNot uses image recognition to produce results that remind the patient how to complete their daily tasks. For instance, the user takes a picture (the scan), and the app outputs a visual aid (a video, a picture, or a link to a resource) that helps the user complete the task. If the picture were of a button on a shirt (patients frequently struggle with getting dressed once affected by Alzheimer's), the app would show a video on how to button a shirt.
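The core of this flow, once the image-recognition step (Clarifai, in our case) has produced concept tags for the scanned photo, is a lookup from recognized tags to a help resource. The sketch below is a minimal, hypothetical illustration of that lookup; the class name, tags, and URLs are placeholders for demonstration, not the app's actual code.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: map concept tags from the image scan (e.g. "button")
// to a help resource the app can display. Tags and URLs are placeholders.
public class TaskHelper {
    private static final Map<String, String> RESOURCES = new HashMap<>();
    static {
        RESOURCES.put("button", "https://example.com/how-to-button-a-shirt");
        RESOURCES.put("toothbrush", "https://example.com/how-to-brush-teeth");
        RESOURCES.put("shoelace", "https://example.com/how-to-tie-shoes");
    }

    // Walk the recognized tags in confidence order and return the first
    // known help resource, or null if none of the tags are recognized.
    public static String resourceFor(String... tags) {
        for (String tag : tags) {
            String url = RESOURCES.get(tag.toLowerCase());
            if (url != null) {
                return url;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        // A scan of a shirt button: "shirt" is unknown, "button" matches.
        System.out.println(resourceFor("shirt", "button"));
    }
}
```

In the app itself the returned resource would be loaded into a video or web view rather than printed.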
How we built it
The program is written in Java, with the Clarifai API integrated for image recognition. The Android app was developed in Android Studio.
Challenges we ran into
Integrating the API into the app was incredibly difficult, and formatting the Java outputs correctly took time.
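The output-formatting step amounted to turning raw concept/confidence pairs from the recognition API into a sorted, human-readable list. The sketch below illustrates that step under simplified assumptions: the `Prediction` class stands in for the real Clarifai response types, which differ.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of formatting image-recognition output for display.
// Prediction is an illustrative stand-in for the API's response objects.
public class PredictionFormatter {
    public static class Prediction {
        final String name;
        final double confidence; // between 0.0 and 1.0
        public Prediction(String name, double confidence) {
            this.name = name;
            this.confidence = confidence;
        }
    }

    // Sort predictions by confidence (highest first) and render each as
    // a "name (NN%)" line suitable for a simple list view.
    public static List<String> format(List<Prediction> preds) {
        List<Prediction> sorted = new ArrayList<>(preds);
        sorted.sort((a, b) -> Double.compare(b.confidence, a.confidence));
        List<String> lines = new ArrayList<>();
        for (Prediction p : sorted) {
            lines.add(String.format("%s (%d%%)", p.name, Math.round(p.confidence * 100)));
        }
        return lines;
    }

    public static void main(String[] args) {
        List<Prediction> preds = new ArrayList<>();
        preds.add(new Prediction("button", 0.92));
        preds.add(new Prediction("shirt", 0.98));
        for (String line : format(preds)) {
            System.out.println(line);
        }
    }
}
```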
Accomplishments that we're proud of
Milestone successes during the hackathon included solidifying the product flowchart, properly integrating the Clarifai API into the app, and organizing the output of the image recognition API.
What we learned
We learned a great deal about Java and about Android app development in Android Studio.
What's next for ForgetMeNot
We would like to port the app to AR glasses so the experience can be hands-free for the user.