Touch-screen devices are increasingly used in cognitive rehabilitation, both as aids for children with autism, Down syndrome, and traumatic brain injury, and to help treat amnesia, dementia, and post-stroke symptoms. There is generally no one-size-fits-all solution for rehabilitating these patients: published lists of amnesia treatments include hypnosis, energy psychology, cognitive therapy, nutrition, and technical assistance from devices such as iPhones and tablets.

What it does

CORTEX features two exercises aimed at two groups with different cognitive needs: autistic people learning to read and understand facial expressions, and people with cognitive impairment following moderate or severe strokes. The first group can use the game _Sweet Emotion_, which uses the phone's camera to help them correctly identify people's facial expressions. The second game, _Magnifying Glass 2_, lets users point their phone at various objects and try to identify them through a quiz-like interface.

All the user's progress can be tracked via a separate web interface.
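As a rough illustration of how such a progress view could work, here is a minimal sketch that aggregates quiz attempts into per-exercise accuracy. The record shape and field names (`exercise`, `correct`) are assumptions for illustration, not CORTEX's actual schema.

```javascript
// Hypothetical shape of the per-attempt records the app might store.
const sampleAttempts = [
  { exercise: "Sweet Emotion", correct: true },
  { exercise: "Sweet Emotion", correct: false },
  { exercise: "Magnifying Glass 2", correct: true },
];

// Aggregate attempts into per-exercise totals and accuracy,
// the kind of summary a progress dashboard would display.
function summarizeProgress(attempts) {
  const summary = {};
  for (const { exercise, correct } of attempts) {
    const s = summary[exercise] ?? { attempts: 0, correct: 0 };
    s.attempts += 1;
    if (correct) s.correct += 1;
    summary[exercise] = s;
  }
  // Attach an accuracy ratio to each exercise.
  for (const s of Object.values(summary)) {
    s.accuracy = s.correct / s.attempts;
  }
  return summary;
}
```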

How we built it

We used Swift to build the iOS app, the InceptionV3 model for object classification, and CNNEmotions for expression classification. We used React to build the web app; the iOS app posts its data to a Firebase backend, which we use for authentication and as a database.
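A sketch of the "posts its data to Firebase" step, assuming the Realtime Database REST API is used and a hypothetical `results` path and payload shape (none of these names are confirmed by the project):

```javascript
// Build a quiz-result payload. Field names are illustrative assumptions.
function buildResultPayload(userId, exercise, predictedLabel, correct) {
  return {
    userId,
    exercise,       // e.g. "Sweet Emotion" or "Magnifying Glass 2"
    predictedLabel, // label returned by InceptionV3 or CNNEmotions
    correct,        // whether the user's answer matched
    timestamp: Date.now(),
  };
}

// Posting the payload would require a real Firebase project and auth token:
// fetch(`https://<project>.firebaseio.com/results.json?auth=${token}`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(
//     buildResultPayload("u1", "Sweet Emotion", "happy", true)
//   ),
// });
```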

Challenges we ran into

The ML models weren't as accurate in portrait mode as we expected. We also faced design challenges, since we were designing for groups with special requirements.

Accomplishments that we're proud of

We implemented accessibility features for users who cannot read, and the object detection works quite well.

What we learned

We learned many Firebase skills, along with more advanced iOS and web development skills.

What's next for Cortex

User testing, more background research, and refining the exercises based on user input and academic research.
