Location: MIT Media Lab, 6th Floor Atrium, at the table directly across from the Wayfair booth

Inspiration

People with autism often find it challenging to connect with those around them and to read the emotions and expressions that social interaction depends on, so they may not react the way others expect. This can diminish their social experience and sometimes lead to social alienation. Our app addresses this problem and increases the sense of belonging for people with autism. We are mainly targeting autistic children.

What it does

Smilehelper creates enhanced social experiences for people with autism by translating facial emotions into sensory stimuli that are easier to process. We use familiar imagery and familiar sounds associated with each emotion, so the app user can grasp, feel, and react to emotions faster and better.
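As a rough illustration of this loop, here is a minimal Unity C# sketch, assuming the Azure Face API detect endpoint with emotion attributes: it posts one camera frame, picks the highest-scoring emotion, and plays a familiar sound mapped to it. The class name, endpoint placeholders, and audio clip fields are hypothetical, not the team's actual code.

```csharp
using System.Collections;
using System.Globalization;
using UnityEngine;
using UnityEngine.Networking;

// Illustrative sketch only: names and structure are hypothetical, not Smilehelper's actual code.
public class EmotionTranslator : MonoBehaviour
{
    // Azure Face API "detect" endpoint with emotion attributes; substitute your own resource and key.
    const string Endpoint =
        "https://YOUR_RESOURCE.cognitiveservices.azure.com/face/v1.0/detect?returnFaceAttributes=emotion";
    const string SubscriptionKey = "YOUR_KEY";

    static readonly string[] Emotions =
        { "happiness", "sadness", "surprise", "anger", "fear", "neutral" };

    public AudioSource audioSource;
    public AudioClip happyClip, sadClip, surpriseClip, angerClip, fearClip, neutralClip;

    // Posts one JPEG frame to the Face API and plays the stimulus for the top-scoring emotion.
    public IEnumerator TranslateFrame(byte[] jpegFrame)
    {
        using (var request = new UnityWebRequest(Endpoint, "POST"))
        {
            request.uploadHandler = new UploadHandlerRaw(jpegFrame);
            request.downloadHandler = new DownloadHandlerBuffer();
            request.SetRequestHeader("Ocp-Apim-Subscription-Key", SubscriptionKey);
            request.SetRequestHeader("Content-Type", "application/octet-stream");

            yield return request.SendWebRequest();

            // On Unity versions before 2020.1, check isNetworkError/isHttpError instead.
            if (request.result != UnityWebRequest.Result.Success)
            {
                Debug.LogWarning("Face API call failed: " + request.error);
                yield break;
            }

            PlayStimulus(TopEmotion(request.downloadHandler.text));
        }
    }

    // Crude score extraction: scans the JSON response for each emotion key and reads the
    // number that follows it; a real implementation would use a JSON library instead.
    static string TopEmotion(string json)
    {
        string best = "neutral";
        float bestScore = -1f;
        foreach (string e in Emotions)
        {
            int key = json.IndexOf("\"" + e + "\":");
            if (key < 0) continue;
            int start = key + e.Length + 3;
            int end = json.IndexOfAny(new[] { ',', '}' }, start);
            if (end < 0) continue;
            if (float.TryParse(json.Substring(start, end - start), NumberStyles.Float,
                               CultureInfo.InvariantCulture, out float score) && score > bestScore)
            {
                bestScore = score;
                best = e;
            }
        }
        return best;
    }

    // Maps the detected emotion to a familiar sound chosen for the child (assigned in the Inspector).
    void PlayStimulus(string emotion)
    {
        AudioClip clip =
            emotion == "happiness" ? happyClip :
            emotion == "sadness"   ? sadClip :
            emotion == "surprise"  ? surpriseClip :
            emotion == "anger"     ? angerClip :
            emotion == "fear"      ? fearClip : neutralClip;
        if (clip != null) audioSource.PlayOneShot(clip);
    }
}
```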

How I built it

We have a diverse team of developers, a 3D modeler, an autism researcher, and a product manager. We used Unity, the Azure Face API, HoloLens, and Maya to build the application.
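To show how those pieces might fit together, below is a hedged sketch of the HoloLens side using Unity's PhotoCapture API: it grabs one camera frame, encodes it as a JPEG, and hands it to a translation coroutine like the EmotionTranslator sketch above. The namespace moved across Unity versions (UnityEngine.XR.WSA.WebCam in older releases, UnityEngine.Windows.WebCam later), and the wiring here is illustrative.

```csharp
using System.Linq;
using UnityEngine;
using UnityEngine.Windows.WebCam; // UnityEngine.XR.WSA.WebCam on older Unity versions

// Illustrative sketch only: captures one frame from the HoloLens camera and hands the
// JPEG bytes to an emotion-translation coroutine (see the EmotionTranslator sketch above).
public class FrameGrabber : MonoBehaviour
{
    PhotoCapture capture;
    Resolution resolution;
    public EmotionTranslator translator; // hypothetical component from the earlier sketch

    void Start()
    {
        // Pick the highest camera resolution the device supports.
        resolution = PhotoCapture.SupportedResolutions
                                 .OrderByDescending(r => r.width * r.height).First();
        PhotoCapture.CreateAsync(false, OnCreated);
    }

    void OnCreated(PhotoCapture captureObject)
    {
        capture = captureObject;
        var parameters = new CameraParameters
        {
            hologramOpacity = 0.0f, // photograph the real scene only, no holograms
            cameraResolutionWidth = resolution.width,
            cameraResolutionHeight = resolution.height,
            pixelFormat = CapturePixelFormat.BGRA32
        };
        capture.StartPhotoModeAsync(parameters, result =>
        {
            if (result.success) capture.TakePhotoAsync(OnPhoto);
        });
    }

    void OnPhoto(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame frame)
    {
        // Copy the frame into a texture, encode it to JPEG, and send it for emotion detection.
        var texture = new Texture2D(resolution.width, resolution.height);
        frame.UploadImageDataToTexture(texture);
        StartCoroutine(translator.TranslateFrame(texture.EncodeToJPG()));

        frame.Dispose();
        capture.StopPhotoModeAsync(_ => capture.Dispose());
    }
}
```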

Challenges I ran into

The initial setup of Unity with the HoloLens was extremely time-consuming.

Accomplishments that I'm proud of

Getting the Face API to work, and 3D modeling the squirrel.

What I learned

We learned how to use GitHub and Unity, and how to set up and operate the Azure Face API and the HoloLens. This was the first hackathon for all of us, and we learned how to work together with complete strangers under pressure.

What's next for Smilehelper

We are aiming for customized translations through machine learning: amplified emotion translations, and imagery representing each child's familiar and favorite objects (toys, LEGO). Our app could also be used to enhance collaboration with autistic people.

Built With

Unity, Azure Face API, HoloLens, Maya
