Inspiration

MLH Prime is a unique event, so we challenged ourselves to create something special. We decided to build a new training tool for people with autism spectrum disorders. Autism is characterized by impaired social interaction, impaired verbal and non-verbal communication, and restricted and repetitive behavior. Social skills are a key component of how we perform in society, and reinforcing them is one of the main therapeutic goals for these patients. Among the available therapies, training in emotion recognition and expression has proven to be one of the most effective. We chose to build on this widely accepted method, boosting it with technology and a biofeedback monitoring system.

We’re very happy to introduce the Pathos emotion trainer!

What it does

Pathos consists of two main elements: a web app for the therapist and a mobile app for the patient.

The web app:

Analyses the therapist’s facial expressions while they interact with the patient. The detected emotions are sent to the patient’s mobile app in image and text format, making the interlocutor’s emotions easier to interpret.

Provides biofeedback based on the patient’s frontal lobe neural activity, monitored with Muse. The web app interface displays live training information (sensor status, the patient’s mellow/stress level, brain wave activity, and self-expression identification) as well as session analytics (a historical record of session KPIs).

The mobile app:

Becomes a helping hand for the patient: the therapist’s emotion, detected by the facial expression recognition system, is displayed so that the patient can identify the interlocutor’s expressions more easily.

Lets the patient identify and express their own emotions by picking them from a menu in image and text format. The chosen emotion is sent to the therapist’s dashboard as soon as it is picked (see the sketch below).
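The write-up above doesn’t pin down how the two apps exchange emotions, but the relay can be pictured as a small Go service sitting between them: whichever side detects or picks an emotion posts it, and the other side polls for the latest one. The sketch below is purely illustrative; the endpoint, port, and payload fields are our assumptions rather than the actual Pathos API.

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
	"sync"
)

// Emotion is the payload exchanged between the apps: a text label plus
// a pictogram identifier. Field names here are illustrative.
type Emotion struct {
	Label string `json:"label"` // e.g. "happy"
	Image string `json:"image"` // pictogram shown to the patient
}

var (
	mu      sync.RWMutex
	current Emotion
)

func main() {
	http.HandleFunc("/emotion", func(w http.ResponseWriter, r *http.Request) {
		switch r.Method {
		case http.MethodPost:
			// The recognition process (or the patient picking from the
			// menu) posts the most recent emotion.
			var e Emotion
			if err := json.NewDecoder(r.Body).Decode(&e); err != nil {
				http.Error(w, err.Error(), http.StatusBadRequest)
				return
			}
			mu.Lock()
			current = e
			mu.Unlock()
		case http.MethodGet:
			// The other side polls for the latest emotion to display.
			mu.RLock()
			defer mu.RUnlock()
			json.NewEncoder(w).Encode(current)
		default:
			http.Error(w, "method not allowed", http.StatusMethodNotAllowed)
		}
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

Polling keeps the sketch dependency-free; a production version would more likely push updates to the mobile app over a persistent connection such as WebSockets.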

How we built it

To build Pathos we reviewed the literature on emotion recognition training to understand how we could get the most out of the hardware at our disposal. Pathos’ main building blocks are Muse and OpenCV’s emotion recognition module. We used Go for the backend and Python for the OpenCV side, since Python is widely supported for that library. Muse was connected via a server written in Go that receives its data over the OSC protocol. For the front end we used Bootstrap and EpochJS’s real-time visualisation library.
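The Muse bridge boils down to a small OSC-over-UDP listener. Below is a minimal sketch of such a server in Go; the library (hypebeast/go-osc), the port, and the OSC paths (the raw EEG and experimental mellow streams that muse-io can emit) are illustrative assumptions, since the exact paths depend on the muse-io configuration and headband version.

```go
package main

import (
	"fmt"
	"log"

	"github.com/hypebeast/go-osc/osc"
)

func main() {
	// muse-io streams sensor data as OSC messages over UDP. The
	// dispatcher routes each OSC address to its own handler.
	d := osc.NewStandardDispatcher()

	// Raw EEG samples: four float channels (TP9, AF7, AF8, TP10 on
	// our headband version).
	d.AddMsgHandler("/muse/eeg", func(msg *osc.Message) {
		fmt.Println("eeg:", msg.Arguments)
	})

	// Muse's experimental "mellow" score, which would feed the
	// mellow/stress indicator on the therapist's dashboard.
	d.AddMsgHandler("/muse/elements/experimental/mellow", func(msg *osc.Message) {
		fmt.Println("mellow:", msg.Arguments)
	})

	server := &osc.Server{Addr: "0.0.0.0:5000", Dispatcher: d}
	log.Fatal(server.ListenAndServe())
}
```

From handlers like these, the backend can aggregate brain wave activity and the mellow/stress level before serving them to the dashboard’s real-time charts.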

Challenges we ran into

First of all, the project required some literature research to make sure that what we wanted to build was actually feasible with the means at our disposal. It was also the first time we used Muse, which meant we had to understand its protocols and learn how to interpret the data it provides. On the computer vision side, we couldn’t use Android as planned because the OS did not allow continuous camera usage and repeatedly forced us to ask the user for permission. We tried to work around the issue with a webcam and an Intel Edison, but other issues arose and we ended up running everything on our own Microsoft Surface.

Accomplishments that we are proud of

First off, we are proud to have submitted the project on time: it was our first 24-hour hackathon, which added a layer of complexity to the challenge. We are also proud of combining the strengths of our team, leveraging Ivan and David’s programming wizardry together with Ricard and Adrian’s background in behavioral sciences. Finally, this hackathon has been the one where Ricard and Adrian have coded the most, which we (especially them :p) are very proud of.

What we learned

How to use Muse, OpenCV, and real-time data visualisation tools; prototyping and design techniques; better web design practices; and how to link neuropsychology with technology.

What's next for Pathos

Try the product with more accurate EEG hardware, test it in a controlled environment, and add historical session data and analytics.
