Smile App

Inspiration

In high school, students often feel shy and intimidated when presenting in front of their peers. Some of us had this experience ourselves, largely because we never got the practice and constructive criticism that help a speaker improve gradually. We hope that this app gives students and adults more practice and shows them where their speeches can improve, so users gain confidence in how to deliver a presentation.

What it does

The Smile app helps people get ready for presentations and interviews with nothing more than an iPhone and our app. The user practices a speech in front of the phone's camera, and the app uses a machine learning model to detect moments where the user appears happy, angry, neutral, or sad: the general emotions a person goes through. People tend to stay in a happy or neutral emotional state when delivering a good speech or communicating effectively.
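
As a rough illustration, the per-frame classification step on iOS could look like the sketch below. `EmotionClassifier` is a hypothetical name for an Xcode-generated Core ML model class; the app's actual model and label set may differ.

```swift
import CoreML
import Vision

// A minimal sketch of classifying one camera frame.
// "EmotionClassifier" stands in for the Xcode-generated class of a
// Core ML image classifier; the real app's model may differ.
func classifyEmotion(in frame: CGImage,
                     completion: @escaping (String?) -> Void) {
    guard let mlModel = try? EmotionClassifier(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: mlModel) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Take the top label, e.g. "happy", "angry", "neutral", or "sad".
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier)
    }
    // Vision scales and crops the frame to the model's input size.
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```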

How we built it

We designed the graphical user interface using Figma. We then split up to implement the project on both platforms in parallel, using a Core ML model on iOS and the Google Cloud Vision API on Android. Distributing tasks evenly let us make progress on both versions at once.
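
For reference, the Android version's face sentiment analysis boils down to one Cloud Vision `images:annotate` request with the `FACE_DETECTION` feature. Below is a minimal sketch of that REST call, written in Swift for consistency with the other examples here; the API key is a placeholder and error handling is omitted.

```swift
import Foundation

// Sketch of the Cloud Vision FACE_DETECTION request. "YOUR_API_KEY"
// is a placeholder; a real app should not embed keys in source.
func detectFaceEmotions(imageData: Data,
                        completion: @escaping ([String: Any]?) -> Void) {
    let url = URL(string:
        "https://vision.googleapis.com/v1/images:annotate?key=YOUR_API_KEY")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    // Cloud Vision expects the image as base64 plus a feature list.
    let body: [String: Any] = [
        "requests": [[
            "image": ["content": imageData.base64EncodedString()],
            "features": [["type": "FACE_DETECTION", "maxResults": 1]]
        ]]
    ]
    request.httpBody = try? JSONSerialization.data(withJSONObject: body)

    URLSession.shared.dataTask(with: request) { data, _, _ in
        // The response's faceAnnotations carry joyLikelihood,
        // sorrowLikelihood, and angerLikelihood fields.
        guard let data = data,
              let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any] else {
            completion(nil)
            return
        }
        completion(json)
    }.resume()
}
```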

Challenges we ran into

- Low internet speed
- Google Cloud Vision wasn't working the way we wanted, so the Android version was tough and was not completed
- The Core ML modules we found on GitHub were difficult to get working
- Combining speech recognition with the rest of the app initially broke the code
- The emotion detection gives a few false positives

Accomplishments that we're proud of

We worked persistently on the project no matter what problems we faced. We completed our iOS app and still looked for extra ways to improve the information given to the user.

What we learned

- Using the Core ML framework
- Boosting the accuracy of a data model
- Creating a speech recognition method (see the sketch below)
- Using the Google Cloud Vision API to do sentiment analysis on a picture of a person
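
On iOS, a speech recognition method can be built on Apple's Speech framework. The sketch below is one minimal way to transcribe a recorded practice speech; `transcribeSpeech` is an illustrative helper name, not the app's actual code, and it assumes the user has already granted speech-recognition permission.

```swift
import Speech

// Transcribe a recorded practice speech from an audio file.
// Assumes SFSpeechRecognizer.requestAuthorization has already
// been called and granted.
func transcribeSpeech(at fileURL: URL,
                      completion: @escaping (String?) -> Void) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.isAvailable else {
        completion(nil)
        return
    }
    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    recognizer.recognitionTask(with: request) { result, _ in
        // Wait for the final result, then hand back the transcript.
        guard let result = result, result.isFinal else { return }
        completion(result.bestTranscription.formattedString)
    }
}
```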

What's next for Smile

The next step for our application is to build in common interview questions so that interviewees grow more comfortable before the real interview. For instance, we can draw on the questions in "Cracking the Coding Interview" to help CS students get ready for technical interviews. The app can also help students and employees prepare for presentations by giving tips based on the results Smile estimates. We plan to use heartbeat data, gesture data models, speech emotion recognition, text recognition, and face recognition to analyze an interviewee's performance; one possible direction is sketched below.
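
As one example of how the heartbeat analysis might work (this is a hypothetical future direction, not code in the app today), heart rate samples could come from HealthKit. The helper below assumes HealthKit authorization has already been requested and granted.

```swift
import HealthKit

// Hypothetical sketch for the planned heartbeat analysis: read the
// most recent heart rate samples from HealthKit. Assumes the app has
// already requested read access to heart rate data.
let healthStore = HKHealthStore()

func fetchRecentHeartRates(completion: @escaping ([Double]) -> Void) {
    guard let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate) else {
        completion([])
        return
    }
    let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate,
                                       ascending: false)
    let query = HKSampleQuery(sampleType: heartRateType,
                              predicate: nil,
                              limit: 30,
                              sortDescriptors: [newestFirst]) { _, samples, _ in
        // Heart rate is stored as count/min (beats per minute).
        let bpm = (samples as? [HKQuantitySample])?.map {
            $0.quantity.doubleValue(for: HKUnit.count().unitDivided(by: .minute()))
        } ?? []
        completion(bpm)
    }
    healthStore.execute(query)
}
```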

Built With

- Core ML
- Google Cloud Vision API
- Figma