Inspiration

The inspiration for this project came from the numerous project presentations I've seen in my senior year and from the feedback I've received on how I can improve. It is difficult to practice by yourself at home, but the convenience of a smartphone should make it easier to rehearse and improve those personal skills.

What it does

The project uses IBM Watson's Bluemix Tone Analyzer API, the Android voice recognizer, and the Google Mobile Vision face detection API to produce a presentation confidence score out of 100. It converts the speech into text and runs it through the Tone Analyzer to produce a qualitative score across categories such as confidence and emotion. The voice recognizer also records the intensity of the voice, which is averaged over the duration of the presentation. The Google Mobile Vision face detection API uses image processing to show a live video of the user's face along with the probability that the eyes are open and how happy the user looks (the happier and more relaxed you are, the better you can perform). These scores are then factored together to produce a confidence level score out of 100.
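The exact weighting isn't spelled out above, so here is only a minimal Kotlin sketch of how the individual signals could be blended into a single 0-100 score. The field names, weights, and normalization are my own assumptions, not the project's actual formula:

```kotlin
// Hypothetical final scoring step: blend the individual signals into a
// single 0-100 confidence level. The weights are illustrative guesses.
data class PresentationSignals(
    val toneScore: Double,           // 0..1, derived from the Tone Analyzer response
    val avgVoiceIntensity: Double,   // 0..1, normalized average intensity from the recognizer
    val eyesOpenProbability: Double, // 0..1, from Mobile Vision face classification
    val smileProbability: Double     // 0..1, from Mobile Vision face classification
)

fun confidenceScore(signals: PresentationSignals): Int {
    val weighted =
        0.40 * signals.toneScore +
        0.25 * signals.avgVoiceIntensity +
        0.20 * signals.eyesOpenProbability +
        0.15 * signals.smileProbability
    return (weighted * 100).toInt().coerceIn(0, 100)
}
```

In the app itself, the tone score would come from the Tone Analyzer result and the voice intensity from the averaged recognizer readings; the sketch only shows the final blending step.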

How I built it

I built it using an Android smartphone, Android Studio, IBM Watson's Bluemix Tone Analyzer API, the Android voice recognizer, and Google's Mobile Vision API.
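As a rough illustration of how those pieces fit together, here is a simplified Kotlin reconstruction (not the project's actual code) of the sensing setup: a Mobile Vision FaceDetector with classification enabled for the eyes-open and smile probabilities, and a SpeechRecognizer listener that collects voice intensity plus the recognized text to send to the Tone Analyzer. The class and field names (SignalCollector, rmsSamples, etc.) are placeholders:

```kotlin
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer
import com.google.android.gms.vision.Detector
import com.google.android.gms.vision.face.Face
import com.google.android.gms.vision.face.FaceDetector

// Simplified reconstruction of the sensing setup, not the actual project code.
class SignalCollector(private val context: Context) {

    val rmsSamples = mutableListOf<Float>()   // voice intensity readings over time
    var lastEyesOpen = 0f                     // latest eyes-open probability
    var lastSmile = 0f                        // latest smile probability

    // Mobile Vision face detector with classification enabled so that
    // eye-open and smiling probabilities are computed for each frame.
    // In the app this detector would be attached to a CameraSource for
    // the live video preview.
    fun buildFaceDetector(): FaceDetector {
        val detector = FaceDetector.Builder(context)
            .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
            .setTrackingEnabled(true)
            .build()
        detector.setProcessor(object : Detector.Processor<Face> {
            override fun release() {}
            override fun receiveDetections(detections: Detector.Detections<Face>) {
                val faces = detections.detectedItems
                if (faces.size() > 0) {
                    val face = faces.valueAt(0)
                    lastEyesOpen = (face.isLeftEyeOpenProbability +
                            face.isRightEyeOpenProbability) / 2f
                    lastSmile = face.isSmilingProbability
                }
            }
        })
        return detector
    }

    // SpeechRecognizer: onRmsChanged reports the sound level while the user
    // speaks, and onResults returns the recognized text, which would then be
    // sent to the Tone Analyzer service.
    fun startListening(onText: (String) -> Unit) {
        val recognizer = SpeechRecognizer.createSpeechRecognizer(context)
        recognizer.setRecognitionListener(object : RecognitionListener {
            override fun onRmsChanged(rmsdB: Float) { rmsSamples.add(rmsdB) }
            override fun onResults(results: Bundle?) {
                results?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                    ?.firstOrNull()?.let(onText)
            }
            override fun onReadyForSpeech(params: Bundle?) {}
            override fun onBeginningOfSpeech() {}
            override fun onBufferReceived(buffer: ByteArray?) {}
            override fun onEndOfSpeech() {}
            override fun onError(error: Int) {}
            override fun onPartialResults(partialResults: Bundle?) {}
            override fun onEvent(eventType: Int, params: Bundle?) {}
        })
        recognizer.startListening(Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH))
    }
}
```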

Challenges I ran into

One of the biggest challenges was getting each piece to work together, which meant a ton of debugging. My regret for this project is not incorporating the Pebble watch and its accelerometer to detect hand jitteriness.

Accomplishments that I'm proud of

It was one of the toughest moments I've had with my phone, as I was constantly trying to get it to listen to me. Eventually, it came around. In all seriousness though, I am proud of how much I've improved at using APIs to develop a project like this.

What I learned

My phone is still a terrible listener, but I've gotten a better idea of the capabilities Android phones have, especially with all the sensors they provide.

What's next for SmartPitch

As mentioned before, I will try to get the Pebble accelerometer working with the smartphone in order to detect and take hand jitteriness into account, since hand and body coordination is important in a presentation. I'm also thinking of including a user-adjustable timer, and some phones have a heart-rate sensor, so it would be nice to take that into consideration as well. There is still a ton of debugging left to do on both the backend and the frontend.
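Since the Pebble integration hasn't been built yet, the following is purely a sketch of how a hand-steadiness score could be computed once accelerometer samples are streamed to the phone; the function name and the deviation threshold are hypothetical, and the same metric would also work with the phone's own accelerometer readings:

```kotlin
import kotlin.math.sqrt

// Hypothetical jitter metric for the planned accelerometer feature:
// treat hand jitteriness as the standard deviation of the acceleration
// magnitude over the presentation, mapped onto a 0..1 steadiness score.
// Each sample is an (x, y, z) acceleration triple in m/s^2.
fun steadinessScore(samples: List<FloatArray>): Double {
    if (samples.size < 2) return 1.0
    val magnitudes = samples.map { v ->
        sqrt((v[0] * v[0] + v[1] * v[1] + v[2] * v[2]).toDouble())
    }
    val mean = magnitudes.average()
    val variance = magnitudes.sumOf { (it - mean) * (it - mean) } / magnitudes.size
    val stdDev = sqrt(variance)
    // Assume anything above ~3 m/s^2 of deviation counts as very jittery
    // (an illustrative guess, not a tuned value).
    val jitter = (stdDev / 3.0).coerceIn(0.0, 1.0)
    return 1.0 - jitter
}
```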
