Inspiration
Gone are the days of practicing public speaking in a mirror. You shouldn’t need an auditorium full of hundreds of people to visualize giving a keynote speech. This app lets people put themselves in public speaking situations that are difficult to emulate in everyday life. We also wanted to give anyone who wants to improve their speaking, including those with speech impediments, a safe space to practice and receive feedback.
What it does
The Queen’s Speech places users in a virtual reality environment through Google Cardboard, where they can record a speech and analyze their interaction with a virtual audience. Using 3D head tracking, we give real-time feedback on where the speaker is looking during the speech so that users can improve their engagement with the audience. Users can also play their speech back to review pace, intonation, and content. We are working on providing immediate feedback on the number of "um"s and "like"s to improve eloquence and clarity of speech.
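The head-tracking feedback boils down to checking where the Cardboard camera is pointed each frame. Below is a minimal sketch of that idea in Unity C#; the class, zone tags, and logging format are illustrative assumptions, not our actual scripts.

```csharp
using UnityEngine;
using System.Collections.Generic;

// Hypothetical sketch: sample the direction of the Cardboard camera each frame
// and record which audience zone the speaker is looking at.
public class GazeTracker : MonoBehaviour
{
    public Camera headCamera;                        // the Cardboard/VR camera
    private readonly List<string> gazeLog = new List<string>();

    void Update()
    {
        // Cast a ray from the center of the speaker's view into the scene.
        Ray gaze = new Ray(headCamera.transform.position, headCamera.transform.forward);
        RaycastHit hit;
        if (Physics.Raycast(gaze, out hit, 100f))
        {
            // Each audience section is a collider tagged e.g. "AudienceLeft",
            // "AudienceCenter", "AudienceRight"; the log is summarized after the speech.
            gazeLog.Add(hit.collider.tag + " @ " + Time.time.ToString("F1") + "s");
        }
    }
}
```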
How we built it
We built the environment with Adobe After Effects and the Unity game engine, using C# scripting to combine 360-degree imagery with speech feedback.
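Because Unity lacked native 360-degree video support, the scene is driven by image sequencing. Here is a minimal sketch of that approach, assuming equirectangular stills exported from After Effects are cycled onto a panoramic skybox material; the field names and frame rate are assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch of image sequencing: step through pre-rendered 360° frames
// applied to a skybox material instead of playing true 360 video.
public class ImageSequencePlayer : MonoBehaviour
{
    public Material skyboxMaterial;   // panoramic skybox material
    public Texture2D[] frames;        // equirectangular stills exported from After Effects
    public float framesPerSecond = 12f;

    void Update()
    {
        if (frames.Length == 0) return;
        // Pick the frame for the current time and loop over the sequence.
        int index = (int)(Time.time * framesPerSecond) % frames.Length;
        skyboxMaterial.mainTexture = frames[index];
    }
}
```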
Challenges we ran into
Connecting to Microsoft Project Oxford proved more difficult than expected on our Mac laptops than it would have been on a typical PC. We also couldn't integrate real 360-degree footage due to lack of Unity support.
Accomplishments that we're proud of
We're proud of providing a 3D-like video experience through image sequencing, highlighting the user's focus points, and expanding user engagement. Building for Google Cardboard makes the app accessible to more users.
What's next for The Queen's Speech
We are currently working on word analysis to track "um"s and "like"s by incorporating Project Oxford, as well as adding more diverse 3D videos.
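The planned word analysis amounts to counting filler words in a transcript returned by a speech-to-text service such as Project Oxford. A minimal sketch follows; the helper name, word list, and tokenization are assumptions.

```csharp
using System;
using System.Linq;

// Hypothetical sketch: count filler words in a speech-to-text transcript.
public static class FillerWordCounter
{
    private static readonly string[] Fillers = { "um", "uh", "like" };

    public static int Count(string transcript)
    {
        // Lowercase and split on whitespace/punctuation, then count matches.
        var words = transcript.ToLowerInvariant()
            .Split(new[] { ' ', ',', '.', '?', '!' }, StringSplitOptions.RemoveEmptyEntries);
        return words.Count(w => Fillers.Contains(w));
    }
}
```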