Inspiration
As second-year McGill students all studying computer science (or computer science-related) fields, we are passionate about building projects that help our community and are accessible to all. That is why we decided to create an app targeted at children with autism, helping them recognize social cues in their environment and understand how others convey their emotions through speech.
What it does
We created "Rubber Ducky", named after the famous rubber-duck debugging expression from programming. The app lets the user record their own audio clips or record others talking. It uploads these clips to Firebase Storage, where our Python program picks them up, analyzes the recording, and sends a string (e.g. "happy") to Firebase Firestore. The app then reads this value and, according to which of the seven emotions it represents, shows a corresponding screen (every emotion has a different colour).
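Concretely, the backend side of that pipeline looks something like the following Python sketch using the firebase_admin SDK. The bucket name, collection and document names, and the predict_emotion() helper are illustrative placeholders rather than the project's actual code.

```python
# Sketch of the Python "bridge": download a recording from Firebase Storage,
# classify it, and write the predicted emotion back to Firestore.
import firebase_admin
from firebase_admin import credentials, firestore, storage

cred = credentials.Certificate("serviceAccountKey.json")
firebase_admin.initialize_app(cred, {"storageBucket": "rubber-ducky.appspot.com"})

bucket = storage.bucket()
db = firestore.client()

def predict_emotion(path: str) -> str:
    """Stand-in for the trained model (see the training sketch under
    'How we built it'); returns one of the seven emotion labels."""
    raise NotImplementedError

def process_clip(storage_path: str, clip_id: str) -> None:
    # Pull the audio clip the app uploaded to Firebase Storage.
    local_path = f"/tmp/{clip_id}.wav"
    bucket.blob(storage_path).download_to_filename(local_path)

    # Classify the recording and write the label (e.g. "happy") to Firestore,
    # where the app is listening for the result.
    emotion = predict_emotion(local_path)
    db.collection("results").document(clip_id).set({"emotion": emotion})
```

On the app side, a Firestore snapshot listener on the matching document can then swap in the coloured screen as soon as the label arrives.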
How we built it
We built this app through three main processes:

- Xcode & Swift: used to create the UI of the app as well as most of the back-end logic (choosing which screen to show according to the variable output by our Python program).
- Firebase: the "link" between our app and the machine learning algorithm we implemented in Python. Firebase let us push files from the app onto a shared storage area (in our case, the audio files recorded in the app) and pull back the variable that the Python program produced.
- Python: our machine learning algorithm. We wrote a script that trained on a dataset of 24 actors, each with 60 different audio recordings, which gave us a reasonably accurate model that can predict the perceived emotion of any inputted audio file (see the training sketch below). It isn't perfect, but it works for the purposes of this hackathon.
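The write-up doesn't name the exact model, so here is a hedged sketch of one common approach to this kind of speech-emotion task: mean MFCC features (via librosa) fed into a small scikit-learn neural network. The directory layout, label set, and hyperparameters are assumptions for illustration, not the team's actual configuration.

```python
# Hypothetical training sketch for the emotion classifier.
import glob
import numpy as np
import librosa
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Assumed seven-label set; the exact emotions aren't listed in the write-up.
EMOTIONS = ["neutral", "happy", "sad", "angry", "fearful", "disgust", "surprised"]

def extract_features(path: str) -> np.ndarray:
    # Load the clip and summarize it as the mean of 40 MFCC coefficients.
    y, sr = librosa.load(path, sr=22050)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=40)
    return mfcc.mean(axis=1)

# Assumed layout: data/<emotion>/<clip>.wav -- adjust to the real dataset.
X, labels = [], []
for emotion in EMOTIONS:
    for path in glob.glob(f"data/{emotion}/*.wav"):
        X.append(extract_features(path))
        labels.append(emotion)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), labels, test_size=0.2, random_state=42
)

model = MLPClassifier(hidden_layer_sizes=(128,), max_iter=500)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```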
Challenges we ran into
The challenges we faced mainly stemmed from distance. Several of us were quarantined and thus only able to communicate through Zoom or group chats. Two of us also lost power in a snowstorm and had to learn how to work around having no electricity or Wi-Fi. In terms of technology, our main issues were dealing with languages and tools that were new to all of us and trying to make them work together, but through some late nights and a lot of determination, we managed to create a cohesive project.
Accomplishments that we're proud of
We are proud of having successfully pulled off working with a large dataset and using machine learning to create a useful product, both for children on the autism spectrum and for people living through social-distancing times, despite limited prior knowledge of machine learning. We also built a fantastic-looking UI that we feel will genuinely engage users and capture their attention. Finally, we are proud to have created a product that can help those in need. Living with a social disability can feel isolating, and we are motivated by the idea that one day we could connect people in ways we never imagined.
What we learned
All of us had previously worked on (very) simple machine learning algorithms as well as simple app design. However, this project took on a whole new perspective: we all became much more acquainted with how to link Firebase to Xcode and to our Python program. We also learned how to work efficiently as a team over Zoom in a limited amount of time and how to communicate as well as possible. Moreover, we found out that we make a pretty good team (all things considered)!
What's next for Rubber Ducky
In the short term, we would train our model on more emotions and a wider selection of samples to make it more accurate. To collect these samples, we would first work on user and data privacy, then ask users willing to contribute to let the app use their recordings. Larger-scale goals include implementing a feature that analyzes the speech and suggests a potential response to the user, for those who struggle with conversation. We would also like to make the app available on more platforms and to add a feature such as conversation videos that would let the user practice eye contact, which children with autism often struggle with.