Coming to the hackathon as four strangers, the only thing we had in common was that we all wanted to build something with a social impact. Since we were all interested in working with AR/VR, we decided to create a therapeutic VR experience, adding functionality we thought would be useful along the way.
In the end, we created a virtual reality environment in which the user can talk to a VR counselor. The environment changes based on the user's heart rate (read from a Fitbit), implementing biofeedback so the user can learn better control over their vitals. The heart rate data is also displayed on a web dashboard for the user's human therapist or counselor to review.
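The writeup doesn't give the actual biofeedback formula, so the following is only a hypothetical sketch of the idea: normalize the heart rate between a resting and an elevated range, then drive scene parameters from that value (the parameter names here are illustrative, not from the project).

```typescript
// Normalize a heart rate (bpm) into [0, 1], where 0 = fully calm.
// restingBpm and elevatedBpm are assumed example thresholds.
function stressLevel(bpm: number, restingBpm = 60, elevatedBpm = 100): number {
  const t = (bpm - restingBpm) / (elevatedBpm - restingBpm);
  return Math.min(1, Math.max(0, t)); // clamp to [0, 1]
}

// Map the stress level to environment parameters a VR scene could
// tween toward each frame (names are illustrative).
function environmentParams(bpm: number) {
  const s = stressLevel(bpm);
  return {
    skyBrightness: 1 - 0.5 * s, // darker sky as stress rises
    waveSpeed: 0.5 + 1.5 * s,   // choppier water when the user is stressed
  };
}
```

As the user calms down, the parameters drift back toward their baseline values, which is what makes the feedback loop visible to the user.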
We used many different APIs to make our application. For speech recognition and synthesis, we used Google Cloud's Speech-to-Text and Text-to-Speech APIs. For heart rate, we used a Fitbit and wrote an app for it that sent data to the phone, which in turn uploaded it to Google's Firebase. We also built a website that serves as a dashboard, fetching the data from Firebase. The entire project was put together in Unity3D with OpenVR, running on the Oculus Rift.
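To make the relay step concrete, here is a minimal sketch of how the phone-side companion could push a reading to Firebase, assuming the Firebase Realtime Database REST API (the project URL, path, and field names are placeholders, not from the project):

```typescript
// Placeholder database URL -- a real project would use its own.
const FIREBASE_URL = "https://example-project.firebaseio.com";

// Build the REST endpoint and JSON body for one heart-rate sample.
function heartRateRequest(bpm: number, timestampMs: number) {
  return {
    url: `${FIREBASE_URL}/heartRate.json`,
    body: JSON.stringify({ bpm, timestampMs }),
  };
}

// On the companion, the upload itself would be something like:
//   const { url, body } = heartRateRequest(72, Date.now());
//   await fetch(url, {
//     method: "POST",
//     body,
//     headers: { "Content-Type": "application/json" },
//   });
```

The dashboard can then read the same path back out of Firebase to plot the readings for the therapist.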
The biggest challenges we faced were API integrations. We had a lot of trouble getting the Google Cloud APIs working in Unity, as it was the first time any of us had used a REST API. Getting live data from the Fitbit was also hard, since it required working through several layers of authentication before any data would start transferring.
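To give a flavor of the REST integration that tripped us up: Google Cloud's Speech-to-Text v1 exposes a synchronous `speech:recognize` endpoint that takes a JSON body describing the audio. Below is a sketch of building that body; the exact audio settings we used aren't in the writeup, so the values here are typical assumptions, and authentication is omitted.

```typescript
// The v1 synchronous recognition endpoint (API key handling omitted).
const RECOGNIZE_URL = "https://speech.googleapis.com/v1/speech:recognize";

// Build the JSON body for one recognition request.
// `audioBase64` is the recorded clip, base64-encoded.
// Encoding/sample rate/language are assumed example values.
function buildRecognizeRequest(audioBase64: string) {
  return {
    config: {
      encoding: "LINEAR16",   // raw 16-bit PCM from the mic
      sampleRateHertz: 16000,
      languageCode: "en-US",
    },
    audio: { content: audioBase64 },
  };
}
```

In Unity the equivalent JSON has to be assembled and POSTed by hand, which is where most of our debugging time went.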
Accordingly, most of our learning revolved around the Fitbit and API integrations in Unity. For two of our four team members it was their first hackathon, so they had a lot to learn from the experience itself as well.