There are millions of mobile users around the world who dive into a world of negativity and clutter every day. We wanted to apply our interest in software and mental health to inject a dose of positivity and community into the mobile world.
What it does
Riize begins by asking how you're feeling when you log in to the app. Based on your current emotion, Riize chooses what media to display: videos, articles, images, music, quotes, games, and more, each aimed at uplifting that specific emotion. After showing the user a series of media, Riize prompts them to rate their experience and uses that data to deliver a more personalized and effective selection next time.
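As a rough sketch of that flow, the selection and rating loop might look like the following. All names and types here are illustrative assumptions, not our actual implementation:

```typescript
// Hypothetical sketch of the mood-to-media loop described above.
type Emotion = "sad" | "anxious" | "stressed" | "happy";
type MediaType = "video" | "article" | "image" | "music" | "quote" | "game";

interface MediaItem {
  id: string;
  type: MediaType;
  uplifts: Emotion[]; // emotions this item is meant to uplift
  score: number;      // running quality score built from user ratings
}

// Pick the highest-scored items that target the user's current emotion.
function selectMedia(catalog: MediaItem[], emotion: Emotion, count: number): MediaItem[] {
  return catalog
    .filter((m) => m.uplifts.includes(emotion))
    .sort((a, b) => b.score - a.score)
    .slice(0, count);
}

// Fold a 1-5 rating into an item's score. This simple average weights the
// new rating equally with the old score; a real app would track a rating count.
function applyRating(item: MediaItem, rating: number): MediaItem {
  return { ...item, score: (item.score + rating) / 2 };
}
```

Higher-rated items surface first, so each round of feedback nudges the next session toward content that actually helped.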
How we built it
Riize is a cross-platform mobile application built on React Native, using several mobile-friendly components. We also modeled a Firebase Realtime Database showing how we would store and rank content, so that the highest-quality media is delivered to each user.
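One possible shape for that database is sketched below. The node and field names are illustrative assumptions, not our actual schema; Firebase Realtime Database stores everything as one JSON tree, so content and per-user ratings can live under separate top-level nodes:

```json
{
  "content": {
    "media001": {
      "type": "video",
      "targetEmotions": { "sad": true, "anxious": true },
      "averageRating": 4.6,
      "ratingCount": 128
    }
  },
  "users": {
    "uid123": {
      "lastEmotion": "anxious",
      "ratings": { "media001": 5 }
    }
  }
}
```

Keeping an `averageRating` and `ratingCount` on each item lets the client rank content without re-reading every individual rating.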
Challenges we ran into
We had no experience with React Native, so it was a steep learning curve, but we were able to rise above it and learn the fundamentals effectively.
Accomplishments that we're proud of
We developed a beautiful, seamless UI that guides the user through a series of uplifting media. We're proud of how much we were able to accomplish in such a short span of time, and of our teamwork and time management. We're excited to take this product even further.
What we learned
We learned the fundamentals of React Native and good coding practices, and got much more comfortable with version control in Git (including resolving merge conflicts).
What's next for Riize
We plan on implementing several things, starting with connecting our database to our front end. We want to grow our community and strengthen the community aspect by crowdsourcing our media. We also hope to improve our front end by adding a swipe-left/swipe-right model so users can affirm what they liked and what they didn't. We would like to add computer vision, specifically emotion recognition using Microsoft Azure's emotion classification API, to rank our content in real time based on user response. Finally, we hope to add machine learning algorithms to determine which media best helps each different kind of user.
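The planned swipe model could feed ranking directly. As a minimal sketch (the function and weighting are assumptions, not a committed design), each swipe could nudge an item's rank with an exponential moving average, so recent reactions count more than old ones:

```typescript
// Hypothetical swipe-feedback update: a right swipe pulls the rank
// toward 1, a left swipe toward 0. `alpha` controls how quickly
// recent swipes outweigh the item's history.
function updateRank(current: number, swipedRight: boolean, alpha = 0.2): number {
  const target = swipedRight ? 1 : 0;
  return current + alpha * (target - current);
}
```

For example, an item sitting at rank 0.5 would move to 0.6 after a right swipe and to 0.4 after a left swipe with the default alpha.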