Github Repo: https://github.com/TrooperZ/echomind
Inspiration
The idea for EchoMind came from my own habits: I've been listening to music to lift my mood when I've been feeling down, but my playlists have started to feel repetitive. I thought I could curate playlists automatically by generating them from my journaling habits.
What it does
EchoMind links your Spotify and Notion accounts together and generates a playlist based on the most recent journal entry in your Notion workspace. It scans the entry for keywords and phrases to compute a weighted emotional sentiment of the text, then uses those weights to build the playlist.
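The keyword-weighting step could look something like the sketch below. The emotion categories, lexicon, and weights here are illustrative assumptions, not EchoMind's actual algorithm: each keyword contributes weight to one or more emotions, and the totals are normalized into a distribution.

```typescript
// Hypothetical emotion categories and keyword lexicon (illustrative only).
type Emotion = "joy" | "sadness" | "anger" | "calm";

const lexicon: Record<string, Partial<Record<Emotion, number>>> = {
  happy: { joy: 1.0 },
  peaceful: { calm: 1.0 },
  angry: { anger: 1.0 },
  down: { sadness: 0.8 },
  tired: { sadness: 0.4, calm: 0.2 },
};

// Tokenize the journal entry, sum the keyword weights per emotion,
// and normalize so the scores form a distribution that sums to 1.
function scoreEntry(text: string): Record<Emotion, number> {
  const scores: Record<Emotion, number> = { joy: 0, sadness: 0, anger: 0, calm: 0 };
  for (const word of text.toLowerCase().match(/[a-z']+/g) ?? []) {
    const weights = lexicon[word];
    if (!weights) continue;
    for (const [emotion, w] of Object.entries(weights)) {
      scores[emotion as Emotion] += w ?? 0;
    }
  }
  const total = Object.values(scores).reduce((a, b) => a + b, 0) || 1;
  for (const e of Object.keys(scores) as Emotion[]) scores[e] /= total;
  return scores;
}
```

For example, an entry containing "down" and "tired" would come out weighted heavily toward sadness, with a small calm component.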
How we built it
I built EchoMind with Next.js and used Firestore for the user database. I used the Notion and Spotify APIs for authentication and account connections, then pulled data from them to analyze. I wrote the emotion algorithm based on some online resources I found about word frequency and emotion in text.
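One way to turn the emotion weights into song choices is to map them onto Spotify's recommendation tuning parameters. The sketch below assumes the `target_valence` and `target_energy` parameters of Spotify's `GET /v1/recommendations` endpoint; the mapping formula itself is a made-up illustration, not EchoMind's actual logic.

```typescript
// Emotion distribution produced by the sentiment step (values sum to 1).
interface EmotionScores { joy: number; sadness: number; anger: number; calm: number; }

// Clamp a value into [0, 1], the range Spotify expects for audio-feature targets.
const clamp01 = (x: number) => Math.min(1, Math.max(0, x));

// Hypothetical mapping: joy raises valence (musical positivity), sadness lowers it;
// anger and joy raise energy. The coefficients here are arbitrary assumptions.
function recommendationParams(scores: EmotionScores): URLSearchParams {
  const valence = clamp01(0.5 + scores.joy + 0.5 * scores.calm - scores.sadness);
  const energy = clamp01(0.3 + scores.anger + 0.5 * scores.joy);
  return new URLSearchParams({
    seed_genres: "pop", // placeholder seed; a real app would pick seeds per user
    target_valence: valence.toFixed(2),
    target_energy: energy.toFixed(2),
    limit: "20",
  });
}
```

The resulting query string would be appended to the recommendations endpoint, and the returned track URIs added to a new playlist via the playlist API.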
Challenges we ran into
The biggest challenges I ran into were managing authentication, working with the playlist API, and working around TypeScript bugs in the API backend. Another challenge was designing a visually appealing UI in the limited time I had.
Accomplishments that we're proud of
I'm proud of the UI I designed; it's one of the better ones I've made in such a short time. I'm also proud of the idea and how I implemented it, and that I built my own sentiment algorithm rather than just making a GPT wrapper.
What we learned
I learned how to move data between multiple APIs and process it properly, and how to design a visually appealing UI.
What's next for EchoMind
I plan to adjust the API interface and perform some optimizations to make EchoMind scale to a larger audience, and to add features such as listening-time tracking and a better song-selection algorithm.
Built With
- firestore
- nextjs
- react
- vercel