Inspiration

It always feels like we have more and more things to remember and keep track of, which creates a large amount of stress in our everyday lives. That stress is felt especially while trying to fall asleep, where it is a common contributor to insomnia.

Often, however, systems for note-taking and remembering, such as bullet journals, spreadsheets, and to-do list apps, are complex and time-intensive to use, which just adds to the stress they are meant to relieve.

Brain Dump lets you simply pour your thoughts and worries out into one app, where you can easily access them again from anywhere, at any time, on any device.

What it does

Brain Dump is a cross-platform app designed to relieve that stress. Its easy-to-use interfaces let you record everything you need to remember, so you no longer have to worry about keeping it all in your head.

Through an Alexa Skill, you can record your thoughts even when your hands are busy. Then you can organize and manage everything in our cross-platform Flutter app, which runs natively on your phone, your desktop, or the web. Our app lets users easily keep track of what is going on in their lives, rather than adding yet another complex, hard-to-use system to an already busy day. Brain Dump uses intuitive interfaces to relieve stress, not create more.

How we built it

To begin, we selected two main platforms for the front end: a web/mobile/desktop app built in Flutter, and a smart home component built for Amazon Alexa. Together these cover pretty much every device people own, so there is no barrier to getting at your data. The whole point of the app is to reduce stress; we don't want users tied to a single device to get that relief! We also started working on Google Assistant support near the end, but didn't have enough time to get it fully integrated.

From there, we built out our database architecture on Google Firestore. Firestore serves as the central connection point for all data flowing in or out of any device or interface. On top of it, we wrote a set of shared database functions, essentially our own internal library, so that database actions behave identically no matter which device makes the request.
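For illustration, one of those shared helpers might look something like the sketch below. The `createNote` name, `notes` collection, and field names are simplified stand-ins for this example, not our exact schema:

```typescript
// Minimal sketch of a shared database helper, assuming a Firestore
// "notes" collection holding text, tags, and a last-modified timestamp.
// Names and fields are illustrative, not the project's actual schema.
import * as admin from "firebase-admin";

admin.initializeApp();
const db = admin.firestore();

// Every front end (Alexa, web, mobile, desktop) calls helpers like this
// one, so a schema change only needs to happen in one place.
export async function createNote(
  text: string,
  tags: string[] = []
): Promise<string> {
  const ref = await db.collection("notes").add({
    text,
    tags,
    modified: admin.firestore.FieldValue.serverTimestamp(),
  });
  return ref.id;
}
```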

In between these two layers (the front end and the database), we used Google Cloud Run to stand up a serverless Application Programming Interface (API) for each device we support. These range from a single large function that handles all Alexa Skill responses to a web of endpoints, helpers, and pass-throughs that serve the Flutter app. From there, it was just a matter of connecting each front end to its API, which passed the data to the database, and everything was ready for use!
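A Flutter-facing endpoint, for example, can be little more than a thin wrapper around the shared helper sketched above. This is a simplified sketch assuming an Express server; the route and payload shape are illustrative:

```typescript
// Simplified sketch of one Flutter-facing endpoint on Cloud Run,
// assuming Express. The real logic lives in the shared database
// library; the endpoint just parses the request and forwards it.
import express from "express";
import { createNote } from "./db"; // the shared helper sketched above

const app = express();
app.use(express.json());

app.post("/notes", async (req, res) => {
  try {
    const id = await createNote(req.body.text, req.body.tags ?? []);
    res.status(201).json({ id });
  } catch (err) {
    res.status(500).json({ error: "failed to create note" });
  }
});

// Cloud Run tells the container which port to listen on via $PORT.
app.listen(Number(process.env.PORT) || 8080);
```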

Tech Stack Diagram

Challenges we ran into

This was our first time ever working with Alexa Skills, or smart home devices in general, so the start of the hackathon was spent learning how they work. One challenge was running the skill on Firebase Cloud Functions rather than AWS Lambda. This meant we couldn't follow the tutorials directly, but having all of our code hosted through Firebase across every version of the app made it worth it.
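The gist of that workaround, heavily simplified: wrap the ASK SDK skill object in a Firebase HTTPS function instead of a Lambda handler. Intent and slot names below are illustrative, and a real public HTTPS endpoint would also need to verify Alexa's request signature (something you don't have to do when Alexa invokes Lambda directly):

```typescript
// Simplified sketch of hosting an Alexa Skill on a Firebase HTTPS
// function instead of AWS Lambda. Intent and slot names are
// illustrative; production HTTPS endpoints must also verify the
// incoming Alexa request signature.
import * as functions from "firebase-functions";
import * as Alexa from "ask-sdk-core";

const AddNoteIntentHandler: Alexa.RequestHandler = {
  canHandle(handlerInput) {
    return (
      Alexa.getRequestType(handlerInput.requestEnvelope) === "IntentRequest" &&
      Alexa.getIntentName(handlerInput.requestEnvelope) === "AddNoteIntent"
    );
  },
  handle(handlerInput) {
    const text = Alexa.getSlotValue(handlerInput.requestEnvelope, "noteText");
    // ...persist via the shared database helpers...
    return handlerInput.responseBuilder
      .speak(`Saved your note: ${text}`)
      .getResponse();
  },
};

const skill = Alexa.SkillBuilders.custom()
  .addRequestHandlers(AddNoteIntentHandler)
  .create();

// The HTTPS function stands in for the Lambda entry point the
// tutorials assume.
export const alexa = functions.https.onRequest(async (req, res) => {
  const response = await skill.invoke(req.body);
  res.json(response);
});
```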

We wanted to try supporting desktop applications with Flutter, and found out while setting up that desktop support isn't even available in Flutter's dev channel (the bleeding-edge distribution); to compile a desktop app, you have to run Flutter directly from the master branch of its GitHub repository. Consequently, when it came time to connect to our database, there were exactly zero libraries that supported compiling to desktop, so we had to build our own roundabout method. That led to a lot of trial and error troubleshooting data formats, since we were using Dart on one side of the connection and TypeScript on the other! In the end, though, we got it working, and we're really happy about that.
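One concrete example of that format friction, in a simplified sketch: Firestore `Timestamp` objects don't serialize into JSON that a Dart client can parse directly, so the TypeScript side can flatten them to ISO-8601 strings before responding. The helper below is illustrative, not our exact code:

```typescript
// Illustrative sketch of smoothing over Dart/TypeScript data formats:
// Firestore Timestamp objects don't JSON-serialize into anything a
// Dart client can decode directly, so the API converts them to
// ISO-8601 strings before sending the document over the wire.
import { Timestamp } from "firebase-admin/firestore";

export function toWireFormat(
  doc: Record<string, unknown>
): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(doc)) {
    out[key] =
      value instanceof Timestamp ? value.toDate().toISOString() : value;
  }
  return out;
}
```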

Brain Dump also spans a very wide range of input devices and systems, so we needed common database handlers that work for every device, no matter the data format we receive. We started by writing a cross-platform set of standard database functions, then built out and exposed those same functions through different endpoints depending on the device. This way, if we update the database structure, the code for every device picks up the change at the same time, avoiding data mismatches and rewriting the same logic per device.

Accomplishments that we're proud of

We had a lot of firsts this hackathon, like our first Alexa Skill and our first native Windows app, but despite that we still completed all of the main features we wanted. We're also really proud of how seamlessly all of the devices interconnect and share data, so Brain Dump can truly be a cross-platform service however users want to access and interact with it.

What we learned

This was our first time creating an Alexa Skill, and we learned a lot about designing VUI experiences. It turns out that collecting free-form strings of text from the user (for the notes) makes compound commands difficult, because the skill can't tell when the user has finished dictating. We ended up splitting the creation of notes and tags from one command into two, and learned a lot about conversational design along the way.

This was our first time using Flutter for either web or desktop, and we learned a lot about Flutter while getting it set up. Desktop support is still so new that you need to run Flutter from the master branch of its GitHub repository; you can't even use the bleeding-edge developer download. As a result, we had to implement ourselves a lot of the database connectivity we would normally get from a library, and we learned plenty about Dart versus JavaScript data types and some best practices for keeping different systems talking to each other.

This was also our first time running queries in Firestore. We've used Firestore extensively in the past, but only when we knew exactly where the data we needed lived. This time, we had to run multi-step queries to do things like filter by tag or rearrange items so the database reflected changes made in the Flutter app. In setting that up, we learned a lot not only about Firestore queries, but about how NoSQL databases are structured and indexed behind the scenes.
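For instance, a tag filter ends up as a query along these lines (collection and field names simplified to match the earlier sketches). The `where` + `orderBy` combination is also exactly what forces you to learn about composite indexes:

```typescript
// Simplified sketch of a multi-step Firestore query: all notes with a
// given tag, newest first. Combining array-contains with an orderBy on
// a different field requires a composite index behind the scenes.
import * as admin from "firebase-admin";

const db = admin.firestore();

export async function notesByTag(tag: string) {
  const snapshot = await db
    .collection("notes")
    .where("tags", "array-contains", tag)
    .orderBy("modified", "desc")
    .get();
  return snapshot.docs.map((doc) => ({ id: doc.id, ...doc.data() }));
}
```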

What's next for Brain Dump

When we first started developing, we prioritized getting every feature working on one platform first, so that if we ran out of time, at least one device could demonstrate everything we wanted to show. In the future, we'd like to fill in the gaps between what each device can do, to provide a seamless experience across all platforms.

In addition, we'd like to add some more complex features, like color-coded tags and self-sorting based on priorities assigned to different tags (as opposed to the most recently modified item almost always moving to the top).
