Inspiration

A lot of accountability or study apps have the same problem: they’re just glorified timers.

I’ll download a productivity app on my computer but still end up scrolling through Instagram on my phone with no consequences. If I could minimize interruptions and stay accountable across everything I use, I’d get so much more done.

That’s why we built DubFlow, a smarter and cuter way to stay focused.

What it does

DubFlow is a smart focus tracker that keeps you accountable in the cutest way possible. It’s an all-in-one platform that uses both AWS Rekognition and desktop monitoring to recognize when you’re focused, distracted, or off-task.

Your on-screen companion, Dubs, reacts in real time to your behavior. When you’re focused, Dubs naps happily. When you start to drift, whether on your computer or at your desk, Dubs wakes up and pulls you back with live speech or push notifications. These context-aware messages help you break out of distraction loops and get back into the flow.

For example, if you only have 20 minutes to work on your calculus homework but Dubs catches you on your phone, Dubs will playfully remind you through voice, text, and notification that "Only 20 minutes remain for those integrals, so put that phone away before I eat it."

All your focus data appears in an interactive dashboard that maps your attention timeline, gives you a focus score, and surfaces a variety of metrics to help you improve. DubFlow makes productivity both intelligent and adorable.

How we built it

DubFlow is a desktop application built with SvelteKit and Electron.

The context-based message system behind Dubs's smart reminders draws on two main context sources: screen/window information and webcam/scene information.

The vision processing component of DubFlow uses OpenCV locally for real-time analysis of the user's eye movements, and AWS Rekognition for deeper scene analysis: estimating the user's emotional state (whether they appear calm, stressed, or confused) and identifying distracting objects in the frame, such as phones, drinks, or other devices.
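As a toy illustration of the local half of this pipeline, raw per-frame eye detections are noisy, so a verdict is better made over a short window of frames. This sketch is an assumption about the approach (the function name, threshold, and smoothing strategy are ours, not DubFlow's actual OpenCV code):

```typescript
// Hypothetical smoothing of noisy per-frame "eyes on screen" detections
// into a single focus verdict. The 60% threshold is an illustrative default.
function isFocused(eyesOnScreen: boolean[], threshold = 0.6): boolean {
  if (eyesOnScreen.length === 0) return false; // no frames yet: assume not focused
  const onFrames = eyesOnScreen.filter(Boolean).length;
  return onFrames / eyesOnScreen.length >= threshold;
}
```

Smoothing like this keeps a single dropped detection (a blink, a brief head turn) from flipping Dubs's state every frame.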

For on-screen context, such as whether the user is browsing a distracting site like Instagram, we use Electron's native window-management APIs to identify the active window and site URL, flagging it if it matches a known distraction domain (e.g. Reddit, Instagram, X).
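The flagging step itself reduces to a hostname check against a blocklist. A minimal sketch, assuming a hardcoded list (the function name and matching rules are illustrative, not DubFlow's actual code):

```typescript
// Hypothetical distraction-domain check. The real list in DubFlow
// may be longer or user-configurable.
const DISTRACTION_DOMAINS = ["reddit.com", "instagram.com", "x.com"];

function isDistractingUrl(url: string): boolean {
  try {
    const host = new URL(url).hostname.replace(/^www\./, "");
    // Flag the domain itself and any subdomain of it.
    return DISTRACTION_DOMAINS.some(
      (d) => host === d || host.endsWith("." + d),
    );
  } catch {
    return false; // not a parseable URL (e.g. a native app window title)
  }
}
```

Matching on the full hostname (rather than substring search) avoids false positives like a study guide hosted at a URL that merely mentions "instagram".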

This context is all piped to an AWS Bedrock-hosted LLM, which generates relevant, concise messages that keep the user focused. For additional effect, the messages are not only displayed on screen but also spoken aloud through ElevenLabs and sent to the user's phone as a Pushover notification.
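To make the hand-off concrete, the context can be flattened into a single prompt before the Bedrock call. The field names and prompt wording below are our own illustration of the shape of that payload, not DubFlow's actual schema:

```typescript
// Hypothetical context payload assembled from the window monitor,
// the vision pipeline, and the session timer.
interface FocusContext {
  activeWindow: string;      // e.g. "Instagram - Google Chrome"
  distracted: boolean;       // verdict from the distraction pipeline
  emotion: string;           // Rekognition emotion label, e.g. "CALM"
  minutesRemaining: number;  // time left in the current focus session
}

// Builds the text prompt sent to the Bedrock-hosted LLM.
function buildReminderPrompt(ctx: FocusContext): string {
  return [
    "You are Dubs, a cute on-screen focus companion.",
    "Write ONE short, playful sentence nudging the user back on task.",
    `Active window: ${ctx.activeWindow}`,
    `User appears: ${ctx.emotion}`,
    `Minutes remaining: ${ctx.minutesRemaining}`,
    ctx.distracted ? "The user is currently distracted." : "The user is focused.",
  ].join("\n");
}
```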

For privacy reasons, everything except the AWS Rekognition service runs locally on your own device. Aside from that Rekognition stream, no data is stored on any external servers or services.

Challenges we ran into

One of the main challenges was building a stable Electron architecture that could handle constant real-time monitoring without lag. We had to carefully design event triggers for when Dubs should react while managing the avatar's state changes across multiple data sources: webcam input, app activity, and focus timers. Getting these systems to stay in sync without spamming reactions took a lot of fine-tuning and optimization.

Integrating AWS Rekognition and Bedrock also came with hurdles. We worked hard on engineering prompts for Bedrock that would generate short, natural messages instead of repetitive and irrelevant ones. Finally, deciding when a user was truly off-task versus just glancing away required careful tuning of the detection pipeline's sensitivity, which we eventually solved by implementing a grace period and a unified distraction manager.
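The grace-period idea can be sketched as a small state machine: a distraction only "counts" once it persists past a threshold, so brief glances away never wake Dubs. The class name and the 5-second default below are assumptions for illustration, not DubFlow's actual implementation:

```typescript
// Hypothetical sketch of a unified distraction manager with a grace period.
class DistractionManager {
  private distractedSince: number | null = null;

  constructor(private graceMs: number = 5000) {}

  // Called on each monitoring tick with the raw detector verdict.
  // Returns true only once the distraction has outlasted the grace period.
  update(distracted: boolean, nowMs: number): boolean {
    if (!distracted) {
      this.distractedSince = null; // refocused: reset the clock
      return false;
    }
    if (this.distractedSince === null) this.distractedSince = nowMs;
    return nowMs - this.distractedSince >= this.graceMs;
  }
}
```

Resetting the clock whenever focus returns means every new glance away starts a fresh grace period, which is what stops a momentary look at a phone from triggering a reaction.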

Accomplishments that we're proud of

We're proud that we built a tool that we would actually use ourselves! Even as we were developing DubFlow, our team often found ourselves using Dubs' encouragement to stay positive and productive. We're also proud of maintaining an efficient team environment, which we accomplished through our clear division of roles and a unified focus.

Plus, our Dubs is super cute. (All animations/sprites 100% hand-drawn!)

What we learned

Our most useful takeaway was learning how to manage a project with many different services, as this project was one of our most ambitious in terms of the sheer number of components used.

We found that prioritizing the creation of a clear system architecture diagram early on helped immensely with the eventual integration of all these components. Having a clear diagram also allowed us to split up work more easily, since we all understood what the expected preconditions and postconditions of each component were.

We learned a lot about effective system prompting, as one of our biggest challenges was ensuring that the LLM would create the structured output we were looking for from the raw context inputs from AWS Rekognition and Electron desktop monitoring.

What's next for DubFlow

We plan on adding more purchasable/unlockable cosmetics (sunglasses, hats, custom items) through an item shop, as well as a progression system so you can engage with your buddy Dubs more. For example, you could use points earned by studying to give him treats or pet him.
