Inspiration

In today's fast-paced digital world, we take thousands of photos, but we often lose the emotional context behind them. While studying how communication changes over time and the impact of the generation gap, I realized that we are losing the human element of our memories. I wanted to build something that doesn't just store files, but preserves feelings. LifeBook was inspired by the need to bridge the gap between simple digital storage and a true emotional inheritance, creating a space where mental well-being and memory preservation meet.

What it does

I built LifeBook with a focus on cloud-based AI integration.

The AI Brain: I used Amazon Bedrock to access the Amazon Nova foundation models. Nova acts as the core engine for the emotional analysis.
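
Reaching Nova through Bedrock can be sketched roughly as follows. This is a minimal illustration, not LifeBook's actual code: the model ID, region, and inference settings are assumptions to check against the Bedrock console for your account.

```python
# Model ID is an assumption; use whichever Nova variant is enabled in your account.
MODEL_ID = "amazon.nova-lite-v1:0"

def build_converse_request(entry_text: str) -> dict:
    """Build the keyword arguments for a bedrock-runtime converse() call."""
    return {
        "modelId": MODEL_ID,
        "messages": [{"role": "user", "content": [{"text": entry_text}]}],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.3},
    }

def analyze_entry(entry_text: str) -> str:
    """Send a journal entry to Nova and return the raw text reply."""
    import boto3  # requires AWS credentials configured in the environment
    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.converse(**build_converse_request(entry_text))
    return response["output"]["message"]["content"][0]["text"]
```

Keeping the request builder separate from the network call makes the payload easy to test without AWS credentials.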

The Logic: When a user submits a journal entry, the text is passed via API to Amazon Nova, which uses custom-engineered system prompts to extract the primary and secondary emotions, calculate an intensity score, and suggest therapeutic music genres.
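
The shape of that exchange might look something like the sketch below. The prompt wording, field names, and 0-10 intensity scale are illustrative assumptions, not LifeBook's exact schema.

```python
import json

# Illustrative system prompt constraining Nova to a machine-readable reply.
SYSTEM_PROMPT = (
    "You are an emotional-analysis engine for a journaling app. "
    "Reply with ONLY a JSON object shaped like: "
    '{"primary_emotion": str, "secondary_emotion": str, '
    '"intensity": int from 0 to 10, "music_genres": [str, ...]}'
)

def parse_analysis(raw_reply: str) -> dict:
    """Validate the model's reply against the expected schema."""
    data = json.loads(raw_reply)
    required = {"primary_emotion", "secondary_emotion", "intensity", "music_genres"}
    missing = required - data.keys()
    if missing:
        raise ValueError(f"model omitted fields: {sorted(missing)}")
    if not 0 <= data["intensity"] <= 10:
        raise ValueError("intensity out of range")
    return data
```

Validating the reply on the backend catches the occasional malformed response before it reaches the UI.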

The Interface: I used a rapid frontend builder to design a clean, empathetic, mobile-first user interface that seamlessly routes data to the AWS backend.

The Vault: I designed the database schema to handle "time-capsule" logic, allowing entries to remain locked until a specific future date for the Legacy Vault feature.

Challenges we ran into

Coming from a background of solving highly deterministic, logic-based problems in C programming and linear algebra, shifting my mindset to work with generative AI was a massive learning curve. Instead of writing rigid loops and strict mathematical functions, I had to learn the art of prompt engineering. Tuning Amazon Nova to reliably output structured JSON for complex, nuanced human emotions, rather than generic text replies, took significant trial and error. Additionally, designing an architecture that securely handles sensitive personal memories required careful thought about data flow.
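
One defensive trick that tends to help with this class of problem (a sketch under my own assumptions, not LifeBook's exact code): models sometimes wrap their JSON in markdown fences or surround it with commentary, so it pays to pull out the outermost object before parsing.

```python
import json
import re

def extract_json(reply: str) -> dict:
    """Pull the outermost {...} span out of a model reply and parse it."""
    match = re.search(r"\{.*\}", reply, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model reply")
    return json.loads(match.group(0))
```

Combined with a strict system prompt, this turns most "almost JSON" replies into usable data instead of a failed request.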

Accomplishments that we're proud of

I am incredibly proud of how the "Emotion-to-Music" matching feature turned out. Seeing the AI accurately read a complex, mixed-emotion journal entry and instantly suggest a perfectly fitting musical vibe made the platform feel truly alive and therapeutic. I am also proud of successfully implementing the Amazon Bedrock API to create a frictionless connection between the frontend UI and the Nova models.
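
Since the genre suggestions come from the model, a backend still needs a graceful path when the model returns none. This fallback table is purely hypothetical; the pairings are illustrative and not LifeBook's actual matching logic.

```python
# Hypothetical static fallback for the Emotion-to-Music feature.
FALLBACK_GENRES = {
    "joy": ["funk", "pop"],
    "sadness": ["ambient", "acoustic"],
    "anger": ["metal", "punk"],
    "calm": ["lo-fi", "classical"],
}

def match_genres(analysis: dict) -> list[str]:
    """Prefer the model's suggestions; fall back to a static table."""
    if analysis.get("music_genres"):
        return analysis["music_genres"]
    return FALLBACK_GENRES.get(analysis.get("primary_emotion"), ["ambient"])
```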

What we learned

This hackathon was a crash course in modern cloud architecture. I learned how to securely provision and invoke models using Amazon Bedrock and how to structure prompts to constrain AI outputs for application backends. Beyond the code, I learned how to design a user experience that prioritizes empathy and emotional intelligence, proving that technology can be used for deep, personal well-being.

What's next for LifeBook

The next step is to fully integrate Amazon Nova Pro's multimodal capabilities so users can upload photos without any text, allowing the AI to analyze the lighting, facial expressions, and context of each image to auto-generate emotional tags. I also plan to integrate the Spotify API so users can play their AI-matched therapeutic music directly inside the app, and to expand the Legacy Vault with enhanced AWS encryption for legal documents.
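
The planned photo analysis could take a shape like the sketch below, using the Converse API's image content blocks. The model ID, prompt text, and exact payload layout are assumptions to verify against the Bedrock documentation before building on them.

```python
def build_photo_request(image_bytes: bytes, image_format: str = "jpeg") -> dict:
    """Build converse() arguments pairing a photo with an analysis prompt."""
    return {
        "modelId": "amazon.nova-pro-v1:0",  # assumed multimodal-capable model ID
        "messages": [{
            "role": "user",
            "content": [
                {"image": {"format": image_format,
                           "source": {"bytes": image_bytes}}},
                {"text": "Describe the mood of this photo and "
                         "suggest emotional tags."},
            ],
        }],
    }
```

Because the same Converse call accepts mixed text and image content, the existing journal-entry pipeline could be reused for photo-only uploads.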
