Inspiration
We built Vespera because sleep is affected by more than just whether you feel tired. A stressful day, a packed calendar, and your mood before bed can all change how easily you fall asleep and how well you stay asleep. We wanted a sleep app that feels more personal than pressing play on white noise. Instead of giving every user the same experience, Vespera tries to understand how your day went and adapts the sound it plays accordingly. The original idea was to combine mood check-ins, calendar data, sleep feedback, and eventually Fitbit data into one system that could make better sleep recommendations over time.
What it does
Vespera is a React Native sleep assistant prototype. Before sleep, the user checks in with their mood, reviews a summary of their day, and gets a recommended ambient sound based on their mood, calendar stress, and recent sleep history. The app also includes a sleep session flow, morning feedback, a settings screen, and local history tracking so recommendations can improve over time. Right now, the prototype includes mood check-ins, a calendar summary stub with stress scoring, a sound recommendation flow, fallback timed sound playback, wake feedback, 14-night history tracking, Fitbit summary stubs, and saved settings such as default sound duration and wake-up time.
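As a rough sketch of how those inputs could combine, here is one way the recommendation step might look. All names, types, and thresholds below are hypothetical; the app's actual logic isn't shown on this page.

```typescript
// Hypothetical shapes for the recommendation inputs; Vespera's real
// types and rules may differ.
type Mood = "calm" | "neutral" | "stressed";

interface NightRecord {
  sound: string;    // ambient sound played that night
  feedback: number; // morning rating, 1 (poor) to 5 (great)
}

// Pick an ambient sound from tonight's mood, the day's stress score
// (0-100), and up to 14 nights of history.
function recommendSound(
  mood: Mood,
  dayStress: number,
  history: NightRecord[]
): string {
  // Average the morning feedback for each sound seen in history.
  const totals = new Map<string, { sum: number; n: number }>();
  for (const night of history) {
    const t = totals.get(night.sound) ?? { sum: 0, n: 0 };
    t.sum += night.feedback;
    t.n += 1;
    totals.set(night.sound, t);
  }
  let best: string | null = null;
  let bestAvg = -Infinity;
  totals.forEach((t, sound) => {
    const avg = t.sum / t.n;
    if (avg > bestAvg) {
      bestAvg = avg;
      best = sound;
    }
  });
  // If a sound is clearly working, keep using it.
  if (best !== null && bestAvg >= 4) return best;
  // Otherwise fall back to mood/stress heuristics.
  if (mood === "stressed" || dayStress >= 70) return "rain";
  if (mood === "calm" && dayStress < 30) return "soft-piano";
  return "white-noise";
}
```

The useful property of this shape is that history gradually overrides the static heuristics, which is how the prototype's 14-night tracking lets recommendations improve over time.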
How we built it
We built Vespera using React Native and TypeScript. The app is organized into multiple screens, including a pre-sleep screen, sleep session screen, wake feedback screen, and settings screen, all connected through the main App.tsx flow. We used local storage to save settings, sessions, and sleep history so the prototype could persist user data between uses. On the pre-sleep screen, the app loads a calendar summary, lets the user select their mood, generates a recommended sound, and builds a sleep session object that gets saved locally. The calendar system is currently a stub, but it already models event counts, total duration, an AI stress score, and an overall day stress score to simulate how future recommendations would work once live integrations are added.
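The stub's day-stress scoring could look something like the sketch below. The field names mirror what the summary models (event count, total duration, AI stress score), but the weights and clamping are illustrative assumptions, not Vespera's actual formula.

```typescript
// Hypothetical calendar summary shape, mirroring the fields the stub
// models: event count, total duration, and an AI stress score.
interface CalendarSummary {
  eventCount: number;
  totalDurationHours: number;
  aiStressScore: number; // 0-100, from a (stubbed) AI pass over events
}

// Blend the signals into one 0-100 day stress score. The weights are
// made up for illustration.
function dayStressScore(summary: CalendarSummary): number {
  const busyness = Math.min(summary.eventCount * 10, 100);       // many events -> busier day
  const load = Math.min(summary.totalDurationHours * 12.5, 100); // 8+ booked hours -> maxed out
  const raw = 0.3 * busyness + 0.3 * load + 0.4 * summary.aiStressScore;
  return Math.round(Math.min(Math.max(raw, 0), 100));
}
```

Keeping the score behind a single function like this is what makes the stub swappable: a live calendar integration only has to produce the same summary shape.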
Challenges we ran into
The biggest challenge was that our idea depended on multiple data sources working together. Calendar-based stress analysis, adaptive recommendations, and Fitbit-based sleep detection all sound simple in a design doc, but connecting them into one consistent flow is much harder in practice. We had to decide which parts to build fully and which to stub so the prototype still demonstrated the core experience. Another challenge was turning vague questions like "how stressful was your day?" into something the app could actually score and use, which is why the current version uses placeholder calendar events and a formula-based stress score. We also had to think carefully about app flow, since this project is really several connected moments: before sleep, during sleep, waking in the night, and morning reflection.
Accomplishments that we're proud of
We're proud that Vespera became more than just a soundboard. Even as a prototype, it already connects mood input, calendar context, recommendation logic, settings, and sleep history into one app experience. We are also proud that we built a full end-to-end flow instead of only mocking the first screen. The project shows how a sleep app could become adaptive over time by learning from feedback instead of playing the same sound every night. We also like that the structure is set up well for future integrations, since the app already has clear places for calendar APIs, Fitbit data, and improved AI-based scoring to plug in later.
What we learned
We learned that building a health-related app is not just about the interface. A lot of the real work is in designing how different types of data interact and how to turn user behavior into useful recommendations. We also learned the value of prototyping in stages. By using stubs for calendar and Fitbit-related features, we were still able to test the full product idea without getting blocked by every integration at once. On the technical side, we got more experience with React Native app structure, TypeScript types, state management across screens, and local persistence for user sessions and history.
What's next for Vespera
The next step is turning the prototype into a real personalized sleep assistant. That means replacing the calendar stub with actual calendar permissions and event fetching, using AI to score event importance more intelligently, and adding real onboarding and permissions flows. After that, we would want to connect Fitbit or other wearable sleep data so sound playback can respond to actual sleep stages instead of just timers. We would also want to improve the recommendation model so it learns more accurately from past nights and gives users clearer insight into which sounds help them most.
Built With
- react-native