Inspiration
The idea for BackBuddy came to me when my friend snapped a photo of me unknowingly cosplaying The Hunchback of Notre Dame at my desk. My third eye was opened, and from that moment on I realised just how common this issue is: you can't go a minute in public without seeing someone slouched over their phone or laptop. I wanted to create a solution that a lot of people could easily integrate into their everyday lives merely by having their AirPods connected.
What it does
BackBuddy uses the gyroscope data of compatible AirPods Pro or AirPods Max to determine when users are slouching, then automatically sends haptic alerts reminding them to straighten their posture.
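At its core, the detection comes down to watching the pitch of the wearer's head: a sustained forward tilt means slouching. A minimal sketch of that logic (the threshold, smoothing factor, and type name here are illustrative, not BackBuddy's actual tuning):

```swift
import Foundation

/// Decides whether a stream of head-pitch samples (radians, from the
/// headphone gyroscope) indicates slouching.
/// `SlouchDetector` and its constants are illustrative placeholders.
struct SlouchDetector {
    /// Pitch below this (head tilted forward/down) counts as slouching.
    var slouchThreshold: Double = -0.35   // roughly -20 degrees
    /// Exponential-moving-average factor to smooth out gyro jitter.
    var smoothing: Double = 0.2
    private(set) var smoothedPitch: Double = 0

    /// Feeds one pitch sample and returns true once the smoothed
    /// pitch has crossed the slouch threshold.
    mutating func update(pitch: Double) -> Bool {
        smoothedPitch = smoothing * pitch + (1 - smoothing) * smoothedPitch
        return smoothedPitch < slouchThreshold
    }
}
```

Smoothing matters here because raw gyroscope readings are noisy; a brief glance downward shouldn't trigger an alert, but a sustained slouch should.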
How I built it
BackBuddy is a fully native iOS app written in Swift. The user interface was built with SwiftUI, and the core posture-tracking functionality is built on Apple's Core Motion framework, with SwiftData used for data persistence.
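On the Core Motion side, headphone attitude data comes from `CMHeadphoneMotionManager`, which streams motion samples from supported AirPods. A rough sketch of how such a reader might be wired up (the `PostureReader` class and its callback are my own illustrative names, not BackBuddy's API):

```swift
import CoreMotion

/// Streams head attitude from supported AirPods and reports the pitch.
/// `PostureReader` is an illustrative wrapper, not BackBuddy's actual code.
final class PostureReader {
    private let manager = CMHeadphoneMotionManager()

    /// Calls `onPitch` with the head's pitch in radians whenever a new
    /// motion sample arrives from the connected headphones.
    func start(onPitch: @escaping (Double) -> Void) {
        guard manager.isDeviceMotionAvailable else { return }
        manager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let attitude = motion?.attitude else { return }
            onPitch(attitude.pitch)
        }
    }

    func stop() {
        manager.stopDeviceMotionUpdates()
    }
}
```

The pitch values delivered here would then feed whatever slouch-detection logic the app uses before deciding to alert the user.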
Challenges I ran into
The major challenges in this project were working as a solo team and finding a way to keep posture tracking running in the background, since constant gyroscope readings can drain the battery.
- None of Apple's default background modes for apps worked for this
Accomplishments that I'm proud of
- Building out most of the core functionality for the app in the given time
What I learned
- How Apple's background modes work
- Reading motion data with Core Motion
- Live Activities with ActivityKit
- Creating widgets with WidgetKit
- Configuring notifications with UserNotifications
- Rapid prototyping with Figma
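As an example of the UserNotifications piece above, scheduling a local posture reminder looks roughly like this (the identifier, copy, and delay are placeholders, not BackBuddy's actual values):

```swift
import UserNotifications

/// Schedules a one-off local reminder a few seconds from now.
/// Identifier and wording are placeholders, not BackBuddy's real strings.
func schedulePostureReminder() {
    let center = UNUserNotificationCenter.current()
    center.requestAuthorization(options: [.alert, .sound]) { granted, _ in
        guard granted else { return }
        let content = UNMutableNotificationContent()
        content.title = "Posture check"
        content.body = "Looks like you're slouching. Sit up straight!"
        let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 5,
                                                        repeats: false)
        let request = UNNotificationRequest(identifier: "posture-reminder",
                                            content: content,
                                            trigger: trigger)
        center.add(request)
    }
}
```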
What's next for BackBuddy
- Completing posture tracking history, then releasing to the App Store
- Adding dark mode
- Adding more customisation of notification frequency and slouch-detection sensitivity
- Adding gamification of posture tracking
Built With
- activitykit
- avfoundation
- corehaptics
- coremotion
- figma
- swift
- swiftui
- usernotifications