Inspiration
Last year, I volunteered at a hospital and worked directly alongside nurses. I saw firsthand how frequently they were overburdened and constantly interrupted during their shifts. The hospital used a centralized call bell system in which patient requests (a room number and a short voice message) were routed to a central desk. This created massive bottlenecks: nurses had to physically return to the desk to get assignments, the receptionists were often busy with other tasks, and the system mixed urgent medical needs with minor complaints (like bad food or a loud neighbor). Another issue I saw the hospital face was that quite a few patients didn't speak English, which made it difficult for them to get the right care because nurses had to find a translator before treating them. I wanted to build a solution that eliminates this signal noise, gives nurses their mobility back, and ensures patients get the best care as quickly as possible.
What it does
Our project is a mobile-first web application that decentralizes the call bell system and puts it directly in the hands of the nurses. Instead of routing requests through a central desk, each nurse carries the app on their work phone, which shows a personalized, real-time queue of patient requests and makes the patient care process more efficient.
The app features a three-section interface:
- Current Task Bar: Highlights the single active task a nurse is working on.
- Request Queue: A chronological list of AI-transcribed assistance requests.
- Room/Bed Map: A visual layout of the nurse's assigned rooms.
Nurses can read AI-generated transcripts of patient voice requests, play both the original and translated audio, pin priority patients, and hit an "On the Way" button that signals to the patient's room that help is coming, reducing patient anxiety.
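As a rough illustration of the request lifecycle behind the "On the Way" button, here is a minimal TypeScript sketch. The type and function names (`PatientRequest`, `markOnTheWay`) and the status values are assumptions for illustration, not our actual implementation:

```typescript
// Hypothetical data model for a single patient request in the queue.
type RequestStatus = "queued" | "onTheWay" | "resolved";

interface PatientRequest {
  id: string;
  room: string;
  bed: string;
  summary: string;        // one-line AI summary shown on the card
  fullTranscript: string; // full text available in the expandable modal
  status: RequestStatus;
  pinned: boolean;        // pinned priority patients sort first
}

// Pressing "On the Way" promotes the request to the current-task bar
// on the nurse's screen and produces a signal payload that could be
// sent to the patient's room display.
function markOnTheWay(req: PatientRequest): { room: string; message: string } {
  req.status = "onTheWay";
  return { room: req.room, message: "A nurse is on the way." };
}
```

The one-button flow means a single tap updates both sides at once: the nurse's own queue and the patient's room.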
How we built it
We designed the application from the ground up to be mobile-first, ensuring it fits perfectly on a hospital-issued smartphone. We integrated an AI voice-to-text feature to automatically transcribe and translate unstructured audio requests from patients into concise, single-line summaries for the main screen, while keeping the full text available in an expandable modal.
We also built a bidirectional linking system between the UI components: the interactive bed map at the bottom of the screen is directly tied to the request queue. Tapping a bed on the map automatically scrolls to and highlights the corresponding notification card, and vice versa.
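The core of that bidirectional link can be sketched as two pure lookup functions over the queue; the `QueueItem` shape and function names here are illustrative assumptions, with the actual scroll/highlight handled by the UI layer:

```typescript
// Minimal model of a queue card tied to a bed on the map.
interface QueueItem {
  id: string;  // id of the notification card in the queue
  bed: string; // id of the bed on the room/bed map
}

// Tapping a bed on the map: find which queue card to scroll to
// and highlight. Returns -1 if no open request exists for that bed.
function cardIndexForBed(queue: QueueItem[], bedId: string): number {
  return queue.findIndex(item => item.bed === bedId);
}

// Tapping a card in the queue: find which bed to highlight on the map.
function bedForCard(queue: QueueItem[], cardId: string): string | undefined {
  return queue.find(item => item.id === cardId)?.bed;
}
```

Keeping the mapping logic pure like this makes it easy to drive both directions of the highlight from the same source of truth.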
Challenges we ran into
One of the biggest challenges was figuring out how to reduce UI clutter and "signal noise" without losing important patient data. If a patient makes repeated calls about the same issue, standard systems just spam the queue. We had to implement logic to group these repeated calls. Instead of adding a new card, our app finds the existing request, increments a counter (e.g., "x3"), and appends the new transcript inside the expanded view.
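The grouping logic described above can be sketched roughly as follows; the `GroupedRequest` shape and the `topic` field used as the grouping key are assumptions for illustration:

```typescript
// One card in the queue, possibly representing several repeated calls.
interface GroupedRequest {
  bed: string;
  topic: string;         // hypothetical category derived from the transcript
  count: number;         // rendered as "x3" on the card
  transcripts: string[]; // full history shown in the expanded view
}

// Instead of spamming the queue with a new card per call, find an
// existing request for the same bed and topic, bump its counter, and
// append the new transcript; otherwise add a fresh card.
function addOrGroup(
  queue: GroupedRequest[],
  bed: string,
  topic: string,
  transcript: string
): GroupedRequest[] {
  const existing = queue.find(r => r.bed === bed && r.topic === topic);
  if (existing) {
    existing.count += 1;
    existing.transcripts.push(transcript);
  } else {
    queue.push({ bed, topic, count: 1, transcripts: [transcript] });
  }
  return queue;
}
```

The key property is that repeated calls mutate one card rather than growing the queue, which keeps the signal-to-noise ratio high without discarding any patient data.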
Additionally, designing a mobile interface that comfortably fits a task bar, a scrollable queue, and an interactive map on one screen without overwhelming the user took several iterations to get right. To speed up this process, we used Google Stitch and Figma to rapidly prototype and adjust our UI before designing the final version.
Accomplishments that we're proud of
We are incredibly proud of the UI/UX design, specifically the bidirectional map-to-queue highlighting, which provides rapid spatial awareness for nurses on the move. We are equally proud of the "On the Way" feature. By allowing the nurse to press a single button that updates their own UI (moving the task to the top focus bar) while simultaneously sending a signal to the patient's room, we solve problems on both sides of the hospital bed: organizing the nurse's workflow and reducing patient uncertainty. Lastly, the transcription and translation feature is a major win. It allows nurses to read up on a patient's request while en route and immediately understand their needs, completely bypassing potential language barriers.
What we learned
We learned a lot about how cognitive load impacts high-stress environments like hospitals. We realized that simply digitizing a process isn't enough; you have to actively filter and organize the information. By turning unstructured audio into readable text and grouping repetitive notifications, we learned how intelligent UI design can directly save hours of time and reduce alarm fatigue.
What's next for Attenda Health
Our next step is to test the prototype with actual nurses to track our core metrics: hours saved, transcription accuracy, and overall patient satisfaction. We also want to expand the AI's capabilities to automatically detect the urgency of a transcript (e.g., flagging a medical need versus a request for water) to sort the queue by priority rather than just chronological order. Finally, we plan to explore hardware integrations to better connect the app directly to existing hospital bedside interfaces.