Inspiration

Our project stemmed from the fan-favourite Disney movie about a marshmallow-like individual who couldn't touch his toes: Baymax! Now how did we get to him? We thought about the gap in emergency services' ability to provide assistance and support when we need it, not only in the waiting room, but especially in areas that are harder for them to reach. We got to thinking: with how talented and helpful people are, could there be a way to connect them across a network where we could all help each other? This brought us back to Baymax, a personal healthcare companion at your service, immediately activated by sounds of distress. Baymax identifies the problem and provides further assistance if needed, offering not only immediate solutions but a deep analysis of the situation for future reference. We wanted to capture the idea of Baymax, but in the form of a community ready to come together and help individuals in need, introducing the "Biometric & AI-Driven Medical Assistance eXchange", also known as "B(AI)MAX".

What it does

Our app delivers emergency healthcare support for a person undergoing a medical emergency by notifying certified first-aid responders about the issue and requesting help. The application uses the Gemini API to prepare the medical information previously entered by the patient, so the responder arrives with the context they need to help. After a hardware trigger from a necklace or a manual trigger within the app, an emergency post is sent to the server, which notifies first-aid responders within a 5-kilometre radius so they can accept the request. The app then uses location services to guide the responder to the patient as quickly as possible. Afterwards, incident reports are summarized and recorded by the Gemini API, building a medical history of the patient's emergencies.
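As a rough illustration of that flow, here is a minimal Swift sketch of how an emergency post could be written to Firestore when the trigger fires. The collection name, field names, and EmergencyPost type are illustrative assumptions, not our exact schema.

```swift
import CoreLocation
import FirebaseFirestore

// Illustrative shape of an emergency post; field names are assumptions.
struct EmergencyPost {
    let patientId: String
    let coordinate: CLLocationCoordinate2D
    let triggeredAt: Date
}

// Write the post to a hypothetical "emergencies" collection; responders
// listening on this collection get notified and can accept the request.
func sendEmergencyPost(_ post: EmergencyPost) {
    let ref = Firestore.firestore().collection("emergencies").document()
    ref.setData([
        "patientId": post.patientId,
        "latitude": post.coordinate.latitude,
        "longitude": post.coordinate.longitude,
        "triggeredAt": Timestamp(date: post.triggeredAt),
        "status": "open" // a responder flips this to "accepted"
    ]) { error in
        if let error = error {
            print("Failed to post emergency: \(error.localizedDescription)")
        }
    }
}
```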

How we built it

Using SwiftUI and Xcode, we built a native iOS app that relies on the Firebase API for the server processes and the Gemini API for the AI assistant. For the hardware component, we combined a motion sensor, an Arduino, ESP32 chips, an LCD screen, and buttons to simulate the physical trigger. SwiftUI communicates directly with the Firebase packages to send and receive notifications, and with the Gemini API plugins to run prompts.
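For the Gemini side, a minimal sketch of the prompt flow might look like the following, assuming the google-generative-ai-swift package; the model name, prompt wording, and function are placeholders rather than our exact implementation.

```swift
import GoogleGenerativeAI

// Hedged sketch: summarize an incident report with Gemini so it can be stored
// in the patient's medical history. Model name and prompt are assumptions.
func summarizeIncident(_ report: String, apiKey: String) async throws -> String {
    let model = GenerativeModel(name: "gemini-1.5-flash", apiKey: apiKey)
    let prompt = """
    Summarize this emergency incident report for the patient's medical record, \
    noting symptoms, actions taken by the responder, and suggested follow-up:

    \(report)
    """
    let response = try await model.generateContent(prompt)
    return response.text ?? ""
}
```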

Challenges we ran into

Getting data from the Arduino R3 to the ESP32 took a lot of effort, and we then managed to push that data to Firebase. Connecting that data through to our iOS app, however, proved difficult. We were also unable to figure out a way to query the locations of responders within a given distance of the patient.
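For the proximity problem, one possible approach is a simple client-side filter using CoreLocation; this is a sketch under assumptions (the Responder type and field names are hypothetical), not something in our current build, and a production version would likely use a server-side geoquery instead.

```swift
import CoreLocation

// Hypothetical responder record; in practice this would come from Firebase.
struct Responder {
    let id: String
    let latitude: Double
    let longitude: Double
}

// Keep only responders within the given radius (default 5 km) of the patient.
func respondersNear(patient: CLLocation,
                    candidates: [Responder],
                    radiusMeters: CLLocationDistance = 5_000) -> [Responder] {
    candidates.filter { responder in
        let location = CLLocation(latitude: responder.latitude,
                                  longitude: responder.longitude)
        return location.distance(from: patient) <= radiusMeters
    }
}
```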

Accomplishments that we're proud of

On the hardware side, we were proud to see the sensors work and communicate that data all the way to Firebase. Getting the app's user interface working was also very cool, as many of us had never worked with anything like this before or programmed in Xcode or Swift.

What we learned

We learned how to combine the Firebase, Arduino, and Gemini APIs with a native iOS app.

What's next for B(AI)MAX

A key next step for B(AI)MAX is enhancing user safety and data protection as the system evolves. Future development will prioritize minimizing privacy risks by limiting location and health data sharing to active emergency situations and restricting access to verified responders only. Additional safeguards will be explored to reduce the risk of data breaches and unauthorized tracking.

On the software side, the system will expand to include a secure map-based feature that connects users to the nearest certified helpers, along with credential-based authentication to ensure only trusted individuals receive emergency alerts. An AI component will also be integrated to assist in emergencies by analyzing sensor data and communicating relevant information, such as detected falls, abnormal heart rate, or environmental hazards, to medics in real time. This AI-driven communication can help responders better assess the situation before arrival.

Hardware development will continue with the integration of additional sensors, including heart rate, air quality, and UV monitoring. These components will be consolidated onto a compact custom PCB and incorporated into a minimalistic wearable necklace design, ensuring comfort, discretion, and continuous monitoring.

Built With

Arduino, ESP32, Firebase, Gemini API, Swift, SwiftUI, Xcode
