Inspiration

The main inspiration was the beloved robot character Baymax from the movie "Big Hero 6".

What it does

The first iteration of Baemax treats cuts, bruises, and allergic reactions. It listens for key terms as the patient describes their pain, then recommends the medical supplies best suited to treat the injury. Afterward, it notes whether it was successful in treating its patient and provides a complimentary lollipop.

How I built it

Baemax interacts with its patients using Google speech-to-text and text-to-speech APIs. It listens for key words and makes decisions based on what it hears. The decisions are reflected in hardware (blinking LED lights, a servo arm, etc.) that points toward the appropriate medical treatments.
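The keyword-to-treatment step could be sketched roughly as follows. This is a hypothetical illustration, not the project's actual code: it assumes the speech-to-text API has already returned a transcript string, and the keyword lists, supply names, and hardware signal names are all invented for the example.

```python
# Illustrative mapping from injury keywords to a (supply, hardware signal) pair.
# Names here are assumptions; the real project's mappings may differ.
TREATMENTS = {
    "cut": ("bandage", "led_bandage"),           # blink the LED by the bandages
    "bruise": ("ice pack", "led_ice"),           # blink the LED by the ice packs
    "allergic": ("antihistamine", "servo_arm"),  # servo arm points to antihistamines
}

def diagnose(transcript: str):
    """Return (supply, hardware_signal) for the first known injury
    keyword in the transcript, or (None, None) if none is found."""
    text = transcript.lower()
    for keyword, action in TREATMENTS.items():
        if keyword in text:
            return action
    return (None, None)
```

For example, `diagnose("I got a cut on my finger")` returns `("bandage", "led_bandage")`, which the hardware layer would translate into blinking the corresponding LED.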

Challenges I ran into

Main challenges included: learning about APIs and making meaningful use of them in the project; integrating software with hardware; and setting up the various hardware for easy programming and use.

Accomplishments that I'm proud of

The user interface of Baemax (speaking/responding to the patient) is very responsive and friendly.

What I learned

I learned about APIs, and about integrating software and hardware to create a viable product.

What's next for Baemax

With more time and iterations, the goal is to increase the scope of injuries Baemax can treat. We would also like to integrate machine learning (computer vision) into Baemax's algorithm so that children in particular, who may have a difficult time verbalizing their injuries, could better benefit from this healthcare assistant. Improvements to Baemax's hardware, going beyond LEDs and moving arms, would also be made.
