Inspiration

In the high-stakes world of surgery, every decision carries immense weight. A moment's hesitation or a slight miscalculation can have lasting consequences for patients and their families. surgARy was born from a deep appreciation of this responsibility: a heartfelt commitment to supporting surgeons as they navigate these critical moments. Imagine a surgical environment where fear and uncertainty give way to clarity and confidence. Our AR lens turns this vision into reality, giving surgeons real-time access to patient data and 3D visualizations during procedures. With step-by-step, voice-activated instructions at their fingertips, they can focus entirely on their patients, knowing they have the best possible guidance. surgARy isn't just a tool; it's a lifeline that enhances the surgical experience, helping ensure every operation is performed with the utmost care and precision. Together, we can redefine the future of surgery, prioritizing patient safety and empowering surgeons to make a profound difference in the lives they touch.

What it does

surgARy is an AR lens, accessed through Snap Spectacles, that helps surgeons perform safer surgeries. The workflow starts with a website where the surgeon enters patient data and information. Doctors then receive a proposed set of steps for the surgery and can edit them before the procedure; these become the instructions that assist the surgeon during the operation. The lens includes a 3D model built from the patient's 2D scans, so the surgeon can better understand the issue and keep it accessible throughout the surgery for reference. They can rotate, resize, and relocate the model at any point. Alongside the model, the surgery instructions are communicated to the surgeon one at a time: using voice commands, the surgeon can say "Next" to hear the next instruction.
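The "Next" voice-command flow can be thought of as a cursor over the doctor-approved instruction list. This is a minimal sketch of that idea in Python, not our actual Lens Studio or backend code; the class and method names are illustrative:

```python
class InstructionWalkthrough:
    """Steps through surgeon-approved instructions one voice command at a time."""

    def __init__(self, steps):
        self.steps = list(steps)
        self.index = -1  # nothing shown until the first "Next"

    def on_voice_command(self, command):
        # Only "Next" advances; anything else is ignored, so stray
        # operating-room chatter cannot accidentally skip a step.
        if command.strip().lower() != "next":
            return None
        if self.index + 1 >= len(self.steps):
            return "Procedure complete."
        self.index += 1
        return self.steps[self.index]


walkthrough = InstructionWalkthrough([
    "Make a 3 cm incision at the marked site.",
    "Cauterize minor bleeders before proceeding.",
    "Close with subcuticular sutures.",
])
print(walkthrough.on_voice_command("Next"))    # first instruction
print(walkthrough.on_voice_command("resize"))  # ignored, returns None
print(walkthrough.on_voice_command("next"))    # second instruction
```

Ignoring non-matching speech instead of raising keeps the lens responsive even when the microphone picks up unrelated conversation.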

How we built it

Frontend:

Our journey with the front end began with designing in Figma, where we explored design resources from Dribbble and the Figma community to better understand industry standards and our target audience. We carefully selected our color palettes and decided on our tech stack, opting for React and JavaScript. This choice let us customize, animate, and structure the application effectively, ensuring a cohesive and readable design for our website.

We used Blender and Lens Studio to create our lens. In Blender, we built the 3D models that simulate what our users see when performing surgery on specific parts of the body. In Lens Studio, the 3D model was animated and made interactive: it can be toggled on or off, rotated, enlarged, and moved around. Function calling was implemented to fetch information about the surgical procedure and display it through the Spectacles.
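The rotate, resize, and relocate interactions ultimately come down to applying a transform to the model. Lens Studio exposes this through its Transform component; as a language-neutral illustration of the underlying math, here is a plain-Python sketch (the function name and parameters are our own, not a real Lens Studio API):

```python
import math

def transform_vertex(v, scale=1.0, yaw_deg=0.0, offset=(0.0, 0.0, 0.0)):
    """Scale, rotate about the vertical (y) axis, then translate one vertex."""
    x, y, z = (c * scale for c in v)  # resize
    a = math.radians(yaw_deg)
    # rotate within the x-z plane
    x, z = x * math.cos(a) + z * math.sin(a), -x * math.sin(a) + z * math.cos(a)
    return (x + offset[0], y + offset[1], z + offset[2])  # relocate

# A 90-degree yaw swings a point on the x axis into the z axis.
print(transform_vertex((1.0, 0.0, 0.0), yaw_deg=90.0))
```

In the lens itself these operations are driven by pinch-and-drag gestures rather than explicit numbers, but the composition order (scale, then rotate, then translate) is the same.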

Backend:

We deployed a RAG chain that brings the patient's scan images, patient history, and the doctor's inputs into context to create a plan for the surgery. Using Flask and MongoDB, we deployed the model to give step-wise instructions to the doctor on demand in the Spectacles via Snap AR. We also built an API for sign-up, login, fetching patient information, and uploading medical images and pre-surgery reports; it is used both in pre-surgery planning and during the surgery itself.
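At a high level, the RAG step retrieves the patient records most relevant to the doctor's request and packs them into the prompt used to draft the surgical plan. This is a minimal, dependency-free sketch of that retrieval step; our actual chain also handles scan images, and the keyword-overlap scoring and all names here are illustrative rather than our production code:

```python
def score(query, document):
    """Relevance = number of query words that also appear in the document."""
    return len(set(query.lower().split()) & set(document.lower().split()))

def build_surgery_prompt(doctor_request, patient_records, top_k=2):
    """Pick the top_k most relevant records and assemble the model prompt."""
    ranked = sorted(patient_records, key=lambda r: score(doctor_request, r), reverse=True)
    context = "\n".join(f"- {r}" for r in ranked[:top_k])
    return (
        "You are assisting with pre-surgery planning.\n"
        f"Relevant patient context:\n{context}\n"
        f"Doctor's request: {doctor_request}\n"
        "Produce numbered, step-wise surgical instructions."
    )

records = [
    "MRI shows a 2 cm lesion in the left knee meniscus.",
    "Patient is allergic to penicillin.",
    "Prior appendectomy in 2019, no complications.",
]
prompt = build_surgery_prompt("plan arthroscopic repair of the left knee lesion", records)
print(prompt)
```

In production, a vector store with embedding similarity would replace the keyword overlap, but the shape of the chain, retrieve then prompt, is the same.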

Challenges we ran into

Finding an idea was an early-stage challenge: we wanted to build something impactful but couldn't decide which industry or problem to focus on. Once we settled on using AR in the medical field, we ran into challenges building a multimodal model that could generate responses from both image and text data.

Learning to use Snap AR brought multiple challenges, such as mastering interactive features like moving and resizing components. We also ran into integration issues, since we were working with many different technologies and combining some of them for the first time.

Accomplishments that we're proud of

Learning to use AR was difficult, and we are proud of how quickly we overcame the obstacles that came with Lens Studio. Building a complete application in two days and integrating the frontend, backend, and Snap AR components before the deadline was another accomplishment.

What we learned

We learned to stay flexible and open-minded when working with new APIs and technologies. Adaptability became key as we encountered obstacles, pushing us to experiment with different tools and solutions to get the job done.

We also learned a lot about collaborating effectively as a team. We worked closely together, discussing solutions, sharing ideas, and syncing regularly to ensure we stayed aligned and could tackle challenges together. This project helped us grow technically and showed us the importance of teamwork and communication.

What's next for surgARy

In future implementations, we plan to let doctors contact specialists for specific surgeries and have them assist remotely, wherever they may be. We could also add services geared toward diagnosis as technologies like Med-PaLM improve, and with fast inference platforms like Groq we could provide real-time updates on a patient's well-being.
