This project is a sketch of a dashboard that surgeons could use in the operating room to access patient information without compromising the sterile field.
The main problem
In every surgery, the working area is a sterile environment, and the surgeon can't touch anything outside a very small workspace that has been prepared in advance. This means the surgeon has very little control over the environment and must continuously ask other staff for information. The clearest example is imaging (e.g. CT and MR scans). We believe a small piece of technology could give surgeons the freedom to manipulate all this data themselves, without relying so heavily on other staff.
We created a proof of concept to show the value that innovative technology could bring to a real medical environment. We used real scans to show how navigation can be done with a gesture-based interface, without any peripheral device. As we didn't have access to live data, we also built a small panel representing some patient parameters (e.g. O2, CO2, heart rate, respiratory rate, temperature, INR).
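Since no live monitor feed was available, a panel like this can be driven by a simple static model. The sketch below is a hypothetical illustration in JavaScript; the field names, labels, and values are placeholders, not real patient data.

```javascript
// Hypothetical model for the vitals panel. Values are illustrative
// placeholders standing in for a live monitor feed.
const VITALS = [
  { id: 'spo2',  label: 'O2 sat',     unit: '%',    value: 98 },
  { id: 'etco2', label: 'CO2',        unit: 'mmHg', value: 38 },
  { id: 'hr',    label: 'Heart rate', unit: 'bpm',  value: 72 },
  { id: 'rr',    label: 'Resp. rate', unit: '/min', value: 14 },
  { id: 'temp',  label: 'Temp',       unit: '°C',   value: 36.7 },
  { id: 'inr',   label: 'INR',        unit: '',     value: 1.1 },
];

// Render one parameter as the text shown in its panel cell.
function formatVital(v) {
  return v.unit ? `${v.label}: ${v.value} ${v.unit}` : `${v.label}: ${v.value}`;
}

console.log(VITALS.map(formatVital).join('\n'));
```

In a real deployment each `value` would be updated from the monitoring equipment rather than hard-coded.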
We decided to use a Myo armband for gesture recognition. We built a simple web application backed by a server that exposed the Myo's data. We loaded scan data into the website and added a very simple gesture-based system to manipulate the scans and switch between working screens.
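To illustrate how such a system might translate armband poses into dashboard actions, here is a minimal sketch in JavaScript. The pose names correspond to the Myo SDK's standard poses ('wave_in', 'wave_out', 'fist', 'fingers_spread', 'double_tap'), but the action names and the particular bindings are assumptions made for this example, not the project's actual mapping.

```javascript
// Hypothetical mapping from Myo poses to dashboard actions.
const ACTIONS = {
  wave_in:        'previous_slice', // scroll back through the scan stack
  wave_out:       'next_slice',     // scroll forward
  fist:           'zoom',           // zoom into the current image
  fingers_spread: 'reset_view',     // return to the default view
  double_tap:     'switch_screen',  // cycle between scans and the vitals panel
};

// Translate a pose event from the armband into a dashboard action.
// Unknown poses (including 'rest') are ignored, so idle muscle
// activity does nothing.
function handlePose(pose) {
  return ACTIONS[pose] ?? null;
}

console.log(handlePose('wave_out')); // → 'next_slice' under this mapping
```

In practice the pose events would arrive over the server's connection to the armband, and each returned action would drive the image viewer or screen switcher.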
Our background in the field
We are a team formed by a biotechnologist and a medical student. We love building solutions to the problems we encounter in our fields, and we believe there is a huge intersection between medical science and technology waiting to be explored.