Who we are

Team name: VR-OR
Team Lead: Mo Kakwan
Other Members: Helena Deus, Uwe Gruenefeld, Alisha Harris, Michal Leszczynski
Category: Healthcare & Medicine


Inspiration was a personal story for each of us

AH: One of my best friends is a surgeon. She shared with me that she was appalled by the poor quality of surgical procedures performed in developing countries. There is a lack of resources to buy sophisticated and highly accurate robotic surgical and simulation equipment. It must be possible to circumvent these problems in the digital age with a simpler way to provide high-level, affordable training to those highly passionate doctors.
MK: My cousin uses the da Vinci system for most of his surgical procedures. I asked him, “What kind of haptic feedback does it give?” He said, “None whatsoever.” And it is so expensive that they do not even publish price ranges on the website.
ML: At my previous company, a surgeon came to us with a vision to provide more consistent cardiac surgery training. Why? 5-10% fewer patients experience severe complications if the surgeon has frequent interaction with the procedure.
UG: The best thing is that opportunities are limitless – no need for material engineering to simulate new surgical procedures. In fact, any skill that requires delicate manual work can be honed with this approach!
HD: It is startling that the amazing advances in VR have not been nearly matched by the feedback mechanisms that could come with them. Joystick vibration was hip in the 90s when I got my first Nintendo… We need a “Google Cardboard for haptic devices”.

In summary, we are inspired to offer a solution for VR-aided surgery training that is:
Affordable – it should be the “Google Cardboard” for haptic devices
Original and Innovative – even super-expensive surgical gear does not give haptic feedback today
Morally necessary – it can save people’s lives and health through better training of surgeons
Widely applicable – it should be easy to adapt to other applications such as paintbrushes or other handheld devices
Addressing a niche – visuals have progressed at a crazy pace since the 90s; feedback mechanisms have not. It is time to change that.

What it does

We have developed a piece of hardware – a robotic arm – that provides haptic feedback and thus a much more immersive AR/VR experience, one that can help develop muscle memory and even help students and others empathize with surgeons.

How we built it

The architecture for the proposed solution includes hardware components, such as sensors and 3D-printed joints, and software components, including Unity and an input/output module that interfaces with the Arduino via Node.js.
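The input/output module's first job is turning raw serial data into joint values the rest of the pipeline can use. Below is a minimal sketch of that parsing step, assuming the Arduino firmware prints three 10-bit potentiometer readings per line as comma-separated values; the function name and line format are illustrative, not the team's exact protocol.

```javascript
// Sketch: parse one line of Arduino serial output into normalized
// joint values in [0, 1]. Assumes the firmware prints three 10-bit
// potentiometer readings per line, e.g. "512,300,1023".
function parsePotLine(line) {
  const raw = line.trim().split(",").map(Number);
  // Ignore incomplete or garbled lines (serial streams often split mid-line).
  if (raw.length !== 3 || raw.some(Number.isNaN)) return null;
  // Clamp to the 10-bit ADC range and normalize.
  return raw.map((v) => Math.min(Math.max(v, 0), 1023) / 1023);
}

// Example: a mid-range base reading plus two joint readings.
const joints = parsePotLine("512,300,1023\r\n");
```

In the actual setup, a function like this would sit inside the Node.js server's serialport data callback, with the normalized values forwarded to Unity over the websocket.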

Platforms: Arduino IDE, Blender, Unity
Development tools: MonoDevelop, Sublime Text, Atom
SDKs & APIs: None
Assets: CAD models, a copyright-free MP3, a Unity prefab sphere, a Blender model of a scalpel (all assets are on GitHub)
Libraries: Vuforia, nodejs-websocket, serialport
Github Link

Hardware: For the robotic arm, we used 3 potentiometers, 2 common wood skewers (the “bones”), and 3D-printed wheels and “sockets” that acted as the joints between the components. The “scalpel” consisted of another skewer connected to the tip of the arm by a spherical magnet. Wiring connected the potentiometers to an Arduino Uno R3, which in turn connected to a computer via USB. The robotic arm as a whole (see picture) could rotate freely around a base (up to 270°). The “scalpel” itself carries a multi-target: a paper cube with a different printed texture on each face, optimized to capture the orientation of the blade. The haptic feedback is achieved with servo motors: these are connected to the wheels at the joints and, upon feedback from the 3D virtual model, prevent the rotation of the potentiometers (holding them in place), which is felt as tension on the “scalpel” when an object in the virtual world is touched. The models for all 3D-printed objects were created in OpenSCAD.

Software: Each potentiometer generates input in the form of serial data. These values are integrated into Unity via a websocket that connects to a Node.js server listening to the input available on a USB port (see architecture). Each of the 3 inputs is transformed into a proxy for motion in 3D space. Blender was used to create a 3D model of the “scalpel”, which was imported into Unity. Unity’s physics engine was then used to represent the mesh deformation and the associated force: a sphere mesh is generated on the fly, representing the object the “scalpel” is cutting. The distance between the scalpel and the center of the sphere is used to compute a “force” value, which reflects the resistance of the interaction between the scalpel and the skin. Finally, the force is transmitted to the servo motor controlling the forward/backward potentiometer motion.
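The distance-to-force step described above can be sketched in a few lines. This is an assumed form of the calculation (a simple linear penetration model), not the exact formula used in the Unity project; the names and the 0..1 normalization are illustrative.

```javascript
// Sketch: convert scalpel-to-sphere penetration into a resistance value.
// Assumes a sphere of the given radius centered at `center`, with the
// scalpel tip at `tip` (all coordinates as [x, y, z] arrays).
function resistance(tip, center, radius) {
  const dx = tip[0] - center[0];
  const dy = tip[1] - center[1];
  const dz = tip[2] - center[2];
  const dist = Math.sqrt(dx * dx + dy * dy + dz * dz);
  // Outside the sphere: no feedback. Inside: force grows with penetration
  // depth, normalized so full penetration (tip at the center) gives 1.
  return Math.max(0, (radius - dist) / radius);
}

// As a first step (per the write-up), only the presence or absence of
// resistance is sent to the servo:
const lock = resistance([0, 0, 0.5], [0, 0, 0], 1.0) > 0; // servo engaged
```

A continuous value like this leaves room for the planned refinement beyond binary lock/unlock, e.g. proportional servo braking.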

Challenges we ran into

1) The “bones” were too heavy to be held together only by our 3D-printed joints. This was a challenge early on. Our solution was to add three wheels, one at each joint, connected to each other by a string for structural support. This solution also turned out to offer a convenient way to connect the servo motors.
2) At first we connected the wheels with rubber bands, but it turned out they were too elastic: even when a wheel is blocked by a servo motor, the arm can still move until the band stretches enough to give feedback. Substituting dental floss, which is not elastic, solved this; it can also be cut to any desired length.
3) Precisely detecting each of the 6 degrees of freedom (3 axes × 2 directions each) of our system in the 3D world. We solved this with 3 potentiometers, held together with 3D-printed pieces and stabilized by servo motors.
4) Friction from the arm itself. In an ideal world there would be zero feedback from the arm when the pen/scalpel is not touching anything (i.e. only the friction of air), which is not the case for us (friction of the wheels at the base and joints). Moreover, the “natural” feedback of the arm differs depending on its position and the direction of pen/scalpel movement. Solving/optimizing this is a next step post-hackathon.
5) We “burned” our first Arduino board by soldering to the wrong pins. The only solution was to buy a new board.
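The three-potentiometer solution amounts to forward kinematics: three joint angles determine one tip position. Here is an illustrative sketch, assuming a two-link arm (lengths L1, L2) on a rotating base; the geometry and link lengths are assumptions, not the team's measured dimensions.

```javascript
// Sketch (illustrative geometry): map the three potentiometer angles to
// the scalpel-tip position. Assumes a two-link arm of lengths L1, L2 (in
// meters) mounted on a base that rotates around the vertical axis.
function tipPosition(yawDeg, shoulderDeg, elbowDeg, L1 = 0.2, L2 = 0.2) {
  const yaw = (yawDeg * Math.PI) / 180;
  const a1 = (shoulderDeg * Math.PI) / 180;
  const a2 = a1 + (elbowDeg * Math.PI) / 180; // elbow relative to shoulder
  // Reach within the arm's vertical plane:
  const r = L1 * Math.cos(a1) + L2 * Math.cos(a2);
  const z = L1 * Math.sin(a1) + L2 * Math.sin(a2);
  // Rotate that plane around the base:
  return [r * Math.cos(yaw), r * Math.sin(yaw), z];
}

const p = tipPosition(0, 0, 0); // arm stretched flat along +x
```

With a mapping like this in Unity, each potentiometer reading moves exactly one joint angle, which is what makes three sensor values sufficient to track the tip.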

6) Input/Output: a problem we needed to solve well, given the need for very accurate tracking of the tip of the scalpel in 3D space. Errors in sensor input: the sensors kept emitting fluctuating values even when the arm was stationary, which made the virtual object vibrate when it should have been static.
7) Detection of scalpel direction: in real-life motion the scalpel does not move along one axis at a time; there is a 6-degree-of-freedom movement that we needed to capture. To do this, we implemented a multi-target mechanism. Our initial solution was a single image target, which only worked for one degree of freedom. In the next iteration, we created a paper cube and attached it to the back of the scalpel. Each face of the cube was printed with a natural pattern (e.g. pebbles), which significantly improved detection. To determine the exact rotation of the scalpel regardless of which face is being tracked, we took this one step further and printed a different pattern on each face. The faces were 60 mm in size.
8) Calculation of force: our system needed to represent how the force associated with each of the 6 degrees of freedom is transferred to the robotic arm's three servos: forward/backward motion resistance, up/down motion resistance, and left/right motion resistance. As a first step, we can only distinguish between the presence or absence of resistance.
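Two of the challenges above, jittery sensor readings and a binary lock signal, have standard mitigations worth sketching: an exponential moving average to smooth the input, and hysteresis so the servo does not chatter when the force hovers near the threshold. This is a generic sketch, not the team's implementation; all parameter values are illustrative.

```javascript
// Sketch: smooth jittery potentiometer readings with an exponential
// moving average (EMA). Smaller alpha = smoother but laggier.
function makeSmoother(alpha = 0.2) {
  let ema = null;
  return (raw) => {
    ema = ema === null ? raw : alpha * raw + (1 - alpha) * ema;
    return ema;
  };
}

// Sketch: derive the binary servo-lock signal with hysteresis, so a
// force value oscillating around one threshold cannot toggle the servo
// on every frame.
function makeLock(onAt = 0.6, offAt = 0.4) {
  let locked = false;
  return (force) => {
    if (!locked && force >= onAt) locked = true;
    else if (locked && force <= offAt) locked = false;
    return locked;
  };
}

const smooth = makeSmoother();
const noisy = [500, 512, 498, 505, 501]; // jittery but stationary sensor
const smoothed = noisy.map(smooth);
```

The EMA would run on the Node.js side before values reach Unity; the hysteresis belongs wherever the force-to-servo decision is made.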

Accomplishments that we're proud of

1) The complexity of the interaction with a robotic arm was reduced to 3 simple numbers, which could easily be mapped into virtual reality and then used again as feedback to the haptic device.
2) The elegance of the mechanism for transforming deformation of an object in Unity into a force or tension value that can be applied in the real world
3) Extraordinarily good and productive interaction between curious and passionate teammates who will likely become good friends

What we learned

1) That open source tools make it relatively straightforward to connect the real world to the virtual world - in both directions
2) Not to work with hardware when you’re tired; boards can get fried :-)
3) Unity is a great tool for coders and non-coders alike who want a jump start into VR, as it offers flexibility and power without losing intuitiveness
4) Plan, be realistic about deadlines, solve the full problem first without falling into rabbit holes and then iterate over the solution to perfect it.
5) That anything is possible with hard work, determination and when we're all aligned with one goal and vision in mind.

What's next for VR-OR

We are not going to rock the world in two days, but we can start building a bridge to something that will. The next stage has three objectives:
1) to implement and test/increase robustness of the approach, in particular the hardware component - to bring it to a point where it delivers a satisfactory experience without compromising simplicity and affordability (e.g. replacing dental floss with a chain, optimizing size of elements, improving sturdiness);
2) to increase the size of the components used in the robotic arm, use more robust connections or at least larger pieces;
3) to apply the solution to as many use cases as possible and prepare a DIY guide and videos to inspire others (e.g. school teachers) and support its growth via the open source community.

At the end of this hackathon, we were not far from a solution that could become the “Google Cardboard for Haptic Devices”, but more testing and documentation is needed so that others are able to contribute. Open-sourcing the 3D models will get other smart and curious developers involved in improving the whole system.
