The past year has had us all thinking about ways to better connect with the world around us. Digital communities are increasingly commonplace, and we need more interactive technology to connect with each other through the screen. We thought, "What if this could all feel more real?" and "What if I could feel the environment around me?"
What it does
Our GRIP Controller, or Grasp Responsive Inflatable Pouch Controller, is a revolutionary new way to feel the virtual world. Traditional controllers are rigid and static, but the GRIP Controller can inflate and deflate a small pouch right in the palm of your hand. Many of the objects we try to grasp in virtual reality are roughly cylindrical in the hand: balls, food, weapons, and so on. By inflating to a variety of pressures, we gain stiffness control and can simulate a range of objects. The pouch is also covered with a contact sensor that detects the grasp on the object, which then drives interactions in the virtual world.
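The stiffness control idea can be sketched as a simple mapping from a virtual object's stiffness to a target pouch pressure. This is a minimal illustration, not our firmware: the function name and the pressure range are assumptions.

```python
# Hypothetical sketch: map a virtual object's stiffness (0.0 = soft,
# 1.0 = rigid) to a target pouch pressure. The kPa range below is an
# assumed example, not a measured value from our hardware.

P_MIN_KPA = 5.0    # nearly deflated: soft objects (assumed value)
P_MAX_KPA = 40.0   # fully inflated: rigid objects (assumed value)

def target_pressure(stiffness: float) -> float:
    """Linearly interpolate pouch pressure from object stiffness in [0, 1]."""
    s = min(max(stiffness, 0.0), 1.0)  # clamp out-of-range inputs
    return P_MIN_KPA + s * (P_MAX_KPA - P_MIN_KPA)
```

A squishy ball might request a stiffness near 0 and a sword hilt near 1, with everything in between getting an intermediate inflation level.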
How we built it
We represent our virtual world in Unity VR and stream hand position data from MediaPipe, Google's open-source framework for on-device machine learning, which we use to estimate hand pose from a camera feed.
Through some clever tweaks to extract hand rotation data from the image as well, we are able to accurately move our hand about in 3D, captured from a single camera and streamed with Node.js.
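One way to recover orientation from landmarks alone is to build a palm coordinate frame from three of MediaPipe's 21 hand points (wrist, index MCP, pinky MCP). The landmark indices follow MediaPipe Hands, but the frame construction below is our own illustration of the idea, not a MediaPipe API:

```python
# Sketch: estimate hand orientation from three MediaPipe-style landmarks
# (wrist = index 0, index MCP = 5, pinky MCP = 17). The resulting frame
# can be converted into a Unity rotation.

import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def palm_frame(wrist, index_mcp, pinky_mcp):
    """Build an orthonormal-ish palm frame: (across, up, out-of-palm)."""
    across = normalize(tuple(a - b for a, b in zip(index_mcp, pinky_mcp)))
    up = normalize(tuple(a - b for a, b in zip(index_mcp, wrist)))
    normal = normalize(cross(across, up))  # points out of the palm
    return across, up, normal
```

Because a single camera gives only weak depth, the `z` components of the landmarks are noisy, which is part of why rotation tracking was tricky.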
To connect Unity and MediaPipe, we created a MediaPipe web frontend using React and a locally hosted Node.js server that sends hand positions to Unity over WebSockets; Unity is also connected to our hardware backend. The demo VR environment, object interactions, and physics are built entirely in Unity. Meanwhile, the GRIP Controller hardware is built around a Teensy 3.5 and streams into Unity over a serial connection.
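The serial link to the Teensy can be sketched as a tiny fixed-size frame: a start byte, a pressure command, and a checksum. The exact byte layout here is a hypothetical illustration, not our actual firmware protocol:

```python
# Hypothetical sketch of a 3-byte serial frame carrying a pouch pressure
# command (0-100%) from Unity to the Teensy. The layout and START_BYTE
# value are assumptions for illustration.

import struct

START_BYTE = 0xAA

def encode_frame(pressure_pct: int) -> bytes:
    """Pack a clamped 0-100% pressure command into a 3-byte frame."""
    p = min(max(pressure_pct, 0), 100)
    checksum = (START_BYTE + p) & 0xFF  # simple additive checksum
    return struct.pack("BBB", START_BYTE, p, checksum)

def decode_frame(frame: bytes) -> int:
    """Validate a frame and return the pressure command, or raise ValueError."""
    start, p, checksum = struct.unpack("BBB", frame)
    if start != START_BYTE or checksum != (start + p) & 0xFF:
        raise ValueError("corrupt frame")
    return p
```

A start byte and checksum make it easy for the microcontroller to resynchronize if bytes are dropped mid-stream.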
Challenges we ran into
Calculating hand rotation from a single image proved trickier than we expected. This was also our first time using Unity at all, so dealing with general game physics, collisions, and Unity scripting was a fun challenge to overcome. We had initial difficulties projecting the hand from MediaPipe coordinates into Unity coordinates, and we also had to remove the shakiness from the real-time hand and finger tracking, which caused the Unity hand to jitter. MediaPipe additionally doesn't track hand rotation well because a single camera provides little depth perception.
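The jitter removal can be sketched as simple exponential smoothing applied to each landmark. This is a minimal sketch of the idea; the class name and the `alpha` value are illustrative, not our tuned constants.

```python
# Minimal sketch: exponential moving average to damp landmark jitter.
# alpha trades responsiveness against smoothness; 0.3 is an assumed
# example value, not a tuned constant from our project.

class LandmarkSmoother:
    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self.state = None  # last smoothed (x, y, z), or None before first frame

    def update(self, point):
        """Blend the new raw point with the previous smoothed point."""
        if self.state is None:
            self.state = tuple(point)
        else:
            a = self.alpha
            self.state = tuple(a * new + (1 - a) * old
                               for new, old in zip(point, self.state))
        return self.state
```

Lower `alpha` means a steadier but laggier hand; higher `alpha` tracks fast motion but lets more jitter through.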
The project overall had many aspects that needed to be seamlessly connected together. Unity had to receive hand coordinates from the MediaPipe web interface and also be connected to the physical GRIP Controller hardware to receive and send inputs.
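The coordinate handoff between MediaPipe and Unity can be sketched as a small affine mapping: MediaPipe emits normalized image coordinates with y pointing down, while Unity's y points up. The scale and origin below are assumed values for illustration, not our calibrated numbers.

```python
# Sketch: project MediaPipe's normalized image coords (x, y in [0, 1],
# y growing downward) into a Unity-style world position. SCALE and
# ORIGIN are assumed example values.

SCALE = 2.0          # width of the tracked region in Unity units (assumed)
ORIGIN = (0.0, 1.0)  # world-space offset of the tracked region (assumed)

def to_unity(x_norm: float, y_norm: float):
    """Map normalized image coordinates to (x, y) in Unity world space."""
    ux = (x_norm - 0.5) * SCALE + ORIGIN[0]  # center horizontally
    uy = (0.5 - y_norm) * SCALE + ORIGIN[1]  # flip: image y grows downward
    return ux, uy
```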
The physical hardware suffered from a variety of issues. We had power supply problems running multiple pumps from a single wearable source, and last-minute smoke plumes coming out of the device were a bit worrisome, but we managed to get it all working in the end! In future iterations, the copper traces on the pouch should be reconfigured as capacitive sensors, and a chip with a faster clock speed would eliminate some of the lag. Finally, the hand tracking model was trained on bare hands, which is why the pouch must remain clear with minimal occlusions on the hand itself. A more traditional VR glove would not work in this case because the model is trained specifically on hand landmarks, especially the silhouettes of the knuckles. Even a regular, tight latex glove renders the MediaPipe model unusable.
Lastly, in a remote hackathon, it was difficult to work together without being able to test the hardware with the software consistently. Good teamwork and communication were essential to our success.
Accomplishments that we're proud of
We're so proud that it works despite all the moving parts! We really think that this could help people feel more immersed in virtual reality.
For a lot of us, it was our first time developing in VR, and connecting the hardware and software for development and testing was a huge task to pull off remotely.
What we learned
We learned that this was a lot to do over Spring Break, especially a hardware project built while working remotely! It was a great opportunity to learn Unity VR, Node.js and React, and the Teensy separately, and then integrate them all at the end. This entire Spring Break was filled to the brim with learning so many new things for the first time.
What's next for the GRIP Controller (Grasp Responsive Inflatable Pouch Controller)
We want to upgrade the hardware first and foremost for a shorter response time and more seamless integration. We also want to look into rigging a 3D hand model in Unity so that it responds fully to finger movements in real time, as well as adding more interactive gestures. It'd also be great to test this on an actual Oculus Rift!