Inspiration

The technological capabilities of the HoloLens, Google Cardboard, and other AR/MR/VR devices, along with the distinct lack of a virtual interface for the real world, led us to investigate how we could manipulate atoms with bits: controlling physical objects through virtual interactions.

What it does

This project links the virtual and physical worlds by letting you interact with physical objects from virtual space. The HoloLens provides real-time spatial mapping of the area and acts as the gateway: the user interacts with holograms and other virtual objects, and those interactions are sent to other devices that control real-world events, such as driving a robot. We mesh atoms and bits, allowing seamless interaction between MR/VR and the real world.

How we built it

Using Unity, we built a HoloLens app that visibly merges the virtual and physical worlds, letting the user see the bits overlaid on the atoms. We then open a network socket between the HoloLens and an Android device running a pre-made app for the robot, and use it to send commands generated from the user's input in our app.
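As a rough illustration of that input path, here is a minimal C# sketch assuming Unity's HoloLens GestureRecognizer API; the class name HologramCommandSource, the OnCommand event, and the "FORWARD" command string are our illustrative stand-ins, not the project's actual code.

```csharp
using System;
using UnityEngine;
using UnityEngine.XR.WSA.Input;

// Hedged sketch: recognize an air-tap on a hologram and turn it into a
// textual robot command. The networking layer (sketched under
// "Accomplishments" below) subscribes to OnCommand and ships it out.
public class HologramCommandSource : MonoBehaviour
{
    public static event Action<string> OnCommand;

    GestureRecognizer recognizer;

    void Start()
    {
        recognizer = new GestureRecognizer();
        recognizer.SetRecognizableGestures(GestureSettings.Tap);
        recognizer.Tapped += args =>
        {
            // Assumption: each tappable hologram maps to one command string.
            OnCommand?.Invoke("FORWARD");
        };
        recognizer.StartCapturingGestures();
    }

    void OnDestroy()
    {
        recognizer.StopCapturingGestures();
        recognizer.Dispose();
    }
}
```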

Challenges we ran into

The programmable robot kit we're using is actually an unreleased prototype. It has no Wi-Fi API, its Bluetooth uses a proprietary protocol, and the development edition of the HoloLens isn't very open yet either. As a result, we spent most of our time (18 hours) on communication between the two devices. To overcome the difficulties of interfacing with the CellRobots, we contacted one of the CellRobot developers, who created a simple API for us to send commands to the robot, something that wasn't possible before. The Gear 360 we are using for the VR component does not natively stream, requiring us to use C# to mimic the Google Street View Android application.

Accomplishments that we're proud of

By reading through a lot of open-source code and getting support from technical mentors, we figured out a way for the robots and the HoloLens to communicate. First, we deployed an app on a tablet as the user interface for controlling the physical robots. Second, we wrote a script that lets the HoloLens send requests to the tablet over HTTP. Third, the tablet receives the requests and relays them to the physical robots, triggering specific actions (the HoloLens side is sketched below).
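A minimal sketch of the HoloLens-to-tablet hop described above, assuming Unity's UnityWebRequest for the HTTP call and wiring it to the HologramCommandSource sketched earlier; the IP address, port, and /command route are placeholders, not the real tablet endpoint or CellRobot API.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Hedged sketch of step two: POST each command to an HTTP endpoint served
// by the tablet app, which relays it to the robot over its proprietary
// Bluetooth protocol (step three).
public class RobotLink : MonoBehaviour
{
    const string TabletUrl = "http://192.168.1.50:8080/command"; // placeholder

    void OnEnable()  { HologramCommandSource.OnCommand += Send; }
    void OnDisable() { HologramCommandSource.OnCommand -= Send; }

    void Send(string command)
    {
        StartCoroutine(Post(command));
    }

    IEnumerator Post(string command)
    {
        // Encode the command as a simple form field; the tablet decodes it
        // and issues the matching CellRobot action.
        WWWForm form = new WWWForm();
        form.AddField("action", command);
        using (UnityWebRequest req = UnityWebRequest.Post(TabletUrl, form))
        {
            yield return req.SendWebRequest();
            if (req.isNetworkError || req.isHttpError)
                Debug.LogWarning("Robot command failed: " + req.error);
        }
    }
}
```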

What we learned

  • We learned how to develop and deploy AR apps to the HoloLens platform
  • We learned about C# and Unity development
  • We gained experience with the CellRobot platform
  • We learned how to do networking on a HoloLens to communicate with an Android app

What's next for Mesh

  • Integrate HoloLens holograms into live-streamed video from the Gear 360 for VR interfaces like Google Cardboard, enabling shared experiences between AR and VR users through a universal coordinate system
  • As AR and mixed-reality technologies become more commonplace, being able to interact with and affect the physical world through the virtual one will be very important. We will add more sophisticated interactions with the CellRobots, including finer-grained actions and support for voice-recognition tasks on the HoloLens
