Metacast turns your existing Android device into a live, interactive hologram using Meta AR glasses. Welcome to the very cutting edge of technology!
We set out to build the first interactive live phone hologram with natural hand interaction, and our demo accomplishes just that.
You can use your finger to interact with the hologram, with haptic feedback provided by Myo for an immersive, tangible experience. Five of your recent apps float above the phone's screen; grab one and drag it onto the screen to open it on the phone. When music is playing, next/previous buttons on the sides of the screen can be touched to skip tracks.
We had these constraints and challenges to overcome: Android (and iOS) does not allow third-party developers to access the screen or to simulate touchscreen input, yet we needed to both livestream the phone's screen and let software tap the touchscreen. We also had to develop two methods of interaction: one where the user simply touches the hologram, and an ultra-accurate raycast method where the user points at the target, then makes a fist to initiate the tap.
We hacked Android to access the screen, broadcast it live, and control the touchscreen from code! We used Firebase as a realtime update mechanism for storing our data, which proved challenging because Unity has no Firebase library, so we hacked together our own Unity Firebase client to turn the vision into reality. Using Linux commands, we dumped the screen's buffer to a PNG file, which we read back, resized, and uploaded to Firebase as a Base64 string. We relayed touch coordinates and simulated taps via Linux commands that inject touchscreen input.
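The capture-upload-tap loop described above can be sketched roughly as follows. This is a minimal sketch, not our exact pipeline: it assumes an adb-connected device, a placeholder Firebase Realtime Database URL, and ImageMagick for the resize step; `screencap` and `input tap` are the standard Android shell tools for screen capture and touch injection, while the resize dimensions and database path are illustrative assumptions.

```shell
#!/bin/sh
# Sketch of the screen-broadcast + touch-injection loop,
# assuming an adb-connected Android device.

FIREBASE_URL="https://YOUR-PROJECT.firebaseio.com"  # hypothetical placeholder

# 1. Dump the screen buffer to a PNG on the device, then pull it off.
adb shell screencap -p /sdcard/screen.png
adb pull /sdcard/screen.png screen.png

# 2. Resize to keep the payload small (ImageMagick; size is illustrative).
convert screen.png -resize 480x854 screen_small.png

# 3. Base64-encode the image and PUT it to Firebase via its REST API.
PAYLOAD=$(base64 -w 0 screen_small.png)
curl -s -X PUT -d "\"$PAYLOAD\"" "$FIREBASE_URL/screen.json"

# 4. Simulate a tap at coordinates relayed back from the headset.
adb shell input tap 540 960
```

In practice this runs in a loop: each frame goes up to Firebase, the Unity client on the glasses pulls it down and decodes the Base64 string back into a texture, and any tap the user makes on the hologram comes back as coordinates for `input tap`.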
The future is now. The Metapro commercial shows the possibility of controlling a phone via Meta.