An Android background service to remotely control desktop input using voice commands via Google Assistant and orientation data from rotation sensors.
Please note that in the YouTube demo video, the mouse cursor is controlled by the phone's orientation, but our laptop screen recorder (Camtasia) didn't capture the cursor. Sorry for the confusion.
From realms beyond us
What it does
Allows users to control keyboard and mouse input by speaking commands and tilting their phone. Features browser shortcuts such as switching tabs and opening Google Drive and Gmail, as well as media controls such as fullscreen, play/pause, and volume. It works cross-platform on Windows, Linux, and macOS.
How we built it
Google Assistant/Actions SDK -> Dialogflow -> Firebase -> Android Background Service -> Python Desktop Client
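On the desktop end of this pipeline, the Python client has to turn each forwarded voice command into a keyboard action. A minimal sketch of that dispatch step (the command names, table, and `resolve` helper are ours for illustration, not the project's actual code; a real client would feed the resolved hotkey into an input library such as pyautogui):

```python
# Hypothetical command-dispatch table for the Python desktop client.
# Maps a recognized voice command to the hotkey the client would inject.
COMMANDS = {
    "next tab":   ("ctrl", "tab"),
    "new tab":    ("ctrl", "t"),
    "fullscreen": ("f11",),
    "play pause": ("space",),
}

def resolve(command: str):
    """Return the hotkey tuple for a spoken command, or None if unknown."""
    return COMMANDS.get(command.strip().lower())
```

Keeping the mapping in a plain dict also makes the "customizable voice commands" idea below straightforward: new commands are just new entries.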
Challenges we ran into
- We started out using Cordova to get magnetometer readings, then had to switch to Java because we couldn't send requests from it.
- We couldn't reliably deduce the position of the magnet from the magnetic field strength and had to use the rotation sensor instead.
- Google Assistant doesn't support local fulfillment, so we had to add an unnecessary extra hop through Firebase.
- Firebase + databases = lag.
- Phone sensors are not the most accurate and can cause jitter in mouse movements.
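Sensor jitter like this is commonly tamed with a low-pass filter on the orientation angles before they are turned into cursor deltas. A minimal sketch of an exponential moving average (class name and `alpha` default are ours, not from the project):

```python
class OrientationSmoother:
    """Exponential moving average over a rotation-sensor angle.

    alpha near 0 -> heavy smoothing (less jitter, more lag);
    alpha near 1 -> raw readings pass through almost unchanged.
    """

    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha
        self._value = None  # no reading seen yet

    def update(self, reading: float) -> float:
        """Fold one raw sensor reading into the smoothed estimate."""
        if self._value is None:
            self._value = reading  # seed with the first reading
        else:
            self._value += self.alpha * (reading - self._value)
        return self._value
```

One smoother per axis (e.g. azimuth and pitch) is enough; tuning `alpha` trades responsiveness against steadiness of the cursor.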
Accomplishments that we're proud of
- The forwarding pipeline was fast enough that control latency was negligible.
- Seamless integration of all the parts.
- Getting mouse control to work accurately without any jitter.
What we learned
Google is great to consumers, but their developer tooling is rough around the edges (Top 10 Anime Betrayals). (If a Google recruiter is reading this, please hire us to fix it!)
What's next for Desktop Buddy
Creating a full desktop application with more functionality, such as adding or customizing voice commands.
How do we feel?
Lacking fruit :)