Voice assistants pervade the mobile landscape today, improving the ease of use and accessibility of our phones, homes, and wearables. Siri Shortcuts and Google Assistant Routines brought configurable, reliable macros to mobile operating systems. Personal computers, however, have not enjoyed the same advancements. Even with voice assistants, desktop operating systems struggle to expose one program's capabilities to another. Without standardized application frameworks, it's difficult to implement a deep-linking API like Android's App Actions or iOS's Quick Actions. But what if there was another way?
What it does
HoundBar exposes app "actions" on macOS via voice commands, powered by Houndify. We compile the available menu bar items of running apps and assign each one a global voice command. Click the HoundBar menu bar icon to record, then just say what you want to do—HoundBar takes care of finding the right app for the job.
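One way to compile an app's menu bar items is through the macOS Accessibility API, scripted via System Events. The sketch below is a minimal, hypothetical illustration (not HoundBar's actual code): a helper builds the AppleScript that lists an app's top-level menu bar items, and another runs it through `osascript`. Function names and the example app are assumptions.

```kotlin
// Hypothetical sketch: build the AppleScript that asks System Events
// (the macOS Accessibility scripting target) for an app's menu bar items.
fun menuTitlesScript(appName: String): String = """
    tell application "System Events"
        tell process "$appName"
            get name of every menu bar item of menu bar 1
        end tell
    end tell
""".trimIndent()

// Executing the script requires macOS with Accessibility permission
// granted to the calling process; on other platforms this will fail.
fun runAppleScript(script: String): String =
    ProcessBuilder("osascript", "-e", script)
        .redirectErrorStream(true)
        .start()
        .inputStream.bufferedReader().readText().trim()
```

Descending further (into each menu's items) follows the same pattern, which is how a global list of voice-triggerable actions could be assembled.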
How I built it
Because Houndify's native macOS support is limited, we built the app on Houndify's Java SDK, writing it in Kotlin rather than Java, and used AppleScript to trigger system events. The AppleScript calls require macOS accessibility permissions.
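The Kotlin-to-AppleScript bridge can be as simple as shelling out to `osascript`. This is a hedged sketch of that idea, not HoundBar's implementation; the app, menu, and item names are illustrative.

```kotlin
// Hypothetical sketch: build an AppleScript command that clicks a named
// menu item in a named app, then execute it via osascript.
fun clickMenuItemScript(app: String, menu: String, item: String): String =
    """tell application "System Events" to tell process "$app" """ +
    """to click menu item "$item" of menu "$menu" of menu bar 1"""

// Requires macOS and the accessibility permission mentioned above.
// Returns osascript's exit code (0 on success).
fun trigger(app: String, menu: String, item: String): Int =
    ProcessBuilder("osascript", "-e", clickMenuItemScript(app, menu, item))
        .inheritIO()
        .start()
        .waitFor()
```

For example, `trigger("Safari", "File", "New Tab")` would ask System Events to click Safari's File ▸ New Tab menu item, the same mechanism a recognized voice command could map onto.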
Challenges I ran into
Our first implementation was going to be in Cocoa. Development with Cocoa was slow due to the lack of online help: there were very few Stack Overflow questions about the particular functions HoundBar needed, and example code for the macOS accessibility APIs was virtually non-existent. Unfortunately, it wasn't until we had implemented the more difficult functionality that we discovered Houndify's SDK does not support Cocoa; it only supports UIKit and C++. The biggest challenge was transitioning to a different SDK and rebuilding the app with little time to spare.
Accomplishments that I'm proud of
I successfully pivoted to Houndify's Java SDK and wrote a functioning app!