Inspiration
We saw that a lot of devs on Twitter were frustrated by how limited Apple's Siri is, so we set out to build an assistant that could do more.
What it does
- Capture your current screen and ask any question about it (only half-functional, since it can currently answer just from the LLM's pre-trained knowledge); a rough sketch of this flow is shown below
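As a rough sketch (not our exact implementation), here is how the screen-capture Q&A could work from an Electron main process: grab a screenshot with `desktopCapturer`, then forward it with the user's question to a vision-capable LLM. The endpoint, model name, and environment variables below are placeholders.

```typescript
import { desktopCapturer } from "electron";

// Capture the primary display and ask an LLM a question about it.
// LLM_API_URL / LLM_API_KEY and the model name are illustrative placeholders.
async function askAboutScreen(question: string): Promise<string> {
  const sources = await desktopCapturer.getSources({
    types: ["screen"],
    thumbnailSize: { width: 1920, height: 1080 },
  });
  const screenshot = sources[0].thumbnail.toDataURL(); // base64 PNG data URL

  // Send the screenshot plus the question to a vision-capable chat endpoint.
  const res = await fetch(process.env.LLM_API_URL!, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.LLM_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // placeholder model name
      messages: [
        {
          role: "user",
          content: [
            { type: "text", text: question },
            { type: "image_url", image_url: { url: screenshot } },
          ],
        },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```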
What it was intended to do
- Access live activities such as currently playing media, Bluetooth devices, and messaging services
- Use Apple's built-in system capabilities, such as creating calendar entries or setting timers, the way Siri does today (the app targets macOS); a rough sketch of the calendar piece follows this list
- Use previous context and saved memories to make more informed decisions and produce better output
- Use a web search API to gather more data for better-informed answers
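To illustrate the calendar integration we had in mind, one way an Electron app could create a macOS Calendar event is by shelling out to AppleScript via `osascript`. This is a hypothetical sketch: the calendar name "Home", the one-hour duration, and the unescaped string interpolation are simplifying assumptions.

```typescript
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const run = promisify(execFile);

// Create a one-hour Calendar event on macOS via AppleScript.
// `when` must be a date string AppleScript can parse, e.g. "June 1, 2025 3:00 PM".
// A real implementation would escape/sanitize the interpolated values.
async function addCalendarEvent(title: string, when: string): Promise<void> {
  const script = `
    tell application "Calendar"
      tell calendar "Home"
        set startDate to date "${when}"
        set endDate to startDate + (60 * minutes)
        make new event with properties {summary:"${title}", start date:startDate, end date:endDate}
      end tell
    end tell`;
  await run("osascript", ["-e", script]);
}
```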
How we built it
We pivoted back and forth between native Swift and Electron + React + TypeScript, and ultimately built on Electron. A rough skeleton of that setup is sketched below.
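For reference, a generic Electron + TypeScript entry point hosting a React renderer looks roughly like this; the window options and dev-server URL are assumptions for illustration, not our actual configuration.

```typescript
import { app, BrowserWindow } from "electron";

// Open a small always-on-top window that hosts the React renderer.
function createWindow(): void {
  const win = new BrowserWindow({
    width: 480,
    height: 320,
    alwaysOnTop: true, // assistant-style overlay; actual options may differ
    webPreferences: { contextIsolation: true },
  });
  // In development the React app is typically served by a bundler dev server.
  win.loadURL(process.env.DEV_SERVER_URL ?? "http://localhost:5173");
}

app.whenReady().then(createWindow);
```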
Challenges we ran into
Electron was a new concept to us; we're most adept with Next.js, so building components was where we moved fastest. However, we fragmented as a team because our ideas kept pivoting away from each other, and working with MCP was difficult in general.
Accomplishments that we're proud of
Finishing a feature end to end: the screen-capture question answering.
What we learned
Electron, and more about the DOM.