💡 Inspiration: Our neighbour Adam's daily struggles with poor vision inspired us to build this app with a strong focus on accessibility. Around 3 million people in the Netherlands alone live with such impairments, nearly 17% of the Dutch population, which underlines the scale of the problem.

🔍 What it does: An accessible banking assistant designed for users with low vision, dyslexia, or other accessibility needs. It combines voice input, text-to-speech output, camera-based receipt scanning, and a RAG pipeline over bunq's help documentation, so users can navigate their banking app, understand transactions, and take actions entirely through natural conversation. Its centrepiece is the UI highlighting feature: a "hand-holding experience" that draws glowing boxes around the on-screen elements the assistant is talking about.

🛠️ How we built it: The frontend is a Vanilla JS state machine; the backend runs a recursive tool-use loop with Claude 3.5 Sonnet. We vectorized 68+ markdown help docs into a local ChromaDB instance, and used an AWS + Localtunnel setup to serve the app over HTTPS, which the microphone-dependent APIs (ElevenLabs and Whisper) require.

⚠️ Challenges we faced: Syncing asynchronous audio playback with DOM highlighting, forcing an LLM to accurately target hardcoded HTML IDs, and, obviously, debugging.

📚 What we learned: Accessibility should be a core design principle, not just a checkbox, and far more people are affected by it than most developers realize.
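The recursive tool-use loop on the backend can be sketched roughly as below. This is a minimal illustration, not our exact code: the `TOOLS` registry and the `highlight_element` tool are hypothetical stand-ins, and in the real app `client` would be an `anthropic.Anthropic()` instance.

```python
# Hypothetical tool registry: maps tool names to Python callables.
# In our app, one such tool tells the frontend which HTML id to highlight.
TOOLS = {
    "highlight_element": lambda element_id: f"highlighted #{element_id}",
}

def run_agent(client, messages, tools, max_turns=10):
    """Call the model repeatedly, executing any tools it requests,
    until it returns a plain-text answer (or gives up)."""
    for _ in range(max_turns):
        response = client.messages.create(
            model="claude-3-5-sonnet-20241022",  # assumption: model id
            max_tokens=1024,
            tools=tools,
            messages=messages,
        )
        if response.stop_reason != "tool_use":
            # Model produced a final answer; stop recursing.
            return response.content[0].text
        # Execute every tool call the model requested and feed the
        # results back as a user turn, then loop again.
        tool_results = []
        for block in response.content:
            if block.type == "tool_use":
                result = TOOLS[block.name](**block.input)
                tool_results.append({
                    "type": "tool_result",
                    "tool_use_id": block.id,
                    "content": result,
                })
        messages.append({"role": "assistant", "content": response.content})
        messages.append({"role": "user", "content": tool_results})
    raise RuntimeError("agent did not converge within max_turns")
```

The key design point is that the loop only exits when `stop_reason` is no longer `tool_use`, so the model can chain several tool calls (look up a doc, then highlight a button) before answering.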
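The shape of the RAG step over the help docs looks roughly like this. A sketch only: the real pipeline embeds the chunks into ChromaDB and ranks by vector similarity, while here a toy word-overlap scorer stands in so the example is self-contained; function names and the chunk size are illustrative.

```python
def chunk_docs(docs, size=400):
    """Split each markdown doc into roughly `size`-character chunks."""
    chunks = []
    for name, text in docs.items():
        for i in range(0, len(text), size):
            chunks.append((name, text[i:i + size]))
    return chunks

def retrieve(question, chunks, k=3):
    """Return the k chunks sharing the most words with the question
    (stand-in for ChromaDB's embedding-based nearest-neighbour query)."""
    q = set(question.lower().split())
    scored = sorted(chunks,
                    key=lambda c: -len(q & set(c[1].lower().split())))
    return scored[:k]

def build_prompt(question, hits):
    """Assemble the retrieved chunks into the context the model sees."""
    context = "\n---\n".join(f"[{name}] {text}" for name, text in hits)
    return f"Answer using only these help docs:\n{context}\n\nQ: {question}"
```

The flow is the same either way: chunk the 68+ docs once, retrieve the few most relevant chunks per question, and hand only those to the model.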
Built With
- amazon-web-services
- claude
- fastapi
- python
- react