Inspiration

A family member lost their vision two years ago. Overnight, every financial tool they relied on became unusable — insurance portals, banking apps, investment dashboards — all built for people who can see screens. They had three insurance policies, SSDI benefits, and an ABLE account, but couldn't independently manage any of it without sighted assistance.
We asked: what if your finances could talk to you?
42 million Americans have a disability. The financial tools that exist today weren't built for them. FIN|ABLE was born from the belief that financial intelligence should be accessible to everyone — not just people who can read fine print.
What it does

FIN|ABLE is a voice-first AI financial advisor built specifically for people with disabilities. It consolidates insurance claims, disability benefits, income monitoring, and daily expenses into a single conversational interface powered by a multi-agent AI system called SAGE.
A blind user can:
- Open the app and hear: "Welcome to FIN|ABLE. Press 1 for blind mode."
- Press Space and ask: "What insurance do I have?"
- Hear SAGE explain all three Chubb policies in plain language — no jargon, no PDFs
- Say "I fell and hurt myself" — SAGE identifies the right policy and exact coverage amounts
- Say "Yes" — SAGE generates a professional claim letter and sends it to Chubb's claims department

Three voice commands. Eyes closed. Claim letter filed.
The app also handles denied claims (explains why, what to do, and the appeal deadline), tracks income against SSDI benefit thresholds (SGA limits), and monitors spending across disability-specific categories like medical transport and personal care assistants.
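The SGA tracking described above amounts to a threshold comparison against countable monthly earnings. Here is a minimal sketch: the $2,700/mo blind SGA limit is the figure cited in this write-up, while the 90% "warning band" is our own illustrative assumption, not a rule from the app.

```typescript
// Sketch of an SGA threshold check. BLIND_SGA_LIMIT comes from the
// write-up ($2,700/mo for blind individuals); the 90% warning band
// below is an illustrative assumption.
const BLIND_SGA_LIMIT = 2700; // monthly countable earnings, USD

type SgaStatus = "safe" | "warning" | "over";

function checkSga(monthlyEarnings: number, limit = BLIND_SGA_LIMIT): SgaStatus {
  if (monthlyEarnings > limit) return "over";          // benefits at risk
  if (monthlyEarnings > limit * 0.9) return "warning"; // within 10% of the limit
  return "safe";
}
```

A warning state is what would trigger a proactive spoken alert before earnings actually cross the line.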
How we built it

Architecture — 5 Specialized AI Agents:
- SAGE (Superior Advisory & Guidance Engine) — the orchestrator that receives all queries, delegates to domain agents, resolves conflicts using a disability-aware priority stack, and generates a unified spoken response
- SHIELD — insurance intelligence (Chubb integration) — parses policies, tracks claims, drafts appeal letters
- PULSE — payroll & benefits guardian — monitors income against SGA thresholds ($2,700/mo for blind individuals) to prevent catastrophic benefit loss
- ORACLE — portfolio & ABLE account optimizer — tax-advantaged strategies calibrated to disability-related income volatility
- COMPASS — expense tracker — categorizes disability-specific costs (medical transport, DME, PCA services) and flags missed tax deductions

Conflict Resolution: When agents disagree, SAGE uses a priority stack: Benefit Preservation > Insurance Coverage > Essential Expenses > Emergency Fund > Debt Management > Wealth Building. A blind person losing SSDI is a crisis — SAGE never lets an investment recommendation jeopardize benefits.
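The priority stack can be sketched as an ordered list of categories, with conflicts resolved by keeping the recommendation highest in the stack. The category order mirrors this write-up; the data shapes and function names are our own assumptions, not SAGE's actual interface.

```typescript
// Hypothetical sketch of priority-stack conflict resolution.
// Category order mirrors the write-up; the types are assumptions.
const PRIORITY = [
  "benefit-preservation",
  "insurance-coverage",
  "essential-expenses",
  "emergency-fund",
  "debt-management",
  "wealth-building",
] as const;

type Category = (typeof PRIORITY)[number];

interface Recommendation {
  agent: string;       // e.g. "PULSE", "ORACLE"
  category: Category;  // where it sits in the stack
  advice: string;
}

// When recommendations conflict, keep the one highest in the stack.
function resolve(conflicting: Recommendation[]): Recommendation {
  return [...conflicting].sort(
    (a, b) => PRIORITY.indexOf(a.category) - PRIORITY.indexOf(b.category)
  )[0];
}
```

Under this scheme, a PULSE warning tagged benefit-preservation always beats an ORACLE suggestion tagged wealth-building, which is exactly the "never jeopardize benefits" guarantee described above.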
Tech Stack:
- Frontend: Next.js 16, TypeScript, Tailwind CSS — cinematic glassmorphic UI with a canvas-rendered animated orb
- Voice Output: Deepgram Aura-2 TTS (Iris voice — cheerful, feminine, approachable) with ElevenLabs and Edge TTS as fallbacks
- Voice Input: Web Speech API (browser-native, zero-cost real-time STT)
- AI Reasoning: Featherless AI (OpenAI-compatible API, Qwen 2.5 72B for claim letter generation)
- Claim Letters: AI-generated formal insurance correspondence via the Featherless API
- Accessibility: Three vision modes (blind, low vision, sighted), WCAG AA compliance, full keyboard navigation, ARIA labels, high contrast mode, screen reader support

Challenges we ran into

The Stale Closure Problem: React's useCallback and useEffect capture state values at creation time. Our spacebar handler would read isSpeaking = false even while audio was playing, because the state had changed but the closure hadn't. We solved this with useRef mirrors — every critical state value has a corresponding ref that stays in sync, and event handlers read from refs instead of state.
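The ref-mirror fix can be shown without React at all: a handler that closes over a plain value keeps seeing whatever it captured, while one that reads through a mutable ref object always sees the latest state. This is a framework-free simulation of the pattern; in the app the ref is a React useRef kept in sync with useState.

```typescript
// Framework-free illustration of the stale-closure problem and the
// ref-mirror fix. In the app, `isSpeakingRef` is a React useRef
// mirrored from state; here it is a plain object so the sketch runs anywhere.
const isSpeakingRef = { current: false };

// A handler created once, like a keydown listener registered in useEffect.
function makeHandler(isSpeakingAtCreation: boolean) {
  return {
    staleRead: () => isSpeakingAtCreation,   // closure captures the old value
    freshRead: () => isSpeakingRef.current,  // ref read is always current
  };
}

const handler = makeHandler(isSpeakingRef.current); // created while silent
isSpeakingRef.current = true;                       // audio starts playing
```

After the state flips, staleRead still reports the value from creation time while freshRead reflects reality, which is why the spacebar listener reads refs, not state.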
TTS Provider Cascade: ElevenLabs ran out of free credits mid-development. We built a 3-tier fallback system (Deepgram → ElevenLabs → Edge TTS) so the voice never fails. The first API call was always slow due to cold starts, so we added a warm-up fetch on page load.
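The 3-tier cascade is a try-in-order loop over providers. The provider names match this write-up; the speak() signature and return shape are assumptions for the sketch.

```typescript
// Sketch of a 3-tier TTS fallback (Deepgram -> ElevenLabs -> Edge TTS).
// Provider names match the write-up; the interface is an assumption.
type TtsProvider = {
  name: string;
  speak: (text: string) => Promise<ArrayBuffer>;
};

async function speakWithFallback(
  providers: TtsProvider[],
  text: string
): Promise<{ provider: string; audio: ArrayBuffer }> {
  let lastError: unknown;
  for (const p of providers) {
    try {
      return { provider: p.name, audio: await p.speak(text) };
    } catch (err) {
      lastError = err; // provider down or out of credits: try the next tier
    }
  }
  throw new Error(`All TTS providers failed: ${String(lastError)}`);
}
```

The warm-up fetch on page load simply calls the first tier once with a short string so the first real response doesn't pay the cold-start cost.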
SSR Hydration Mismatches: Math.random() for particle positions and new Date() for letter timestamps produced different values on server vs client, causing React hydration errors. We moved all non-deterministic computations into useEffect (client-only) to ensure server and client renders match.
Voice + Text Synchronization: If text appeared before audio, it felt broken. If audio loaded before text, there was an awkward gap. We synchronized both — the response text and voice appear together, and pressing Space during playback immediately cuts the audio and starts listening for the next question.
Accomplishments that we're proud of

- The entire app works with your eyes closed. That's not a feature. That's the product.
- Three voice commands take a user from "I had an accident" to a professionally drafted claim letter sent to Chubb — no forms, no portals, no fine print.
- The priority stack — SAGE understands that for a disabled person, losing benefits is worse than missing an investment opportunity. This isn't generic financial advice; it's disability-aware financial intelligence.
- Zero-screen onboarding — a blind user hears instructions, presses one key, and is immediately in a fully accessible voice conversation. No signup. No tutorial. No friction.
- A cinematic UI that looks premium — animated canvas orb, glassmorphic panels, holographic message animations — proving that accessibility and beautiful design aren't mutually exclusive.

What we learned

- Accessibility is a design philosophy, not a checklist. Adding ARIA labels after the fact doesn't work. We built voice-first from day one, and the visual UI was layered on top — not the other way around.
- Refs over state for event handlers. React closures are a trap for real-time applications. Any value read inside a keydown listener needs to be a ref.
- TTS quality matters enormously. The difference between a robotic voice and a warm, natural one is the difference between a gimmick and a product people would actually trust with their finances.
- Disability finance is complex. SGA limits, IRWE deductions, ABLE account rules, trial work periods — the regulatory landscape for disabled workers is a minefield. AI can navigate it; humans shouldn't have to alone.
What's next for FIN|ABLE

- Real document parsing — upload an insurance PDF and have SHIELD explain it in plain language, flag coverage gaps, and draft appeals for denied claims
- Live SGA calculator — connect to payroll data (ADP integration) for real-time earnings tracking against benefit thresholds, with proactive alerts
- ABLE account automation — ORACLE optimizing contributions across ABLE, Roth IRA, and taxable accounts based on disability-specific tax rules
- Multi-language voice — Deepgram Aura-2 supports Spanish, French, German, Italian, Japanese, and Dutch, making FIN|ABLE accessible across languages
- Mobile app — React Native with haptic feedback for deafblind users
- Partnerships — working with disability advocacy organizations and insurance carriers to bring FIN|ABLE to the 42 million Americans who need it
Built With
- adp
- chubb
- deepgram
- elevenlabs
- featherless-ai
- next.js
- react
- supabase
- tailwind-css
- typescript
- vercel
- web-speech-api