Inspiration

Accessibility is often treated as an afterthought in web development, leading to fragmented tools that either check code or design, but rarely both. We were inspired by the idea that inclusivity should be seamless—developers, designers, and product managers shouldn’t have to juggle multiple tools to ensure compliance. With the release of Google Gemini 3’s multimodal reasoning, we saw an opportunity to unify these perspectives into a single, powerful audit.

What it does

Echo‑Audit is an AI‑powered accessibility auditing tool that makes inclusivity effortless. Unlike traditional checkers that scan only code or only visuals, it uses Google Gemini 3’s multimodal reasoning to analyze both simultaneously: it sees UI issues like poor contrast, spacing, and visual hierarchy, while it reads semantic HTML, ARIA labels, and keyboard navigation—all in a single pass. The result: instant, actionable WCAG compliance reports for developers, designers, QA engineers, and product managers.

How we built it

• PRD-driven: Used a product requirements document to guide development in Google AI Studio.
• Frontend: TypeScript web app for rapid prototyping and an intuitive UI.
• Core Engine: Integrated the Gemini 3 API, leveraging its multimodal capabilities to “see” UI issues and “read” source code simultaneously.
• Deployment: Source on GitHub, hosted on Vercel for scalability and quick iteration.
• Workflow:

  1. User uploads a video and code snippet.
  2. Gemini 3 applies its multimodal reasoning to analyze the video and the code together in a single pass.
  3. Echo-Audit generates a compliance report highlighting WCAG violations with suggested fixes.
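The workflow above hinges on pairing the uploaded recording with the code snippet in one multimodal request. A minimal sketch of how that pairing might be assembled, in the app’s own TypeScript — the names (`buildAuditRequest`, `AuditPart`) are ours for illustration, and the real app hands the resulting payload to the Gemini API client:

```typescript
// Hypothetical shape of one multimodal request part: either an uploaded
// file reference (the UI recording) or plain text (the prompt plus code).
type AuditPart =
  | { fileData: { fileUri: string; mimeType: string } }
  | { text: string };

// Pair the uploaded UI recording with the source snippet so the model
// can cross-reference visuals and code in a single pass.
function buildAuditRequest(videoUri: string, codeSnippet: string): AuditPart[] {
  return [
    { fileData: { fileUri: videoUri, mimeType: "video/mp4" } },
    {
      text:
        "Audit this UI recording and the source below for WCAG 2.1 AA issues. " +
        "Report each violation with its criterion number and a suggested fix.\n\n" +
        "```html\n" + codeSnippet + "\n```",
    },
  ];
}
```

The key design point is that both modalities travel in one `contents` array, so the model can ground a visual finding (e.g. low-contrast text in the video) in the exact markup that produced it.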

Challenges we ran into

• Multimodal integration: Getting Gemini 3 to process both video and code in a single pass required careful orchestration.
• WCAG complexity: The guidelines are extensive; mapping them into clear, actionable checks was a balancing act.
• Designing for diverse users: Developers want technical detail, while product managers prefer executive summaries. We had to tailor outputs for multiple audiences without overwhelming anyone.
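To show what “mapping guidelines into checks” means concretely, here is one of the simplest cases: WCAG success criterion 1.4.3 requires a contrast ratio of at least 4.5:1 for normal-size text. This sketch implements the standard WCAG relative-luminance formula; the function names are ours, not from any particular library:

```typescript
type RGB = [number, number, number];

// Relative luminance per the WCAG 2.x definition (sRGB channels 0-255).
function luminance([r, g, b]: RGB): number {
  const chan = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * chan(r) + 0.7152 * chan(g) + 0.0722 * chan(b);
}

// Contrast ratio ranges from 1:1 (identical colors) to 21:1 (black on white).
function contrastRatio(fg: RGB, bg: RGB): number {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// AA threshold for normal-size text (1.4.3); large text would use 3:1.
const passesAA = (fg: RGB, bg: RGB) => contrastRatio(fg, bg) >= 4.5;
```

Dozens of criteria needed this kind of translation — from prose in the spec to a deterministic check or a prompt instruction — which is where the balancing act came in.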

Accomplishments that we're proud of

Successfully built a working prototype that integrates Gemini 3’s multimodal reasoning into a live web app.

Delivered real-time accessibility audits that combine visual and code analysis in one unified workflow.

Created actionable WCAG compliance reports that are both developer-friendly and executive-ready.

Demonstrated that accessibility can be embedded from the start, not patched in later.

Showcased how Gemini 3 can be leveraged for social impact, ensuring digital experiences are inclusive by default.

What we learned

• How multimodal AI can bridge the gap between visual analysis and semantic code validation.
• The nuances of the WCAG guidelines, especially around contrast ratios, ARIA labels, and keyboard navigation.
• The importance of delivering insights in a way that is actionable and developer-friendly, not just diagnostic.
• That accessibility is not just technical; it’s cultural. Building inclusive tools requires empathy as much as engineering.
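As a toy illustration of the “semantic code validation” half of that first lesson: one of the most common WCAG failures (criterion 1.1.1) is an `<img>` with no text alternative. The regex scan below is a deliberate simplification of our own, not how a production audit works — a real audit parses the DOM rather than pattern-matching markup:

```typescript
// Flag <img> tags that carry no alt attribute at all. Naive by design:
// it ignores aria-label/aria-labelledby fallbacks and will not handle
// attributes containing ">" — a DOM parser is the right tool in practice.
function findImagesMissingAlt(html: string): string[] {
  const imgTags = html.match(/<img\b[^>]*>/gi) ?? [];
  return imgTags.filter((tag) => !/\balt\s*=/i.test(tag));
}
```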

What's next for Echo-Audit

Expanded Accessibility Coverage: Extend audits to cover advanced WCAG criteria, including multimedia captions, dynamic content, and mobile responsiveness.

Continuous Monitoring: Move from one‑time audits to real‑time monitoring, so teams can catch accessibility regressions during development and deployment.

Developer Integrations: Build plugins for VS Code, GitHub Actions, and CI/CD pipelines, embedding accessibility checks directly into the workflow.

Design Tool Extensions: Integrate with Figma and Adobe XD, allowing designers to validate accessibility before handoff.

Customizable Reports: Offer tailored outputs—technical detail for developers, executive summaries for product managers, and visual dashboards for designers.

Built With
