Inspiration
Every developer has lived this moment: you're deep in a problem, you need one specific answer, and suddenly you're ten tabs deep in Confluence, outdated READMEs, and Stack Overflow threads that may or may not apply to your version. The information exists somewhere in your docs. It's just buried. We kept asking: why can't documentation just talk back? That frustration became Nova DevDocs.
What it does
Nova DevDocs is a real-time voice assistant for developers. You press a mic button, ask your question out loud — "How do I set up OAuth2 in my Node.js app?" — and Nova answers you in speech, instantly, pulling the answer directly from your team's own documentation. It supports any knowledge base: GitHub READMEs, Notion pages, Confluence exports, PDFs, or plain text files.
How we built it
We built a four-layer architecture entirely on AWS Bedrock. Amazon Nova 2 Sonic sits at the voice layer, capturing audio input, understanding intent in real time, and synthesizing the spoken response back to the developer. Amazon Nova 2 Lite powers the agentic reasoning layer: it runs a tool-use loop, calling our search tool multiple times if needed, retrieving relevant document chunks, and synthesizing a precise answer before handing it back to Sonic.
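The reasoning layer's tool-use loop can be sketched as below. This is an illustrative skeleton, not our production code: `searchDocs`, `modelStep`, and `answerQuestion` are hypothetical names, and the model call is stubbed so the loop structure runs on its own (a real implementation would call Nova 2 Lite through Bedrock and parse its response for either a tool call or a final answer).

```javascript
// Hypothetical search tool: returns document chunks matching a query.
function searchDocs(query, corpus) {
  return corpus.filter((chunk) => chunk.toLowerCase().includes(query.toLowerCase()));
}

// Stub standing in for a Nova 2 Lite call. It either asks for another
// search or, once it has retrieved chunks, emits a final answer.
function modelStep(question, gathered) {
  if (gathered.length === 0) {
    return { action: "search", query: "OAuth2" }; // model picks a query
  }
  return { action: "answer", text: `Based on ${gathered.length} chunk(s): ${gathered[0]}` };
}

// The loop: call the model, execute its tool calls, feed results back,
// and stop when the model emits a final answer or a step budget runs out.
function answerQuestion(question, corpus, maxSteps = 5) {
  const gathered = [];
  for (let i = 0; i < maxSteps; i++) {
    const step = modelStep(question, gathered);
    if (step.action === "search") {
      gathered.push(...searchDocs(step.query, corpus));
    } else {
      return step.text;
    }
  }
  return "Sorry, I couldn't find that in the docs.";
}

const corpus = [
  "OAuth2 setup: install passport and configure the client ID.",
  "Deployment guide: push to main to trigger CI.",
];
console.log(answerQuestion("How do I set up OAuth2?", corpus));
```

The step budget matters in practice: without it, a model that keeps choosing to search can stall the voice session indefinitely.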
Challenges we ran into
- Latency. Chaining two Nova models in sequence adds up fast, and voice users notice every extra millisecond.
- Prompt engineering for voice. Answers that read well on screen often sound terrible spoken aloud: too many bullet points, markdown syntax being read literally, responses that are too long.
- Audio quality across devices. Microphone input varies wildly across browsers and hardware; normalizing audio before sending it to Nova 2 Sonic was trickier than expected.
Accomplishments that we're proud of
We're proud that the core loop actually works: you speak, Nova thinks, Nova answers, all in under two seconds. Getting Nova 2 Sonic and Nova 2 Lite to work in tandem as a coherent pipeline felt like a genuine breakthrough. We're also proud of how universally useful the knowledge base connector turned out to be: being able to drop in any markdown or text file and have it immediately searchable via voice is something any engineering team could use on day one.
What we learned
We learned how transformative real speech-to-speech AI is when latency is low enough to feel conversational — Nova 2 Sonic makes voice feel natural in a way that previous TTS/STT pipelines never quite achieved. We learned that agentic tool use with Nova 2 Lite is genuinely powerful: a model that decides what to search, how many times, and how to synthesize across sources produces dramatically better answers than simple single-query retrieval.
What's next for Nova DevDocs
The immediate next step is deeper AWS integration: connecting Nova DevDocs directly to AWS CodeCommit, S3 buckets, and Confluence via native connectors, so teams can point it at their entire documentation infrastructure without manual file uploads. We also want to add Nova multimodal embeddings, replacing BM25 with semantic search for dramatically more accurate retrieval.
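For context, the BM25 ranking that currently powers retrieval can be sketched as follows. This is a minimal, self-contained illustration (the function names and tokenizer are our own simplifications, not the production scorer): each document chunk is scored against the query using term frequency, inverse document frequency, and a length penalty.

```javascript
// Lowercase word tokenizer (simplified; real tokenization is richer).
function tokenize(text) {
  return text.toLowerCase().match(/[a-z0-9]+/g) || [];
}

// Score every document against the query with the BM25 formula.
// k1 controls term-frequency saturation; b controls length normalization.
function bm25Scores(query, docs, k1 = 1.5, b = 0.75) {
  const docTokens = docs.map(tokenize);
  const avgLen = docTokens.reduce((a, t) => a + t.length, 0) / docs.length;
  const N = docs.length;
  return docTokens.map((tokens) => {
    let score = 0;
    for (const term of new Set(tokenize(query))) {
      const df = docTokens.filter((t) => t.includes(term)).length;
      if (df === 0) continue; // term appears nowhere in the corpus
      const idf = Math.log(1 + (N - df + 0.5) / (df + 0.5));
      const tf = tokens.filter((t) => t === term).length;
      score += (idf * tf * (k1 + 1)) / (tf + k1 * (1 - b + (b * tokens.length) / avgLen));
    }
    return score;
  });
}
```

BM25's weakness, and the motivation for moving to embeddings, is visible here: it only matches exact terms, so "login setup" never retrieves a chunk that talks about "OAuth2 configuration".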
Built With
- amazon-nova-2-lite
- amazon-nova-2-sonic
- amazon-web-services
- api
- bedrock
- bm25
- css3
- express.js
- html5
- javascript
- node.js
- rest
- speech
- web