Inspiration

We have all done user interviews in the past and felt overwhelmed with the amount of unstructured information. A seamless integration from data sources to mind maps allows product managers to summarise information and visually see trends about their product.

What it does

Ingests user interview transcriptions and outputs a multi-level mind map.

How we built it

Frontend: Next.js (React) + MermaidJS
Backend: Python + Claude + OpenAI Ada
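To give a flavour of the frontend/backend handoff: the backend produces MermaidJS `mindmap` syntax that the Next.js frontend renders. Below is a minimal, hypothetical sketch (function and data names are illustrative, not our actual code) of turning a nested dict of interview themes into that syntax.

```python
def themes_to_mermaid(themes: dict, indent: int = 1) -> str:
    """Recursively render nested interview themes as MermaidJS
    mindmap body lines (two spaces of indentation per level)."""
    lines = []
    for topic, subtopics in themes.items():
        lines.append("  " * indent + topic)
        if isinstance(subtopics, dict) and subtopics:
            lines.append(themes_to_mermaid(subtopics, indent + 1))
    return "\n".join(lines)

def build_mindmap(root: str, themes: dict) -> str:
    # A Mermaid mindmap starts with the `mindmap` keyword,
    # then a root node, then indentation-nested children.
    return "mindmap\n  " + root + "\n" + themes_to_mermaid(themes, indent=2)

# Illustrative themes that an LLM pass over transcripts might extract.
themes = {
    "Onboarding": {"Confusing signup": {}, "Loved the tutorial": {}},
    "Pricing": {"Too expensive for students": {}},
}
print(build_mindmap("Product feedback", themes))
```

The string this prints can be dropped straight into a Mermaid renderer on the client.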

Challenges we ran into

Due to the 1 request/second rate limit, we could not pursue our original idea of chunking the transcriptions and combining the insights, BUT Claude impressed us all with how accurately it makes observations over a large context length.
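For context, the chunk-and-combine approach we abandoned could be sketched like this (hypothetical helpers, not our shipped code): split each transcript on paragraph boundaries, then pace the per-chunk LLM calls to stay under the 1 request/second limit.

```python
import time

def chunk_transcript(text: str, max_chars: int = 4000) -> list[str]:
    """Split a transcript into chunks on paragraph boundaries,
    keeping each chunk at or under max_chars."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = current + "\n\n" + para if current else para
    if current:
        chunks.append(current)
    return chunks

def summarise_chunks(chunks: list[str], summarise, delay: float = 1.0) -> list:
    """Call `summarise` (any LLM wrapper) on each chunk, sleeping
    between calls to respect a 1 request/second rate limit."""
    insights = []
    for chunk in chunks:
        insights.append(summarise(chunk))
        time.sleep(delay)
    return insights
```

In practice, Claude's large context made this machinery unnecessary: we could pass the whole transcript in a single request.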

Accomplishments that we're proud of

Finishing the product and actually doing user interviews about the hackathon helped us build a really compelling demo!

What we learned

Techniques commonly used with other LLMs aren't super effective with Claude. Its ability to capture all the information in a large corpus truly cut our build time by a lot!

What's next for ClaudeMap

Support more input formats and improve the mind-mapping capabilities. We would love to make the graph more interactive (perhaps embedding video snippets when the input is video).
