Inspiration

Rewind to the beginning of this hackathon. Our group was stuck brainstorming a wide variety of ideas, and like any young team, we spent so much time talking that we couldn't keep track of them all. But what if all we had to do was speak and have our ideas written down? That's when we thought of Jabber AI. Taking inspiration from sticky-note applications and conversational AI, we built a project that listens to your ideas, summarizes them, and jots them down to keep you organized. All you need to do is hit start and talk!

What it does

Meet your personal assistant Mindy, who is integrated into Jabber AI to help you brainstorm ideas for your next revolutionary project. Mindy helps you talk through your ideas, generates new possibilities, and encourages you when you're stuck. As you speak, GPT-4o processes your spoken ideas into digestible note cards and displays them in a bento-box layout in your workspace. The workspace is interactive: you can delete note cards and start or stop conversations with Mindy while keeping your workspace untouched. Using Hume's Speech Prosody model, Jabber AI analyzes the expression in the user's voice and emphasizes the notes the user sounds excited about.

How we built it

Frontend:

When the summarized notes arrive, we display each one in its own note card, using different background colors to emphasize unique qualities. We also implemented features such as deleting sticky notes, so the user can keep only the notes that matter to them. On the right sidebar, the message history lets the user look back on responses they might have missed or retrace where their thought process went. We also made the note cards fit together with minimal gaps, giving the user a larger view of their notes.
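The delete behavior above boils down to a small piece of state logic. A minimal sketch in TypeScript (the `NoteCard` shape and function names here are illustrative, not our actual component code):

```typescript
// Illustrative shape for a workspace note card.
interface NoteCard {
  id: number;
  text: string;
  highlighted: boolean; // true when the voice analysis flags excitement
}

// Remove one card by id, leaving the rest of the workspace untouched.
// In React this would feed a state setter, e.g. setCards(deleteCard(cards, id)).
function deleteCard(cards: NoteCard[], id: number): NoteCard[] {
  return cards.filter((card) => card.id !== id);
}
```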

Backend:

We designed our voice assistant Mindy using Hume AI's EVI configuration and prompt engineering. We gave Mindy personality traits like patience and kindness, along with specific goals such as helping the user find project inspiration. We passed each user message from the voice conversation to the OpenAI GPT-4o model with prompt instructions to process the voice transcript and organize detailed, hierarchical notes. These notes were then fed to the frontend and placed in note cards. We also used Hume's Speech Prosody model to analyze expression scores for interest, excitement, and surprise; when any of them was excessive (>0.7), we rendered that idea on a special yellow note card with a pulsing effect.

What's next for Jabber AI

We'd like to add the ability to link different note cards, effectively turning the workspace into a mind map so the user's train of thought can be seen. Since the AI can recommend different directions to explore within a topic, the user could select a specific note card, speak about related topics, and have the newly produced note cards automatically linked to the selected one.
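One way the linking idea could be modeled is a parent reference on each card, so new cards attach to whichever card was selected when they were spoken. This is a hypothetical sketch of a feature we haven't built, not existing code:

```typescript
// Hypothetical shape for a linkable note card: parentId forms the mind map.
interface LinkedCard {
  id: number;
  text: string;
  parentId: number | null; // null for top-level ideas
}

// Append a new card; if a card is selected, the new one links back to it.
function linkNewCard(
  cards: LinkedCard[],
  text: string,
  selectedId: number | null
): LinkedCard[] {
  const nextId = cards.length > 0 ? Math.max(...cards.map((c) => c.id)) + 1 : 1;
  return [...cards, { id: nextId, text, parentId: selectedId }];
}
```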

Who we are

Kevin Zhu: rising sophomore at MIT studying CS and Math

Garman Xu: rising sophomore at NYU interested in intersections between technology and music

Chris Franco: rising sophomore at MIT studying CS

Built With

  • hume-ai
  • nextjs
  • openai
  • react
  • tailwind