About the Project: JobJam

Inspiration

As AI-based interview prep tools become more common, I realized that many of them lacked something critical: real human connection. While these apps can simulate interview questions and offer advice, they don’t capture the spontaneity and social pressure of an actual conversation with another person.

At the same time, I noticed that many teens and young adults don't have access to the professional networks or mentorship opportunities needed to practice for real interviews. Communication and presentation skills are essential, but practicing them in isolation isn’t always effective.

I asked myself: what if we could build a purposeful version of Omegle? A platform where people instantly connect based on shared career interests and practice interviewing together. That idea became JobJam.

What I Learned

This was my first time building a full app from scratch. I had no prior experience with networked applications, API calls, or even backend infrastructure. Through this process, I taught myself:

  • The basics of Swift and SwiftUI
  • How to integrate APIs and manage asynchronous networking
  • Real-time database handling with Firebase
  • Securely handling tokens and video connections using Agora
  • Embedding AI and machine learning models into user flows

More than anything, I learned what it takes to go from idea to execution with a full-stack app.

How I Built It

  1. Anonymous Authentication
    Users sign in anonymously using Firebase Authentication. This ensures that no personal information is required, lowering the barrier to entry.
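The sign-in step above is a single call to the FirebaseAuth SDK. A minimal sketch (error handling and UI wiring omitted; the print statements are placeholders):

```swift
import FirebaseAuth

// Sign the user in anonymously; no email, password, or profile is collected.
func signInAnonymously() {
    Auth.auth().signInAnonymously { result, error in
        if let error = error {
            print("Anonymous sign-in failed: \(error.localizedDescription)")
            return
        }
        // The generated uid identifies this session in Firestore
        // without tying it to any personal information.
        print("Signed in with uid: \(result?.user.uid ?? "unknown")")
    }
}
```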

  2. Job Description Input and AI Embedding
    Users enter a short job description. I use the Hugging Face model intfloat/e5-large-v2 to generate an embedding for this text and store it in Firestore.
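One way to fetch that embedding from a Swift client is a plain URLSession POST to the Hugging Face Inference API's feature-extraction pipeline. This is a hedged sketch, not necessarily how JobJam does it: the endpoint path, payload shape, and the `fetchEmbedding` helper are assumptions, though the `query:` prefix is a documented convention of the e5 model family.

```swift
import Foundation

// Hypothetical call to the Hugging Face Inference API for intfloat/e5-large-v2.
func fetchEmbedding(for jobDescription: String,
                    apiToken: String,
                    completion: @escaping ([Double]?) -> Void) {
    let url = URL(string:
        "https://api-inference.huggingface.co/pipeline/feature-extraction/intfloat/e5-large-v2")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiToken)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    // e5 models expect a "query: " (or "passage: ") prefix on input text.
    let body = ["inputs": "query: \(jobDescription)"]
    request.httpBody = try? JSONSerialization.data(withJSONObject: body)

    URLSession.shared.dataTask(with: request) { data, _, _ in
        guard let data = data,
              let vector = try? JSONDecoder().decode([Double].self, from: data) else {
            completion(nil)
            return
        }
        completion(vector)  // embedding vector, then written to Firestore
    }.resume()
}
```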

  3. Matching Algorithm
    I use cosine similarity to compare embeddings across users in the Firestore database:

$$ \text{similarity}(A, B) = \frac{A \cdot B}{\lVert A \rVert \, \lVert B \rVert} $$

Once the similarity between two users exceeds a set threshold, a match is created and both users are assigned to a shared video room.
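The formula above translates directly into a small pure function. A sketch in Swift (the function name is mine; JobJam's actual implementation may differ):

```swift
import Foundation

// Cosine similarity between two embedding vectors, as used for matching.
// Returns a value in [-1, 1]; identical directions score 1.
func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    precondition(a.count == b.count, "Embeddings must have the same dimension")
    let dot = zip(a, b).map(*).reduce(0, +)
    let normA = (a.map { $0 * $0 }.reduce(0, +)).squareRoot()
    let normB = (b.map { $0 * $0 }.reduce(0, +)).squareRoot()
    guard normA > 0, normB > 0 else { return 0 }  // avoid division by zero
    return dot / (normA * normB)
}
```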

  4. Video Chat with Agora
    Using Agora’s video SDK and a token server deployed via Render, matched users are brought into a live video chat. Users can mute their audio, turn off their video, or leave the call at any time.
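Joining the shared room boils down to a few calls on Agora's iOS SDK. A minimal sketch, assuming the app ID and channel name are supplied elsewhere and the token has already been fetched from the Render-hosted token server:

```swift
import AgoraRtcKit

// Sketch of a thin wrapper around the Agora engine for a matched video room.
final class VideoCallManager: NSObject, AgoraRtcEngineDelegate {
    private var engine: AgoraRtcEngineKit!

    func joinRoom(appId: String, token: String, channel: String) {
        engine = AgoraRtcEngineKit.sharedEngine(withAppId: appId, delegate: self)
        engine.enableVideo()
        // token is issued per-channel by the token server deployed on Render
        engine.joinChannel(byToken: token, channelId: channel, info: nil, uid: 0)
    }

    // In-call controls mentioned above: mute, camera off, leave.
    func setMuted(_ muted: Bool) { engine.muteLocalAudioStream(muted) }
    func setVideoOff(_ off: Bool) { engine.muteLocalVideoStream(off) }
    func leave() { engine.leaveChannel(nil) }
}
```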

  5. Chatbot Assistant
    During the video session, users can also open "JobBot" — a chatbot powered by Gemini’s API — to receive resume advice, job tips, or practice questions.
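A JobBot request can be sketched with Google's Generative AI Swift SDK. The model name and prompt framing here are assumptions for illustration:

```swift
import GoogleGenerativeAI

// Hedged sketch of a single JobBot turn via the Gemini API.
func askJobBot(_ question: String, apiKey: String) async -> String {
    let model = GenerativeModel(name: "gemini-pro", apiKey: apiKey)
    // System-style framing so answers stay on interview prep.
    let prompt = "You are JobBot, an interview-prep assistant. \(question)"
    do {
        let response = try await model.generateContent(prompt)
        return response.text ?? "No response"
    } catch {
        return "JobBot error: \(error.localizedDescription)"
    }
}
```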

Challenges Faced

Building JobJam came with many technical challenges:

  • Handling asynchronous API calls while keeping the app responsive
  • Syncing user states across devices (e.g., muting or disabling video)
  • Preventing race conditions when two users attempt to match at the same time
  • Structuring my Firestore database to support real-time updates
  • Managing Agora token generation and ensuring secure room access
  • Dealing with edge cases like users quitting mid-call or losing connection
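The matching race condition in particular is the kind of problem Firestore transactions exist for: both clients try to claim the same candidate, and only one write should win. A hedged sketch, with collection and field names invented for illustration:

```swift
import FirebaseFirestore

// Claim a match atomically so two users cannot both "win" the same candidate.
func claimMatch(db: Firestore, myId: String, candidateId: String) {
    let candidateRef = db.collection("waitingUsers").document(candidateId)
    db.runTransaction({ transaction, errorPointer in
        guard let snapshot = try? transaction.getDocument(candidateRef),
              snapshot.get("matchedWith") == nil else {
            return nil  // candidate already claimed; caller should retry another user
        }
        transaction.updateData(["matchedWith": myId], forDocument: candidateRef)
        return candidateId
    }) { result, error in
        if let matched = result as? String {
            print("Matched with \(matched)")  // proceed to create the video room
        }
    }
}
```

Because the read and the write happen inside one transaction, Firestore retries the block if the document changed underneath it, so at most one claimant succeeds.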

Even simple features, like showing that another user has their video turned off, turned out to be more complex than expected due to sync issues between devices.

Final Thoughts

Through this project, I gained hands-on experience with app development, machine learning integration, and real-time communication systems. More importantly, I built a product that supports a meaningful cause.

JobJam addresses United Nations Sustainable Development Goal 8: Decent Work and Economic Growth by helping young users:

  • Practice interview skills
  • Build professional confidence
  • Connect with like-minded peers
  • Prepare for jobs in an accessible, anonymous, and inclusive environment

Built With

  • agora
  • firebase
  • firestore
  • gemini-api
  • hugging-face
  • intfloat/e5-large-v2
  • render
  • swift
  • swiftui
  • xcode