Check out our website: buymemaybe.us

Inspiration

We’re college students. In less than a month, we have to pack our entire lives into cardboard boxes, and we need to get rid of a LOT of "junk."

We couldn’t sell anything on eBay or Facebook Marketplace… so we made an app where our stuff does the selling for us. With AI videos taking over our feeds (talking fruits/singing dolphins/walking statues), we thought it would be cool to bring our old items to life and have them take over!

What it does

For Sellers: You take a photo of the item you want to sell, and add a few details. In turn, 'Buy Me Maybe' creates a short video where your item pitches itself to potential buyers.

For Buyers: Instead of reading through long descriptions, you scroll through a "For You" feed of talking objects that tell you everything you need to know before buying.

How we built it

BuyMeMaybe is built with Next.js 16 (App Router), React 19, TypeScript, and Tailwind CSS 4. The backend uses Next.js route handlers for all API endpoints (analyze, generate, feed, likes, items) and Prisma with PostgreSQL to manage listings and short-lived generation jobs. The AI runs on xAI Grok via the OpenAI-compatible client pointed at api.x.ai, handling vision, copywriting, and scripting. Square video clips are generated through Grok Imagine's video API, then post-processed with FFmpeg (fluent-ffmpeg with bundled binaries). In production, the app deploys on Vercel with Vercel Blob for uploads and MP4 storage (since serverless has no writable filesystem) and hosted Postgres via Supabase. The front end is a mobile-style snap-scroll feed that uses the HTML5 video element to autoplay the active card.

GitHub: buymemaybe
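The vision-plus-copywriting step could be sketched roughly like this. Everything below is an assumption for illustration: the model id, the prompt wording, and the use of plain fetch (the app itself uses the OpenAI SDK client pointed at api.x.ai; fetch just keeps the sketch dependency-free).

```typescript
// Hypothetical sketch of the "analyze" step: send the item photo plus seller
// details to xAI's OpenAI-compatible chat endpoint and get back a pitch script.

// Pure helper: turn seller input into the scripting prompt (wording is ours).
export function buildPitchPrompt(name: string, details: string): string {
  return [
    `You are the item "${name}", pitching yourself to buyers in a short video.`,
    `Seller notes: ${details}`,
    `Write a first-person script of at most 40 words.`,
  ].join("\n");
}

// Calls the OpenAI-compatible chat completions endpoint with an image part.
export async function scriptForItem(
  name: string,
  details: string,
  photoUrl: string,
): Promise<string | null> {
  const res = await fetch("https://api.x.ai/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.XAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "grok-2-vision", // assumed model id
      messages: [
        {
          role: "user",
          content: [
            { type: "text", text: buildPitchPrompt(name, details) },
            { type: "image_url", image_url: { url: photoUrl } },
          ],
        },
      ],
    }),
  });
  const data = await res.json();
  return data.choices?.[0]?.message?.content ?? null;
}
```

The returned script can then be fed to the video-generation step and, eventually, FFmpeg post-processing.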

Challenges we ran into

  1. Generating 'good' AI videos was the hardest part. Most image-to-video tools struggle with sync: you usually end up with mouth movements that don't match the words and vocal pacing that feels awkward.

  2. AI doesn’t know where a mug would have a face. We had to iterate constantly on how the AI maps "life" onto inanimate objects so that the product’s actual appearance stayed fully intact while the item still came "alive."

  3. Shortening render time was another challenge. At first, video generation took upwards of 5 minutes. We iterated on our image-to-video pipeline until a listing generated in under a minute, keeping it user-friendly for both sellers and buyers.
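Since a clip still takes tens of seconds to render, the client has to wait on the short-lived generation job somehow. A generic polling helper like the one below is one way to do it; the route name, status values, and timings here are our illustration, not necessarily the app's actual code.

```typescript
// Hypothetical client-side polling for a short-lived generation job.
export type JobStatus = "pending" | "processing" | "done" | "failed";

const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

// Polls `check` until the job finishes or the time budget runs out.
export async function pollJob(
  check: () => Promise<JobStatus>,
  { intervalMs = 2000, timeoutMs = 60_000 }: { intervalMs?: number; timeoutMs?: number } = {},
): Promise<JobStatus> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const status = await check();
    if (status === "done" || status === "failed") return status;
    await sleep(intervalMs);
  }
  return "failed"; // past the budget: treat as failed so the UI can retry
}

// A real caller might poll an assumed GET /api/generate/:jobId route:
//   const status = await pollJob(async () => {
//     const res = await fetch(`/api/generate/${jobId}`);
//     return (await res.json()).status as JobStatus;
//   });
```

The 60-second default budget mirrors the under-a-minute target above.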

Accomplishments that we're proud of

~ The speed! We optimized the app so that image-to-video generation happens in under 60 seconds. Compared with our first iterations, listing generation now runs in a quarter of the time.
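The writeup doesn't pin down which change bought the 4x speedup; one generic way to claw back wall-clock time in a pipeline like this is overlapping independent stages instead of running them back to back. The sketch below is purely illustrative, with stand-in delays instead of real API calls.

```typescript
// Illustrative only: run independent pipeline stages concurrently with
// Promise.all instead of awaiting them one after another.
const delay = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

async function describeItem(): Promise<string> {
  await delay(30); // stands in for the vision call
  return "a blue ceramic mug";
}

async function writeCaption(): Promise<string> {
  await delay(30); // stands in for an independent copywriting call
  return "Buy me, maybe?";
}

// Sequential awaits would pay ~60ms of stand-in latency; overlapped, ~30ms.
export async function preparePitch(): Promise<{ desc: string; caption: string }> {
  const [desc, caption] = await Promise.all([describeItem(), writeCaption()]);
  return { desc, caption };
}
```

The same idea applies to any stages that don't feed into each other, e.g. uploading the original photo while the script is being written.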

~ We had other students at HackPrinceton test our app, and the feedback was great! Hackers loved the UI and said they would definitely use it to hunt for second-hand items.

What we learned

We learned that in a marketplace, attention span is a huge factor when it comes to hitting that "buy now" button. The short-form content we see on social media every day has rewired our brains to crave instant gratification, which inspired the foundational principle of this scrolling marketplace app. On the technical side, we learned how to bridge the gap between high-powered AI models and a user-friendly mobile experience. Happy buyers mean happy sellers, and vice versa!

What's next for Buy Me Maybe

In-App Haggling - We want to add a direct negotiation feature where buyers can haggle with sellers to land on the perfect price without leaving the app.

Local Items - We hope to add geographic markers to prioritize items within your specific area, making shipping/item pickup easy.

App Store Deployment - In the future, we will continue working on the UI/UX with the ultimate goal of launching on the App Store so people everywhere can use 'Buy Me Maybe.'

Built With

next.js · react · typescript · tailwind-css · prisma · postgresql · grok (xAI) · ffmpeg · vercel · supabase
