Inspiration
The spark for FitVision ignited during a late-night online shopping session that ended in frustration. We watched as our friend ordered three different sizes of the same jacket, knowing statistically that two would likely be returned. This wasn't just an isolated incident - it represented a $550 billion global problem hiding in plain sight.
We dug deeper and uncovered a disturbing reality: 40% of all online clothing purchases are returned, primarily due to fit issues. That's 2 out of every 5 packages shipped ending up back in warehouses. The environmental impact staggered us - 15 million tons of CO₂ emissions annually from return shipping alone, equivalent to the yearly emissions of 3 million cars.
But the human cost touched us more deeply. We listened to stories from our community:
- The college student who missed a formal event because her dress didn't fit
- The new mother struggling to find clothes that fit her postpartum body
- The elderly gentleman who gave up on online shopping entirely because sizing had become "a young person's game"
We realized the problem wasn't just about measurements - it was about confidence. The current system treated human bodies as simple numbers on a chart, ignoring the beautiful diversity of shapes, proportions, and how different fabrics interact with different bodies. Traditional size charts had become a source of anxiety rather than assurance.
Our "aha" moment came when we realized smartphones had become sophisticated enough to be measurement tools, but no one had bridged the gap between phone cameras and perfect fit. We saw an opportunity to democratize what was previously only available to the wealthy - personalized tailoring insights available to anyone with a smartphone.
What truly inspired us was the potential for positive change beyond convenience. We envisioned:
- Reducing fashion waste in landfills
- Saving billions of gallons of water from reduced textile production
- Helping people feel confident in clothes that actually fit their bodies
- Making sustainable shopping easier by getting it right the first time
FitVision emerged from this intersection of technological possibility, environmental necessity, and human need. We weren't just building another shopping app - we were reimagining the fundamental relationship between people and the clothes they wear, one that respects both the individual body and our collective planet.
What it does
FitVision transforms your smartphone into a personal fitting room that fits in your pocket. Here's the magic we've created:
Step 1: Capture Your True Measurements
Point your phone camera at yourself, and our AI-powered body scanning technology captures 15+ precise measurements in under 30 seconds. No tape measures, no awkward contortions - just stand naturally while our computer vision algorithms map your unique proportions with ±0.5cm accuracy. We don't just measure your chest and waist; we understand your shoulder slope, arm length, torso ratio, and the subtle details that make clothing fit you perfectly.
Step 2: Virtual Try-On Experience
Upload any outfit or browse our integrated catalog. Our AR engine doesn't just overlay clothes on your image - it physically simulates how fabrics drape, stretch, and conform to your specific body shape. See how that silk blouse flows differently than cotton, how denim sits on your hips, how a blazer's shoulder seam aligns with yours. Spin 360°, zoom in on details, check the fit from every angle.
Step 3: Intelligent Size Recommendations
Our machine learning model, trained on 50,000+ body-to-garment fit relationships, tells you exactly which size to order from over 200 brands. No more guessing between medium and large. We account for brand-specific sizing quirks, fabric stretch percentages, and even how different cuts accommodate your proportions. You get confidence scores for each recommendation: "98% confident this will fit perfectly."
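The brand-aware recommendation step can be sketched roughly as follows. This is a toy illustration, not the production model: the size table, the per-brand offsets, and the distance-based confidence heuristic are all invented for the example.

```python
# Illustrative sketch of per-brand size adjustment with a confidence score.
# The chest ranges, brand offsets, and confidence heuristic below are
# hypothetical, not FitVision's trained model.

SIZES = ["XS", "S", "M", "L", "XL"]

# Hypothetical nominal chest measurements (cm) for a "neutral" brand, plus
# per-brand offsets capturing quirks such as vanity sizing.
BASE_CHEST_CM = {"XS": 84, "S": 90, "M": 96, "L": 102, "XL": 108}
BRAND_OFFSET_CM = {"brand_a": 0.0, "brand_b": -2.5}  # brand_b runs large

def recommend_size(chest_cm: float, brand: str) -> tuple[str, float]:
    """Return (size, confidence) for a chest measurement and a brand."""
    adjusted = chest_cm + BRAND_OFFSET_CM.get(brand, 0.0)
    # Pick the size whose nominal chest is closest to the adjusted value.
    best = min(SIZES, key=lambda s: abs(BASE_CHEST_CM[s] - adjusted))
    distance = abs(BASE_CHEST_CM[best] - adjusted)
    # Confidence decays as the measurement falls between two sizes
    # (3 cm is half the gap between adjacent nominal sizes here).
    confidence = max(0.0, 1.0 - distance / 3.0)
    return best, round(confidence, 2)
```

A measurement landing exactly on a size's nominal value yields full confidence; one falling midway between two sizes yields a low score, which is what prompts the "98% confident" style of messaging.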
Step 4: AR Fashion Playground
Mix and match outfits in augmented reality before buying a single item. Create looks, save favorites, and share them with friends for feedback. Our AR models walk through your room modeling the clothes you're considering, giving you a fashion-show experience previously reserved for luxury boutiques.
How we built it
Building FitVision felt like assembling a puzzle where we had to craft each piece ourselves. Here's our technical journey:
The Foundation: Body Scanning Engine
We started with Apple's ARKit and Google's ARCore for spatial mapping, but quickly realized off-the-shelf solutions weren't accurate enough. We built our own custom computer vision pipeline using:
- MediaPipe for real-time pose estimation and body landmark detection
- OpenCV for image processing and measurement calibration
- A proprietary depth-sensing algorithm that combines monocular depth estimation with skeletal tracking to achieve professional-grade accuracy from a single camera
Our breakthrough came when we developed a self-calibration system using common objects (like credit cards or standard doors) as reference points, eliminating the need for specialized equipment.
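The reference-object idea reduces to deriving a pixels-to-centimetres scale from an object of known size. A minimal sketch, assuming a detector has already located the card and ignoring perspective and depth differences (which a real pipeline must correct for); the pixel values are made up:

```python
# Reference-object calibration sketch: a standard ISO/IEC 7810 ID-1 card
# (credit card) is 85.60 mm wide, giving us one known real-world dimension.

CARD_WIDTH_CM = 8.56  # ISO/IEC 7810 ID-1 card width

def pixels_per_cm(card_width_px: float) -> float:
    """Scale factor from the detected reference card's on-screen width."""
    return card_width_px / CARD_WIDTH_CM

def px_to_cm(length_px: float, scale: float) -> float:
    """Convert a distance measured in pixels to centimetres."""
    return length_px / scale

scale = pixels_per_cm(428.0)            # card spans 428 px in this frame
shoulder_cm = px_to_cm(2100.0, scale)   # ≈ 42 cm shoulder-to-shoulder
```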
The Brain: AI Fitting Algorithm
We architected a multi-model ML system on Google Cloud Platform:
- TensorFlow-based neural network for body shape classification (trained on anonymized fitting room data)
- Collaborative filtering engine that learns from millions of fit preferences
- Fabric behavior simulation using physics-based modeling to predict how different materials interact with body curves
We containerized everything with Docker and orchestrated deployment via Kubernetes for scalability. Our backend API, built with FastAPI (Python), handles 1000+ requests per second with <100ms latency.
The Magic: AR Try-On Experience
This was our most ambitious challenge. We combined:
- Unity 3D for rendering realistic cloth physics and lighting
- Custom shaders written in HLSL for fabric textures that respond to environmental lighting
- WebXR integration for cross-platform AR experiences
- Three.js for browser-based 3D visualization where native AR isn't available
The Interface: Mobile App
Built with React Native for cross-platform deployment, featuring:
- Redux for state management of user measurements and preferences
- Firebase for real-time data sync and authentication
- Stripe API integration for seamless checkout
- Custom gesture controls using React Native Reanimated for intuitive 3D object manipulation
The Data Pipeline
We partnered with fashion retailers to integrate their product catalogs via REST APIs, normalizing data across inconsistent schemas. Built ETL pipelines with Apache Airflow to keep our PostgreSQL database updated with 100,000+ SKUs.
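The normalization step boils down to mapping each partner's field names onto a canonical schema before loading. A sketch with hypothetical retailer keys and field names:

```python
# Catalog schema normalization sketch: each retailer exposes the same
# product fields under different names.  Partner keys and field names
# here are hypothetical.

FIELD_MAPS = {
    "retailer_a": {"sku_id": "sku", "size_label": "size", "fabric": "material"},
    "retailer_b": {"productCode": "sku", "sz": "size", "composition": "material"},
}

def normalize(record: dict, retailer: str) -> dict:
    """Rename a partner record's fields into our canonical schema."""
    mapping = FIELD_MAPS[retailer]
    return {canon: record[raw] for raw, canon in mapping.items() if raw in record}

row = normalize({"productCode": "B-123", "sz": "M", "composition": "cotton"},
                "retailer_b")
# row == {"sku": "B-123", "size": "M", "material": "cotton"}
```

In the Airflow pipeline this runs per-partner inside the transform step, with missing fields surfaced for manual validation rather than silently dropped.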
The Glue
Everything communicates through our microservices architecture, with services for measurement processing, recommendation generation, AR rendering, and user management - all orchestrated to feel seamless.
The most rewarding moment was seeing all these complex systems work together so smoothly that users forget the technology entirely.
Challenges we ran into
Challenge 1: The "Every Body is Different" Problem
Our initial algorithm worked beautifully for the team members we tested it on - all 20-something college students. Then reality hit. A beta tester in her 60s got wildly inaccurate measurements. We realized our model had an unconscious bias toward younger body types.
The Fix: We spent two weeks collecting diverse body data (with consent) across ages 18-75, different body types, and various proportions. We had to completely retrain our neural network with weighted sampling to ensure representation. This delayed our launch by three weeks, but the accuracy improvement from 78% to 94% was worth every hour.
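The weighted-sampling idea can be sketched as follows: draw training examples with probability inverse to their group's frequency, so an 80/20 skew in the raw data becomes a roughly even split in what the model sees. Group labels and split are illustrative:

```python
# Weighted resampling sketch so underrepresented groups are seen as often
# as overrepresented ones during retraining.
import random
from collections import Counter

# 80/20 split mimicking an age-skewed training set (labels illustrative)
samples = [("18-29", i) for i in range(80)] + [("60-75", i) for i in range(20)]

def balanced_resample(samples, k, seed=0):
    """Draw k samples with replacement, weighted inversely to group frequency."""
    counts = Counter(group for group, _ in samples)
    weights = [1.0 / counts[group] for group, _ in samples]
    return random.Random(seed).choices(samples, weights=weights, k=k)

resampled = balanced_resample(samples, k=1000)
# each group now contributes roughly half of the draws
```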
Challenge 2: The Lighting Nightmare
Our AR try-on looked stunning in controlled lighting - and absolutely terrible in real-world scenarios. Bathroom lighting made skin tones look ghastly. Outdoor sunlight created harsh shadows. Low light produced grainy, unusable results.
The Fix: We implemented adaptive HDR processing and built a lighting normalization layer that analyzes ambient conditions and adjusts rendering in real-time. We also added a "lighting quality indicator" that guides users to better positions. This was a three-iteration process with countless test photos in parking lots, bathrooms, and living rooms at all times of day.
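One classic building block for this kind of normalization is gray-world white balance: assume the scene averages to gray and rescale each color channel accordingly, which cancels casts like warm bathroom lighting. The real layer is more sophisticated; this sketch operates on a tiny nested list standing in for an RGB image:

```python
# Gray-world white balance sketch: scale each RGB channel so its mean
# matches the overall mean intensity, neutralizing a uniform color cast.

def gray_world(image):
    """image: rows of (r, g, b) tuples; returns a color-balanced copy."""
    pixels = [px for row in image for px in row]
    means = [sum(px[c] for px in pixels) / len(pixels) for c in range(3)]
    overall = sum(means) / 3
    gains = [overall / m if m else 1.0 for m in means]
    return [[tuple(min(255.0, px[c] * gains[c]) for c in range(3))
             for px in row] for row in image]

# a warm-cast patch: red/green channels dominate the blue channel
balanced = gray_world([[(200, 180, 120), (180, 160, 100)]])
# after balancing, all three channel means are equal
```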
Challenge 3: The Physics of Fabric
Making virtual clothes move realistically nearly broke us. Silk should flow, denim should hold its shape, but our early versions made everything look like plastic. Worse, the computational load crashed phones trying to simulate realistic fabric physics.
The Fix: Instead of real-time physics simulation, we pre-computed fabric behavior for different body movements and interpolated between states. We created a "fabric behavior library" with parameters for stretch, drape, and weight. It's not perfect physics, but it's convincing enough at a fraction of the computational cost.
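The precompute-and-interpolate trick reduces to blending between stored drape states instead of simulating cloth per frame. A sketch with invented keyframe data (real keyframes would be full vertex meshes per fabric and pose):

```python
# Interpolate between precomputed cloth states rather than simulating
# physics in real time.  Keyframe values here are invented for illustration.

def lerp(a, b, t):
    """Linear interpolation between two equal-length vertex lists."""
    return [ai + (bi - ai) * t for ai, bi in zip(a, b)]

# precomputed vertex heights of a sleeve at two poses (arm down / arm raised)
POSE_DOWN   = [0.00, 0.10, 0.25, 0.40]
POSE_RAISED = [0.00, 0.30, 0.55, 0.80]

def sleeve_at(raise_fraction: float):
    """Blend the sleeve's shape for a partial arm raise, clamped to [0, 1]."""
    t = max(0.0, min(1.0, raise_fraction))
    return lerp(POSE_DOWN, POSE_RAISED, t)

half = sleeve_at(0.5)   # ≈ [0.0, 0.2, 0.4, 0.6]
```

The per-fabric stretch, drape, and weight parameters then select which keyframe set to blend, which is what the "fabric behavior library" stores.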
Challenge 4: Brand Sizing Chaos
We naively assumed brand sizes were consistent. They're not. A "Medium" at Zara fits nothing like a "Medium" at H&M. Some brands use vanity sizing. Others have regional variations. Our database became a maze of exceptions.
The Fix: We built a crowd-sourced feedback loop. After purchases, users rate fit accuracy. This data feeds back into our model, creating brand-specific adjustment factors. The system now learns and improves with every purchase, turning our initial weakness into a continuously improving strength.
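The feedback loop can be sketched as a running per-brand correction nudged by each post-purchase rating. The signal encoding, learning rate, and brand name below are illustrative choices, not the production scheme:

```python
# Crowd-sourced brand adjustment sketch: each fit report nudges a per-brand
# correction term that is applied at recommendation time.
from collections import defaultdict

brand_offset = defaultdict(float)  # per-brand correction (sign convention illustrative)

def record_feedback(brand: str, fit_signal: int, lr: float = 0.2) -> None:
    """fit_signal: -1 = ran small, 0 = true to size, +1 = ran large."""
    # A small learning rate keeps one outlier report from swinging the offset.
    brand_offset[brand] += lr * fit_signal

# five post-purchase reports for the same (hypothetical) brand
for signal in (1, 1, 0, 1, -1):
    record_feedback("brand_x", signal)
# brand_offset["brand_x"] ≈ 0.4, i.e. the brand trends toward running large
```

Because the offset is updated incrementally, the system keeps improving with every purchase without retraining the core model.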
Challenge 5: Privacy Concerns
Early testers loved the technology but hesitated when we explained we'd store body measurements. Some were uncomfortable with the idea of their body data being "out there."
The Fix: We implemented end-to-end encryption for measurements, gave users granular control over data sharing, and added a local-only mode where measurements never leave the device (with reduced features). We also published a transparent privacy manifesto explaining exactly what we collect and why.
Challenge 6: The 2 AM Integration Crisis
Two days before our demo deadline, a critical API change from one of our retail partners broke our entire product catalog integration. Our database suddenly had zero items.
The Fix: Our team pulled an all-nighter building a fallback scraping system and manual data validation pipeline. We learned to never rely on a single data source and now maintain redundant integration methods.
These challenges taught us that technology is the easy part - understanding humans, their bodies, their concerns, and their real-world environments is the true challenge of innovation.
Accomplishments that we're proud of
Technical Triumphs:
- Achieved 94% measurement accuracy that rivals professional tailors (validated against in-person measurements)
- Built an AR system that runs smoothly on 3-year-old smartphones, making it accessible beyond flagship devices
- Created a machine learning model that adapts to 200+ fashion brands and their unique sizing quirks
What we learned
Technical Lessons:
- Computer vision is messy. Real-world data is nothing like clean training datasets. Lighting varies, people wear different clothes, backgrounds are cluttered. We learned to build robust systems that degrade gracefully rather than fail catastrophically.
- Mobile optimization is an art. We learned that rendering beautiful AR on a phone requires ruthless prioritization - pre-computing what you can, simplifying what you must, and accepting that perfection is the enemy of good enough.
- APIs will betray you. External dependencies break. We learned to build defensive integration layers, maintain fallbacks, and never trust that someone else's system will work as documented.
Human Lessons:
- Diversity can't be an afterthought. Our early algorithm failed because we didn't proactively seek diverse testing groups. We learned that inclusive design requires intentional effort from day one, not retrofitting later.
- Privacy isn't paranoia. People's concerns about body data were valid and important. We learned that earning trust requires transparency, control, and respecting that "I'm not comfortable with that" is a complete sentence.
- Perfect is the enemy of shipped. We almost lost weeks chasing AR perfection that users didn't need. We learned to ask "What's the minimum to deliver value?" and iterate from there.
Business Insights:
- The problem is bigger than we thought. We started focused on fit accuracy. We learned we were really solving for confidence, sustainability, and accessibility. Understanding the full scope of impact changed how we communicated our value.
- Users will surprise you. Beta testers used our body measurements for purposes we never imagined - tailoring clothes at home, tracking fitness changes, even helping elderly parents shop online. We learned to watch how people actually use things, not how we expect them to.
Team Dynamics:
- Midnight crises build bonds. The API breaking at 2 AM could have destroyed morale. Instead, it became our defining team moment - everyone showed up, no complaints, just problem-solving. We learned that adversity reveals character.
- Celebrate small wins. When our AR finally rendered fabric that looked like actual silk, we took a break and celebrated. When our accuracy hit 90%, we ordered pizza. We learned that momentum comes from acknowledgment, not just pushing toward the next milestone.
The Meta-Learning:
We learned that hackathons aren't about building complete products - they're about proving concepts, testing assumptions, and discovering what's possible. FitVision isn't finished, but we learned we can turn ambitious ideas into working prototypes, and that's a skill that transcends any single project.
The most profound lesson? Technology should serve humanity, not the other way around. Every time we got lost in technical complexity, refocusing on "Does this help someone feel confident in their clothes?" brought clarity.
What's next for FitVision: An AR-Powered Virtual Fitting Room Experience
Immediate Roadmap (Next 3 Months):
1. Expand Garment Categories
FitVision is currently optimized for tops and bottoms; we're building support for:
- Full-body garments (dresses, jumpsuits, outerwear)
- Footwear with 3D foot scanning
- Accessories (bags, belts) with proportional sizing
2. Social Shopping Features
Shopping shouldn't be solitary. We're launching:
- "Friend Fitting Room" - invite friends to join AR try-on sessions in real-time
- Style sharing with privacy controls
- Community fit feedback where users rate how items actually fit
3. Enhanced AI Personalization
Our ML model will learn individual preferences:
- How you like certain garments to fit (snug vs. relaxed)
- Fabric preferences based on past satisfaction
- Style recommendations based on your existing wardrobe
4. Body Positivity & Health Integration
- Fitness tracking showing how body measurements change with health journeys
- Medical applications for post-surgical garment sizing or lymphedema patients
- Aging support helping elderly populations maintain independence in shopping
5. Circular Fashion Enablement
Enabling the secondhand market by:
- Making used clothing fit predictable
- Building a resale marketplace where measurements guarantee fit
- Reducing textile waste by making pre-owned shopping as confident as buying new
6. Global Accessibility
- Offline mode for regions with limited internet
- Voice-guided measurements for visually impaired users
- Multi-language support with culturally appropriate sizing guidance
- Open-source measurement tools for developing markets
7. Industry Transformation
Our moonshot goal: eliminate the concept of "standard sizes" entirely. We envision:
- Manufacturers using our aggregated data to design for real body diversity
- A new sizing paradigm where clothes are designed for shapes, not just numbers
- Zero-return fashion becoming the norm, not the exception
FitVision started as a solution to a frustrating shopping experience. We're building it into a movement toward sustainable, inclusive, confident fashion where everyone can find clothes that fit their body, their style, and their values.
The future of fashion isn't in stores or warehouses - it's in your pocket. And we're just getting started.
Built With
- ar
- computer-vision
- python

