Project Story: PrizePicks Companion

🎯 What Inspired Me

As a sports fan who often watches games alone or in small groups, I've always felt something was missing from the second-screen experience. Most sports apps show basic stats or social feeds, but none truly makes you feel like you're part of the stadium atmosphere.
When I saw the PrizePicks hackathon challenge, I immediately thought: "What if your phone could actually make you feel like you're in the stands, not just watching from your couch?"
The inspiration came from three personal experiences:
- Missing iconic moments and wishing I could instantly clip and share them
- Watching international games where I couldn't hear the crowd energy
- Trying to understand player movements beyond what the broadcast camera shows
What I Learned

Technical Growth:

```typescript
// From zero to functional React Native app in days
const learningJourney = {
  startingPoint: "Complete beginner in mobile development",
  challenges: [
    "React Native ecosystem setup",
    "Android Studio configuration",
    "WebSocket real-time integration",
    "Mobile UI/UX best practices"
  ],
  breakthroughs: [
    "Mastered component-based architecture",
    "Understood mobile navigation patterns",
    "Implemented real-time data flow",
    "Created responsive designs for multiple screen sizes"
  ]
};
```

Key Technical Learnings:

- React Native Architecture: Understanding the bridge between JavaScript and native modules
- Mobile-Specific UX: Touch interactions, swipe gestures, and mobile-optimized layouts
- Real-Time Data: WebSocket connections and efficient state management for live updates (see the sketch after this list)
- Performance Optimization: Preventing unnecessary re-renders and managing memory on mobile devices
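
To make the "Real-Time Data" point concrete, the pattern boils down to a small hook like this. It's a minimal sketch assuming socket.io-client; the URL, the 'score-update' event name, and the `useLiveScore` hook itself are illustrative, not the app's actual API:

```typescript
import { useEffect, useState } from 'react';
import { io } from 'socket.io-client';

interface ScoreUpdate {
  home: number;
  away: number;
}

// Subscribe to live score updates for one game and keep the latest value in state.
// The URL and event name are placeholders for illustration only.
export function useLiveScore(gameId: string): ScoreUpdate | null {
  const [score, setScore] = useState<ScoreUpdate | null>(null);

  useEffect(() => {
    const socket = io(`https://example.com/games/${gameId}`);
    socket.on('score-update', (update: ScoreUpdate) => setScore(update));
    // Drop the connection when the component using this hook unmounts.
    return () => {
      socket.disconnect();
    };
  }, [gameId]);

  return score;
}
```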
🏗️ How I Built the Project
Phase 1: Foundation (Days 1-2)

```text
📱 React Native Setup
├── Environment configuration (Node.js, Android Studio)
├── Project initialization with TypeScript
├── Basic navigation structure
└── UI component library integration
```
Phase 2: Core Features (Days 3-5)

```text
🎯 Feature Implementation
├── ClipIt: Moment capture with buffer system
├── Crowd Sound: Audio streaming and WebSocket integration
├── 3D Field View: Player tracking with mock data
└── Social Layer: Real-time chat and notifications
```
Phase 3: Polish & Integration (Days 6-7)

```text
✨ Refinement
├── UI/UX improvements with Tailwind CSS
├── Performance optimization
├── Cross-platform testing
└── Bug fixes and edge cases
```
⚡ Technical Implementation Highlights

Real-Time Clip System:

```typescript
// Circular buffer for last 15 seconds of gameplay
class ClipBuffer {
  private buffer: GameFrame[] = [];
  private readonly maxSize = 15; // 15 seconds

  addFrame(frame: GameFrame) {
    if (this.buffer.length >= this.maxSize) {
      this.buffer.shift(); // Remove oldest frame
    }
    this.buffer.push(frame);
  }

  createClip(): Clip {
    return {
      id: generateId(),
      frames: [...this.buffer], // Copy current buffer
      timestamp: Date.now()
    };
  }
}
```
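
For context, here's a hypothetical way the buffer gets wired up; `gameSocket` and `uploadClip` are stand-ins for the app's real frame source and sharing flow, not its actual code:

```typescript
const clipBuffer = new ClipBuffer();

// Feed each incoming frame into the rolling buffer...
gameSocket.on('frame', (frame: GameFrame) => clipBuffer.addFrame(frame));

// ...and snapshot the last 15 seconds when the user taps "ClipIt".
function onClipItPressed() {
  const clip = clipBuffer.createClip();
  uploadClip(clip); // placeholder for the share/save flow
}
```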
Crowd Sound WebSocket Integration:

```typescript
import { io, Socket } from 'socket.io-client';

// Bi-directional audio streaming
let socket: Socket | null = null; // shared between connectToStadium and sendCheer

const crowdSoundService = {
  connectToStadium: (gameId: string) => {
    socket = io(`${WEBSOCKET_URL}/crowd-sound/${gameId}`);
    socket.on('audio-stream', (audioData: AudioChunk) => {
      AudioService.playChunk(audioData);
    });
    socket.on('crowd-reaction', (reaction: CrowdReaction) => {
      showCrowdNotification(reaction);
    });
  },
  sendCheer: (type: 'cheer' | 'boo') => {
    socket?.emit('user-reaction', { type, userId, timestamp: Date.now() });
  }
};
```

Challenges I Faced

Android Environment Setup
Challenge: Configuring Android Studio, SDKs, and emulators was overwhelming for a beginner.
Solution: Created step-by-step documentation and learned to troubleshoot common issues like missing SDK packages and environment variables.

Real-Time Data Synchronization
Challenge: Keeping the UI responsive while handling WebSocket streams and user interactions.
Solution: Implemented debounced updates and optimized re-renders using React.memo and useCallback.
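
A rough sketch of what that looked like in practice; the component names, the 250 ms window, and the `useDebouncedValue` hook are illustrative rather than the app's exact code:

```tsx
import React, { useCallback, useEffect, useState } from 'react';
import { Button, Text, View } from 'react-native';

// Debounce rapid WebSocket updates: the rendered value settles 250 ms after a burst of updates.
function useDebouncedValue<T>(value: T, delayMs = 250): T {
  const [debounced, setDebounced] = useState(value);
  useEffect(() => {
    const id = setTimeout(() => setDebounced(value), delayMs);
    return () => clearTimeout(id);
  }, [value, delayMs]);
  return debounced;
}

// Memoized child: skips re-rendering unless its props actually change.
const CheerButton = React.memo(({ onCheer }: { onCheer: () => void }) => (
  <Button title="Cheer" onPress={onCheer} />
));

// The displayed score changes at the debounced rate, and CheerButton never
// re-renders thanks to React.memo plus a stable useCallback handler.
export function Scoreboard({ liveScore }: { liveScore: number }) {
  const score = useDebouncedValue(liveScore);
  const handleCheer = useCallback(() => console.log('cheer sent'), []);
  return (
    <View>
      <Text>Score: {score}</Text>
      <CheerButton onCheer={handleCheer} />
    </View>
  );
}
```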

Audio Streaming Performance
Challenge: Streaming crowd sounds without lag or audio glitches on mobile networks.
Solution: Used adaptive bitrate streaming and implemented audio buffering with predictive loading.
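
The buffering half of that is roughly the idea below; `AudioPrebuffer`, `fetchChunk`, `playChunk`, and the `Chunk` shape are simplified placeholders rather than the app's actual audio pipeline, and the adaptive bitrate logic is omitted:

```typescript
interface Chunk {
  data: ArrayBuffer;
  durationMs: number;
}

// Keep a few chunks of lead time so brief network stalls don't interrupt playback.
class AudioPrebuffer {
  private queue: Chunk[] = [];
  private readonly targetAhead = 3; // prefetch ~3 chunks ahead of playback

  constructor(
    private fetchChunk: (index: number) => Promise<Chunk>, // placeholder fetcher
    private playChunk: (chunk: Chunk) => void              // placeholder player
  ) {}

  async stream(startIndex: number): Promise<void> {
    let next = startIndex;
    while (true) {
      // Top up the lookahead queue before playing the next chunk.
      while (this.queue.length < this.targetAhead) {
        this.queue.push(await this.fetchChunk(next++));
      }
      const chunk = this.queue.shift()!;
      this.playChunk(chunk);
      // Wait roughly one chunk's duration before pulling the next one.
      await new Promise(resolve => setTimeout(resolve, chunk.durationMs));
    }
  }
}
```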

Cross-Platform Compatibility
Challenge: Ensuring a consistent experience across iOS and Android with different screen sizes.
Solution: Used responsive design principles and platform-specific optimizations.
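
In React Native terms that mostly meant patterns like these; the padding values and column breakpoints are illustrative, not the app's exact numbers:

```typescript
import { Platform, StyleSheet, useWindowDimensions } from 'react-native';

// Platform-specific tweak: different top padding for the iOS status bar vs. Android.
export const styles = StyleSheet.create({
  container: {
    flex: 1,
    paddingTop: Platform.select({ ios: 44, android: 24 }),
  },
});

// Responsive layout: choose the number of grid columns from the current window width.
export function useGridColumns(): number {
  const { width } = useWindowDimensions();
  if (width >= 900) return 3; // tablets in landscape
  if (width >= 600) return 2; // large phones / small tablets
  return 1;                   // typical phones
}
```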

Mathematical Model for Player Tracking

For the 3D Field View, I implemented a simple physics-based player movement prediction:

```text
Player position prediction using velocity and acceleration:

Let:
  Pₜ = position at time t
  Vₜ = velocity at time t
  Aₜ = acceleration at time t

Predicted position at time t+Δt:
  Pₜ₊₁ = Pₜ + Vₜ·Δt + ½·Aₜ·(Δt)²
```
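
In code, that prediction step is just the formula applied per axis; the `Vector2` and `PlayerState` shapes here are assumptions for the sketch, not the app's actual types:

```typescript
interface Vector2 {
  x: number;
  y: number;
}

interface PlayerState {
  position: Vector2;
  velocity: Vector2;
  acceleration: Vector2;
}

// Pₜ₊₁ = Pₜ + Vₜ·Δt + ½·Aₜ·(Δt)², computed independently for each axis.
function predictPosition({ position, velocity, acceleration }: PlayerState, dt: number): Vector2 {
  return {
    x: position.x + velocity.x * dt + 0.5 * acceleration.x * dt * dt,
    y: position.y + velocity.y * dt + 0.5 * acceleration.y * dt * dt,
  };
}
```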
This allowed smooth player movements even with intermittent WebSocket updates.

Key Innovations

- Circular Buffer Clipping: Efficiently capturing the last X seconds without continuous recording
- Crowd Sound Layering: Mixing live stadium audio with user-generated reactions
- Predictive Player Movement: Smooth animations based on physics calculations
- Progressive WebSocket Connection: Graceful degradation when network quality changes (see the sketch below)
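
As a rough illustration of that last point, assuming socket.io-client: the connection options shown are standard socket.io reconnection settings, while the URL and the degraded-mode handling are simplified placeholders for how the app reacts:

```typescript
import { io } from 'socket.io-client';

// Connect with automatic reconnection and a long-polling fallback for poor networks.
const socket = io('https://example.com/live', {
  transports: ['websocket', 'polling'], // fall back to polling if WebSocket fails
  reconnection: true,
  reconnectionDelayMax: 10000,          // back off up to 10 s between attempts
});

// Track connection health so the UI can lower its update rate while degraded.
let degraded = false;

socket.on('disconnect', () => {
  degraded = true;  // e.g. pause crowd audio, show last-known scores
});

socket.on('connect', () => {
  degraded = false; // resume full real-time updates
});
```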
The Result

What started as a beginner's attempt to understand mobile development became a fully functional second-screen experience that genuinely enhances live sports viewing. The app successfully bridges the gap between passive watching and active participation, making every user feel connected to both the game and fellow fans.
This project taught me that with the right tools and determination, even complete beginners can build something meaningful in a short timeframe. The React Native ecosystem proved incredibly powerful for rapid prototyping and cross-platform development.
