Inspiration
We’re huge F1 fans, we love the thrill, the data, the strategy, and the emotion behind every race. So we thought, what if we built something we’d actually want to watch and play with ourselves? We imagined it from the point of view of both a driver and a race operator, what would they want to see, track, and analyze in real time? That’s how EagleX started, a mix of passion, curiosity, and the idea of making racing feel more alive and intelligent.
What it does
EagleX is a real-time race simulator that shows not just motion, but emotion. Cars (or drones) move dynamically, crash, recover, and even react to driver focus and stress thanks to our NeuroDrive concept. It's got live commentary, a mini-map, a driver's POV, a leaderboard, and the drama of a full broadcast, right inside your browser.
How we built it
We used React, TypeScript, Vite, Tailwind, and shadcn-ui to design a fast, futuristic interface. The race logic and movement engine were built in TypeScript, controlling speed, crashes, laps, and NeuroDrive reactions. We deployed the project on Vercel for instant accessibility and smooth performance.
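To illustrate the kind of engine described above, here is a minimal sketch of a simulation tick in TypeScript. All names and numbers here (`Car`, `NeuroState`, `tick`, track length, acceleration constants) are hypothetical stand-ins, not the actual EagleX code: the point is just how driver focus and stress could modulate speed, crashes, and recovery each frame.

```typescript
// Hypothetical NeuroDrive state: not the real EagleX types.
interface NeuroState {
  focus: number;  // 0..1, higher = sharper driving
  stress: number; // 0..1, higher = more crash-prone
}

interface Car {
  id: string;
  position: number; // distance along the track, in meters
  speed: number;    // m/s
  lap: number;
  crashed: boolean;
  neuro: NeuroState;
}

const TRACK_LENGTH = 5000; // meters per lap (illustrative)
const BASE_ACCEL = 4;      // m/s^2 (illustrative)

// Advance one car by dt seconds, letting NeuroDrive state shape behavior.
function tick(car: Car, dt: number): Car {
  if (car.crashed) {
    // Recovery: stress decays over time; the car rejoins at low speed.
    const stress = Math.max(0, car.neuro.stress - 0.1 * dt);
    const recovered = stress < 0.3;
    return {
      ...car,
      crashed: !recovered,
      speed: recovered ? 20 : 0,
      neuro: { ...car.neuro, stress },
    };
  }

  // Focus boosts acceleration; stress raises the chance of a crash.
  const accel = BASE_ACCEL * (0.5 + car.neuro.focus);
  const speed = Math.min(90, car.speed + accel * dt);
  const crashChance = 0.02 * car.neuro.stress * dt;
  if (Math.random() < crashChance) {
    return { ...car, crashed: true, speed: 0 };
  }

  // Wrap position at the start/finish line and count the lap.
  const traveled = car.position + speed * dt;
  return {
    ...car,
    speed,
    position: traveled % TRACK_LENGTH,
    lap: car.lap + (traveled >= TRACK_LENGTH ? 1 : 0),
  };
}
```

A loop like this could run per animation frame in the browser, with the React UI (leaderboard, mini-map, commentary) rendering from the resulting car states.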
Challenges we ran into
Since this is our raw prototype, the UI isn't fully refined yet: we focused first on getting the simulation and logic working correctly. Because of the short submission deadline, we couldn't polish the visuals as much as we wanted, but we absolutely plan to upgrade the design post-submission.
Accomplishments that we're proud of
We discovered a new way of simulating: blending logic, visuals, and emotional cues for a more lifelike experience. We're proud that we got a fully functional real-time simulator working smoothly and proved the concept of emotion-driven motion in action.
What we learned
We learned how to blend simulation logic, visuals, and emotional cues into a single real-time system, and how to scope a prototype under a tight deadline: get the core simulation and logic working first, and leave visual polish for later.
What's next for EagleX
We want to refine the UI, add multiplayer and spectator modes, and experiment with real sensor-based NeuroDrive input. We’re also exploring ways to extend it to drones and supply-chain systems, turning EagleX into a full mobility simulation platform.
Built With
- express.js
- figma
- framer-motion
- git
- github
- html5
- javascript-(es6+)
- node.js
- react
- shadcn-ui
- tailwind-css
- typescript
- vercel
- vite
- websockets