Inspiration
We were inspired to make a project that could combine the team's expertise in hardware and software, and we felt the Childhood Games Track was the best fit for that. After brainstorming ideas ranging from Chutes and Ladders to Flappy Bird and even duck duck goose, we landed on a Where's Waldo-inspired game. We all had fond memories of reading the books as children, and we wanted to make something in that spirit that was NU-related. Hence, our hidden character became Willie rather than Waldo.
What it does
Where's Willie is a digital Where's Waldo-inspired game built for the WildHacks 2026 childhood games theme. A flashlight spotlight, controlled by mouse or physical hardware, reveals a hidden character named Willie against a dark canvas. Find Willie enough times before the timer runs out to advance to the next level. The software fully supports physical hardware integration: a physical flashlight tracked by an IMU sensor streams coordinates over USB serial to control the spotlight in real time.
How we built it
The software stack is split across three layers:
Hardware (proof of concept): An IMU sensor tracks the flashlight position on the physical board. A microcontroller reads the sensor, normalizes coordinates to a 0-1 range, and streams them continuously over USB serial. A Java serial reader bridges that input to the WebSocket layer, with bidirectional communication built to trigger physical motors on bush proximity. The full hardware pipeline is implemented in software and ready for integration.
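The normalization step on the firmware side can be sketched as a small pure function. The real version runs in C on the microcontroller; the tilt range below and the function name are illustrative assumptions, not values from the actual firmware:

```typescript
// Assumed IMU tilt range in degrees; the real firmware's raw range may differ.
const RAW_MIN = -45;
const RAW_MAX = 45;

// Map a raw sensor reading into the 0-1 range the serial protocol expects,
// clamping readings that fall outside the calibrated range.
export function normalizeAxis(raw: number): number {
  const t = (raw - RAW_MIN) / (RAW_MAX - RAW_MIN);
  return Math.min(1, Math.max(0, t));
}
```

Clamping matters here because a shaky hand can momentarily tilt the flashlight past the calibrated range, and downstream layers assume values stay within 0-1.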
Java Backend: A multithreaded WebSocket server reads the serial stream, validates and converts normalized coordinates to screen pixels, and broadcasts position updates to the browser in real time. It also receives bush proximity signals from the frontend and writes indices back to the microcontroller over serial to trigger motor vibrations.
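The backend's validate-and-convert step can be sketched as follows. The actual implementation is Java; this TypeScript version shows the same logic, with the function name and null-on-invalid convention as assumptions:

```typescript
export interface Position { x: number; y: number; }

// Validate a normalized coordinate pair and convert it to screen pixels.
// Returns null for non-finite or out-of-range input so bad serial frames
// are dropped instead of broadcast to the browser.
export function toScreenPixels(
  nx: number, ny: number, screenW: number, screenH: number
): Position | null {
  if (!Number.isFinite(nx) || nx < 0 || nx > 1) return null;
  if (!Number.isFinite(ny) || ny < 0 || ny > 1) return null;
  return { x: Math.round(nx * screenW), y: Math.round(ny * screenH) };
}
```

Rejecting invalid frames at this layer keeps the frontend simple: everything that arrives over the WebSocket is already a usable pixel position.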
TypeScript Frontend: A canvas-based game engine built with MVC architecture. A WebSocket client feeds incoming position data into a 60fps game loop. The renderer uses offscreen canvas compositing with destination-out blending to create a darkness mask with a radial gradient flashlight cutout. Collision detection uses rectangle bounds checking with a 2-second hold timer to register a Willie catch.
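The catch logic can be sketched as a small stateful class. The rectangle check and the 2-second threshold come from the description above; the class shape and names are illustrative assumptions:

```typescript
export interface Rect { x: number; y: number; w: number; h: number; }

const HOLD_MS = 2000; // hold duration required to register a catch

export class CatchDetector {
  private heldSince: number | null = null;

  // Call once per frame with the spotlight center, Willie's bounds, and the
  // current time. Returns true on the frame the 2-second hold completes.
  update(px: number, py: number, willie: Rect, nowMs: number): boolean {
    const inside =
      px >= willie.x && px <= willie.x + willie.w &&
      py >= willie.y && py <= willie.y + willie.h;
    if (!inside) {
      this.heldSince = null; // leaving the bounds resets the timer
      return false;
    }
    if (this.heldSince === null) this.heldSince = nowMs;
    if (nowMs - this.heldSince >= HOLD_MS) {
      this.heldSince = null; // reset so the next catch starts fresh
      return true;
    }
    return false;
  }
}
```

Resetting the timer whenever the spotlight leaves the bounds is what makes a catch require a deliberate, continuous hold rather than a glancing pass.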
Challenges we ran into
The hardest challenge was hardware-software integration under time pressure: coordinating the serial communication protocol across three technology layers (C firmware, Java, and TypeScript). Although our proof of concept was solid, issues with the Pi and Arduino components ultimately held us back.
The second major challenge was maintaining clean MVC architecture on the frontend while keeping game state accurate across a real-time update cycle. Position updates arrive from hardware asynchronously while the game loop renders at 60fps independently, so keeping GameState as the single source of truth between these two cycles required deliberate architectural discipline.
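A minimal sketch of that pattern, with assumed field and method names: the asynchronous network layer only writes into GameState, and the render loop only reads from it, taking one snapshot per frame so every draw within a frame sees consistent values.

```typescript
export class GameState {
  flashlightX = 0;
  flashlightY = 0;
  catches = 0;

  // Write side: called by the WebSocket client whenever a position arrives,
  // at whatever rate the hardware sends.
  setFlashlight(x: number, y: number): void {
    this.flashlightX = x;
    this.flashlightY = y;
  }

  // Read side: the 60fps loop grabs one snapshot at the top of each frame
  // instead of touching the network layer directly.
  frameSnapshot(): { x: number; y: number; catches: number } {
    return { x: this.flashlightX, y: this.flashlightY, catches: this.catches };
  }
}
```

If several position updates land between two frames, the loop simply renders the latest one, which is exactly the behavior a spotlight should have.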
The canvas flashlight effect also took significant debugging. Destination-out compositing operates on a flat pixel surface with no concept of layers, so we needed a separate layered canvas to isolate the darkness mask from the game scene beneath it.
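The two-canvas technique can be sketched as below. The Ctx interface mirrors just the slice of CanvasRenderingContext2D used here, so the call sequence can be shown (and exercised with a recording mock) without a DOM; in the browser you would pass the offscreen darkness canvas's real 2D context. The opacity value and names are assumptions, not the exact project code:

```typescript
export interface GradientLike { addColorStop(offset: number, color: string): void; }
export interface Ctx {
  globalCompositeOperation: string;
  fillStyle: unknown;
  clearRect(x: number, y: number, w: number, h: number): void;
  fillRect(x: number, y: number, w: number, h: number): void;
  createRadialGradient(x0: number, y0: number, r0: number,
                       x1: number, y1: number, r1: number): GradientLike;
  beginPath(): void;
  arc(x: number, y: number, r: number, start: number, end: number): void;
  fill(): void;
}

// Paint full darkness on the mask canvas, then use destination-out to erase a
// soft-edged circle at the flashlight position. The game scene drawn on the
// canvas underneath shows through the hole.
export function drawDarkness(ctx: Ctx, w: number, h: number,
                             fx: number, fy: number, radius: number): void {
  ctx.clearRect(0, 0, w, h);
  ctx.globalCompositeOperation = "source-over";
  ctx.fillStyle = "rgba(0, 0, 0, 0.95)";
  ctx.fillRect(0, 0, w, h);

  ctx.globalCompositeOperation = "destination-out";
  const g = ctx.createRadialGradient(fx, fy, 0, fx, fy, radius);
  g.addColorStop(0, "rgba(0, 0, 0, 1)"); // fully erased at the center
  g.addColorStop(1, "rgba(0, 0, 0, 0)"); // untouched at the edge
  ctx.fillStyle = g;
  ctx.beginPath();
  ctx.arc(fx, fy, radius, 0, Math.PI * 2);
  ctx.fill();
  ctx.globalCompositeOperation = "source-over"; // restore the default
}

// Recording mock so the call sequence can be checked headlessly.
export class MockCtx implements Ctx {
  ops: string[] = [];
  fillStyle: unknown = "";
  private gco = "source-over";
  get globalCompositeOperation(): string { return this.gco; }
  set globalCompositeOperation(v: string) { this.gco = v; this.ops.push("gco:" + v); }
  clearRect(): void { this.ops.push("clearRect"); }
  fillRect(): void { this.ops.push("fillRect"); }
  createRadialGradient(): GradientLike { return { addColorStop: () => {} }; }
  beginPath(): void { this.ops.push("beginPath"); }
  arc(): void { this.ops.push("arc"); }
  fill(): void { this.ops.push("fill"); }
}
```

The order matters: the darkness must be fully painted before the composite mode switches, or destination-out has nothing to erase.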
Accomplishments that we're proud of
- Built a complete real-time hardware-software pipeline in 24 hours
- Connected a frontend to a backend over a network for the first time, via WebSocket with bidirectional communication
- Clean MVC architecture in TypeScript that mirrors enterprise patterns
- Offscreen canvas compositing for a polished flashlight reveal effect
- Designed and implemented full hardware integration layer in software, ready for physical device connection
What we learned
The most important lesson was designing the data contract between hardware and software before writing any code. Agreeing on a simple x,y\n normalized format as the serial protocol meant hardware and software could be built independently.
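That contract is small enough to sketch in full. The "x,y\n" frame shape and the 0-1 range come from the protocol described above; the three-decimal precision and function names are illustrative assumptions:

```typescript
// What the firmware side emits: one frame per reading.
export function encode(x: number, y: number): string {
  return `${x.toFixed(3)},${y.toFixed(3)}\n`;
}

// What the software side expects: a comma-separated pair of normalized
// values. Malformed or out-of-range frames yield null and are dropped.
export function decode(frame: string): { x: number; y: number } | null {
  const parts = frame.trim().split(",");
  if (parts.length !== 2) return null;
  if (parts[0].trim() === "" || parts[1].trim() === "") return null;
  const x = Number(parts[0]);
  const y = Number(parts[1]);
  if (!Number.isFinite(x) || !Number.isFinite(y)) return null;
  if (x < 0 || x > 1 || y < 0 || y > 1) return null;
  return { x, y };
}
```

Because both sides could be tested against this contract in isolation, the hardware and software teams never blocked each other.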
On the software side, the distinction between WebSocket and REST became concrete. REST polling at 60fps would have been unusable; WebSocket push made real-time hardware control possible.
Architecturally, separating rendering from game logic in the frontend meant the canvas layer never needed to know where data came from. Whether it was mock serial data or real hardware, the game loop behaved identically.
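That indirection can be sketched as a small interface. The game loop consumes a position source and never knows whether positions come from a WebSocket or a mock; the interface and names here are assumptions for illustration, not the project's actual API:

```typescript
export type PositionListener = (x: number, y: number) => void;

// The game loop depends only on this interface.
export interface PositionSource {
  onPosition(listener: PositionListener): void;
}

// Test double used before hardware (or the backend) is ready. The real
// source would wrap a WebSocket client; the consuming code is identical.
export class MockPositionSource implements PositionSource {
  private listeners: PositionListener[] = [];
  onPosition(l: PositionListener): void { this.listeners.push(l); }
  emit(x: number, y: number): void {
    for (const l of this.listeners) l(x, y);
  }
}

// Wire any source into mutable game state the render loop reads from.
export function attachGameLoop(
  source: PositionSource, state: { x: number; y: number }
): void {
  source.onPosition((x, y) => { state.x = x; state.y = y; });
}
```

Swapping the mock for the real WebSocket source is a one-line change at startup, which is what let the game loop behave identically either way.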
What's next for Where's Willie?
- Complete the physical hardware integration: the IMU-tracked flashlight and servo-motor bush vibration system are built and pending final assembly
- Letter and word system, i.e. each Willie holds a letter, and catching enough Willies spells a word
- Global leaderboard with persistent high scores
- Multiple difficulty levels with increasing Willie count and speed
- Polished physical board with themed artwork
Built With
- c
- css
- google-gson
- html5
- imu
- java
- jserialcomm
- maven
- pico
- solidworks
- typescript
- websockets