What it does

RoRadar is a parent-facing Roblox safety screening tool. A parent can enter a Roblox username and get a structured snapshot of the account’s public risk signals, including suspicious friends, flagged games, and a readable overall profile score. Rather than just a black-box number, RoRadar breaks the result down into explainable factors so a parent can see what actually drove the assessment.

On the game side, RoRadar looks at publicly visible game associations, scores them for signals tied to unsafe social environments, and then checks wider community discussion for corroborating evidence. On the account side, it analyzes friend profiles and graph signals such as account age, suspicious profile language, mutual-friend context, and growth patterns. The result is meant to help parents notice accounts, games, and patterns that deserve a closer look before a problem escalates.
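The factor-based breakdown described above can be sketched as a small deterministic scorer. The factor names, weights, and 0–100 scale below are illustrative assumptions for this sketch, not RoRadar's actual model:

```typescript
// Hypothetical factor-based account scorer: each triggered signal adds a
// weighted contribution, and the per-factor contributions are kept so the
// final score stays explainable. Names and weights are illustrative only.
type Factor = { name: string; weight: number; triggered: boolean };

function scoreAccount(factors: Factor[]) {
  const contributions = factors
    .filter((f) => f.triggered)
    .map((f) => ({ name: f.name, points: f.weight }));
  const total = contributions.reduce((sum, c) => sum + c.points, 0);
  // Clamp to a 0-100 scale so scores stay comparable across accounts.
  return { score: Math.min(100, total), contributions };
}

const result = scoreAccount([
  { name: "account younger than 30 days", weight: 25, triggered: true },
  { name: "suspicious profile language", weight: 40, triggered: false },
  { name: "no mutual friends", weight: 20, triggered: true },
]);
// result.contributions lists exactly the factors that drove the score,
// which is what lets the UI show "why" instead of a black-box number.
```

Keeping the contributions array alongside the total is the design choice that makes the score explainable: the UI can render each named factor a parent is shown.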

How we built it

We built RoRadar as a full-stack Next.js application with a React frontend and a server-driven assessment pipeline. The app takes a Roblox username, pulls together public Roblox profile, friend, and game data, and then runs that data through a deterministic scoring model we designed so the results stay explainable. We also added account-backed saved profiles, downloadable PDF reports, and a modal-based review flow so the experience feels like a real product rather than a static demo.

For infrastructure, we used Supabase for persistence and caching, along with an auth layer so saved child profiles are tied to the correct parent account. For deeper game investigation, we layered in external corroboration from Roblox-related public discussion sources and wider web search. Because raw search results were too noisy to trust directly, we paired retrieval with an AI-based evidence validation step so the app can distinguish between generic Roblox safety chatter and discussion that is actually about the specific game being scored.
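One cheap first-pass gate of that kind can be sketched as requiring the specific game's name to actually appear near a risk term before a search result is forwarded to the AI validator. The term list, window size, and function name here are assumptions for illustration:

```typescript
// Illustrative pre-filter for search corroboration: keep only snippets
// that mention the specific game by name within a character window of a
// risk term. This discards generic "Roblox safety" chatter before the
// (more expensive) AI validation step. Term list and window are assumed.
const RISK_TERMS = ["condo", "voice chat", "dating"];

function mentionsGameNearRisk(
  snippet: string,
  gameName: string,
  window = 80,
): boolean {
  const text = snippet.toLowerCase();
  const gameIdx = text.indexOf(gameName.toLowerCase());
  if (gameIdx === -1) return false; // snippet never names this game at all
  return RISK_TERMS.some((term) => {
    const termIdx = text.indexOf(term);
    return termIdx !== -1 && Math.abs(termIdx - gameIdx) <= window;
  });
}
```

A snippet like "Players report condo behavior in Tower Hangout" passes for the game "Tower Hangout", while a generic "Roblox voice chat safety tips" article does not, since it never names the game.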

Challenges we ran into

Our biggest challenge was building something trustworthy from incomplete and messy public data. We originally wanted richer activity signals, especially around recent play behavior, but public Roblox access was much more limited than expected. That forced us to redesign parts of the product around the data we could reliably access, such as profile details, friend graphs, favorite games, created experiences, and direct place lookups.

A second major challenge was false positives. Generic terms like “friends,” “chat,” or “voice chat” can appear in completely harmless contexts, and broad web or forum posts can mention a theme without actually referring to the exact game we were scoring. We had to spend a lot of time tuning weights, tightening matching rules, and separating explainable rule-based scoring from AI-assisted evidence validation so the product stayed useful without becoming alarmist.
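One shape that tuning can take is a corroboration rule: a generic term alone contributes nothing, and only adds weight when a more specific signal is also present. The term sets and point values below are assumed placeholders, not RoRadar's real tuning:

```typescript
// Illustrative matching rule for reducing false positives: generic terms
// like "chat" or "friends" score zero on their own and only count as
// corroboration when a specific risk term also matched. All terms and
// weights here are assumptions for the sketch.
const GENERIC = new Set(["friends", "chat", "voice chat"]);
const SPECIFIC = new Set(["18+", "condo"]);

function termScore(matched: string[]): number {
  const hasSpecific = matched.some((t) => SPECIFIC.has(t));
  let score = 0;
  for (const t of matched) {
    if (SPECIFIC.has(t)) score += 30;
    // A generic term is only worth points as corroboration.
    else if (GENERIC.has(t) && hasSpecific) score += 5;
  }
  return score;
}
```

Under this rule, `["voice chat"]` scores 0, while `["condo", "voice chat"]` scores more than `["condo"]` alone, which is the behavior that keeps harmless chat-enabled games from being flagged.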

We also ran into engineering problems that made the project harder than a typical hackathon prototype: account and database wiring, caching expensive web searches, graceful failure when external services were unavailable, PDF generation bugs, and deployment issues that only appeared in production builds. A lot of the real work was making the app resilient enough to behave well even when parts of the pipeline failed.
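The caching and graceful-failure pattern can be sketched as a wrapper around any expensive external lookup: serve a fresh cached value when one exists, and on failure fall back to stale data or a neutral default instead of failing the whole assessment. The in-memory cache, TTL, and names are assumptions for this sketch (the real app persists its cache in Supabase):

```typescript
// Illustrative cache-plus-fallback wrapper for an expensive external call.
// On success the result is cached; on failure we prefer stale data over
// no data, and a neutral fallback over a crashed pipeline.
type Cached<T> = { value: T; fetchedAt: number };

function makeResilientFetcher<T>(
  fetcher: (key: string) => Promise<T>,
  fallback: T,
  ttlMs = 24 * 60 * 60 * 1000, // assumed 24h freshness window
) {
  const cache = new Map<string, Cached<T>>();
  return async (key: string): Promise<T> => {
    const hit = cache.get(key);
    if (hit && Date.now() - hit.fetchedAt < ttlMs) return hit.value;
    try {
      const value = await fetcher(key);
      cache.set(key, { value, fetchedAt: Date.now() });
      return value;
    } catch {
      // Stale beats missing; a neutral fallback beats an error page.
      return hit ? hit.value : fallback;
    }
  };
}
```

Wrapping the web-search corroboration call this way is what lets the rest of the scoring pipeline keep producing a report even when an external service is down.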

Accomplishments that we're proud of

The thing we are most proud of is that RoRadar is not just a flashy interface with a vague AI score. It is a working system with explainable scoring, real external data integration, saved account monitoring, and downloadable reports. Parents can inspect why a friend or game was flagged, open supporting evidence, and review a structured breakdown rather than being asked to blindly trust the model.

We are also proud of how effectively we turned messy public data into something coherent and product-like. The combination of deterministic scoring, cached search-backed corroboration, and AI validation gives RoRadar a level of depth that goes beyond a simple username lookup. We also pushed the project past “demo-only” territory by adding persistence, report export, graceful fallback behavior, and a polished interface that can support real-world parent workflows.

What's next for RoRadar

The next step is expanding coverage while keeping the model trustworthy. The biggest opportunity is bringing in better gameplay-context signals, especially recent-play behavior if we can access it in a reliable and policy-safe way. We also want to deepen the evidence layer with broader creator reputation signals, stronger platform-specific corroboration, and continued tuning so the system gets more precise without becoming overly aggressive.

On the product side, the next phase is turning RoRadar from a one-time screening tool into an ongoing safety dashboard. That means richer saved-profile history, clearer trend tracking over time, stronger report workflows for parents, and continued mobile and usability refinements. The long-term goal is for RoRadar to become a practical decision-support tool for families trying to understand online risk, not just a one-off risk score.
