Forgetfulness is something that we all experience, but it is especially prevalent among people with attention deficit hyperactivity disorder (ADHD). Some of our friends with ADHD are prone to forgetting items that aren't directly in their line of sight—out of sight, out of mind. We wanted to create a tool to help them find misplaced items, even those hidden in drawers or behind doors.

What it does

XRayVision is an assistive technology that helps people easily locate their personal items. Our goal was to implement two functionalities: tracking and finding. For tracking, the user tags items they want to track; these objects are then revealed to the user even if they are hidden in a drawer, cabinet, or fridge. The finding component would have added a pathfinding mechanism to direct the user from their current location to a tagged object.

Of our desired functionalities, we were able to build and deploy the object-tagging process on HoloLens. When the user puts on the headset, a tracking cube appears in front of them with a button on top. To tag an item (e.g., their keys), they move the cube to the item's location and press the button to lock it in place. This creates an anchor that keeps the tag spatially locked to the physical environment around it. Even if the keys are in a cabinet, the tag will still reveal their location.
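The lock-in-place step above can be sketched as a small Unity script. This is a hypothetical reconstruction, not our exact code: it assumes the cube's MRTK button is wired to call `OnTagButtonPressed`, and uses Unity's `WorldAnchor` component (the legacy `UnityEngine.XR.WSA` API for HoloLens) to pin the cube to the physical world. All names other than `WorldAnchor` are our own.

```csharp
using UnityEngine;
using UnityEngine.XR.WSA;

// Hypothetical sketch: attached to the tracking cube.
public class TagCube : MonoBehaviour
{
    // Assumed to be invoked by the MRTK button on top of the cube.
    public void OnTagButtonPressed()
    {
        // Adding a WorldAnchor spatially locks this GameObject to the
        // real environment, so the tag stays put as the user walks away.
        if (GetComponent<WorldAnchor>() == null)
        {
            gameObject.AddComponent<WorldAnchor>();
        }
    }

    // Anchored objects cannot be repositioned, so the anchor must be
    // destroyed before the user can move the cube to a new item.
    public void OnUntagButtonPressed()
    {
        var anchor = GetComponent<WorldAnchor>();
        if (anchor != null)
        {
            Destroy(anchor);
        }
    }
}
```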

How we built it

We used Unity, MRTK, and spatial anchors to create tracking tags.

Challenges we ran into

We had some trouble getting Azure Spatial Anchors to work, so we decided to use local native anchors instead. These are stored on the HoloLens device itself rather than in the cloud, so we were limited by storage space and on-device capabilities. Unfortunately, this meant that we couldn't build out an environment map for pathfinding.
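The local-anchor workaround can be sketched with Unity's on-device `WorldAnchorStore` (the legacy HoloLens persistence API, as opposed to cloud-backed Azure Spatial Anchors). This is an illustrative sketch under that assumption; the anchor IDs and the `SaveTag`/`LoadTag` helpers are hypothetical names of our own.

```csharp
using UnityEngine;
using UnityEngine.XR.WSA;
using UnityEngine.XR.WSA.Persistence;

// Hypothetical sketch: persists tag anchors on the HoloLens itself.
public class LocalAnchorStore : MonoBehaviour
{
    private WorldAnchorStore store;

    void Start()
    {
        // The on-device store is fetched asynchronously at startup.
        WorldAnchorStore.GetAsync(s => store = s);
    }

    // Save a tagged object's anchor under a string ID, e.g. "keys".
    public bool SaveTag(string id, GameObject tagged)
    {
        var anchor = tagged.GetComponent<WorldAnchor>();
        if (store == null || anchor == null) return false;
        store.Delete(id);              // overwrite any stale entry
        return store.Save(id, anchor);
    }

    // Re-attach a previously saved anchor, e.g. after an app restart,
    // so the tag snaps back to its real-world location.
    public bool LoadTag(string id, GameObject tagged)
    {
        return store != null && store.Load(id, tagged) != null;
    }
}
```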

Accomplishments that we're proud of

We're proud of our design thinking process! Over the course of the hackathon, we interviewed several people with ADHD to ensure that the solution we offered aligned with what they would find useful, and their insights were key to informing our design choices.

From a development standpoint, we're proud of finding a workaround for the Azure Spatial Anchors by utilizing local native anchors instead.

What we learned

We spent the first day familiarizing ourselves with Unity, HoloLens, and MRTK, which involved running through many tutorials and troubleshooting environment setups. This was the first time any of us had worked with AR/VR tech, and by the end of the hackathon, we felt much more comfortable.

Our team consisted of one designer and two developers. Throughout the process of building out XRayVision, we learned a lot from each other, which was a really lovely experience!

What's next for XRayVision

We would love to build out our pathfinding functionality and add voice-activated tagging, which would make the user experience more seamless and convenient.

Built With

Unity, MRTK, HoloLens