What it does

The Follow Me team has built an augmented reality tool that saves lives: a fire drill training application.

Traditional fire escape drills are critical for saving lives, but they have several inherent problems. They are highly disruptive to class or work schedules, they are time-consuming and costly to organize, and retention of the safety information is abysmal. Participants go through the motions passively and aren't engaged in a way that ingrains the procedures into memory. More problematic still, there is no documentation that an occupant has completed a drill or retained the information; they could have been absent on drill day, be new to campus, or be a new employee, and a fire emergency can happen at any time.

Our implementation is cheaper and less disruptive than mass fire drills. The app improves learning retention through gamification, multimodal teaching, comprehension acknowledgement, and repetition. The entire process can be automated through existing people management systems to schedule training modules, track their completion, log results, and trigger annual recertification.

For the hack, we built a functional fire drill training module. For fun, the app is stylized for kids, but a full build could easily swap in age-appropriate styles. We implemented three skill tests and a single exit path from an assembly space to outdoor safety. The exit path was authored against a spatial mapping mesh generated on the HoloLens. Although not yet implemented, a full app would include an “authoring” mode so a building manager could easily create the modules themselves: they would record each exit path as they walk it and place safety skills along that path from a prefab library. We're excited for everyone to try it out, and who knows, maybe what you learn will save your life someday.

How we built it

Built with Unity 2017.1.2.1f1 and C#, using the Mixed Reality Toolkit library. We used the spatial mapping utility to output an OBJ mesh of the exit path. Using that mesh as a reference, we placed scripted prefab skill modules along the path. When triggered, each module plays through a fixed sequence: an audio overview, an animation visual, a verbal acknowledgement from the user, and a success sound cue.
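
As a rough illustration of that sequence, here is a minimal sketch of how such a skill module could be wired up in Unity C#. It assumes a trigger collider on the prefab, a collider tagged "Player" on the user's rig, and a speech handler that calls Acknowledge(); all names are ours for illustration, not the actual hack code.

```csharp
using System.Collections;
using UnityEngine;

// Illustrative sketch of a skill module prefab; names are hypothetical.
public class SkillModule : MonoBehaviour
{
    public AudioSource audioSource;     // plays the overview and success clips
    public AudioClip overviewClip;      // spoken explanation of the skill
    public AudioClip successClip;       // short cue confirming completion
    public Animator visualAnimator;     // drives the demonstration animation
    public string visualTrigger = "Play";

    bool acknowledged;                  // set once speech input hears the expected phrase
    bool started;

    // Called by the speech-recognition handler when the user says the acknowledgement phrase.
    public void Acknowledge() { acknowledged = true; }

    void OnTriggerEnter(Collider other)
    {
        // Assumes the user's rig carries a collider tagged "Player".
        if (!started && other.CompareTag("Player"))
        {
            started = true;
            StartCoroutine(RunSequence());
        }
    }

    IEnumerator RunSequence()
    {
        // 1. Audio overview
        audioSource.PlayOneShot(overviewClip);
        yield return new WaitForSeconds(overviewClip.length);

        // 2. Animated visual demonstration
        visualAnimator.SetTrigger(visualTrigger);

        // 3. Wait for the user's verbal acknowledgement
        yield return new WaitUntil(() => acknowledged);

        // 4. Success sound cue
        audioSource.PlayOneShot(successClip);
    }
}
```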

Challenges we ran into

Calibration: We weren't able to get to a calibration step that ensures the digital path automatically aligns with the real-world path regardless of where the app is launched.

Drift: Similar to calibration, we would want the path to recalibrate its alignment along the way, so that even if the authored path or the tracking of the user's position drifts, the two remain aligned. In our hack, if the calibration is off or the device loses tracking, the path to follow can end up misaligned and inaccessible (for example, on the other side of a wall).
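
One way to approach both problems (not implemented in the hack) would be to re-anchor the authored path to a reference the user confirms at launch, or at checkpoints along the route, for example an air-tapped start point plus the gaze direction. The sketch below is illustrative only; every name in it is an assumption.

```csharp
using UnityEngine;

// Illustrative calibration sketch: aligns the authored path root to an observed
// start position and heading in the current session. Not the hack's actual code.
public class PathCalibrator : MonoBehaviour
{
    public Transform pathRoot;       // parent of the authored spline and skill prefabs
    public Transform authoredStart;  // where the path was authored to begin (child of pathRoot)

    // Call with the start position/heading observed in this session,
    // e.g. from an air-tap on the floor plus the user's gaze direction.
    public void Align(Vector3 observedStartPosition, Quaternion observedStartRotation)
    {
        // Yaw-only rotation that turns the authored start heading into the observed heading.
        Quaternion delta = observedStartRotation * Quaternion.Inverse(authoredStart.rotation);
        delta = Quaternion.Euler(0f, delta.eulerAngles.y, 0f);

        // Capture where the authored start sits before moving anything.
        Vector3 pivot = pathRoot.position;
        Vector3 startBefore = authoredStart.position;

        // Rotate the whole path about its root, then translate so the start points coincide.
        pathRoot.rotation = delta * pathRoot.rotation;
        Vector3 startAfterRotation = delta * (startBefore - pivot) + pivot;
        pathRoot.position += observedStartPosition - startAfterRotation;
    }
}
```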

Accomplishments that we're proud of

Modularizing the prefab code for skills so that additional skills can be authored easily from four parts: 1) a voice overview, 2) a visual animation, 3) a user voice acknowledgement, and 4) success audio. This would let a building manager easily place skills into a given scenario, and even author custom skills via a simple web app.
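
For example, a data-driven skill could reduce authoring to filling in a handful of fields. The asset type below is a hypothetical sketch of that idea, not our actual implementation.

```csharp
using UnityEngine;

// Hypothetical authoring-side asset for a skill; field names are illustrative.
// A building manager (or a simple web tool) would only fill in these fields;
// the runtime skill prefab reads them and plays the standard sequence.
[CreateAssetMenu(menuName = "FollowMe/Skill Definition")]
public class SkillDefinition : ScriptableObject
{
    public string skillName;               // e.g. "Check the door for heat"
    public AudioClip voiceOverview;        // 1) voice explanation
    public AnimationClip visualAnimation;  // 2) visual animation demonstrating the skill
    public string acknowledgementPhrase;   // 3) phrase the user must say back
    public AudioClip successAudio;         // 4) success sound cue
}
```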

Authoring the path spline and skill prefab locations by capturing a mesh with the HoloLens's spatial mapping utility.
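
A companion asset could describe the authored exit path itself: waypoints traced against the captured mesh plus the skill placements along it. Again, this is an illustrative sketch that builds on the hypothetical SkillDefinition above.

```csharp
using System;
using UnityEngine;

// Illustrative structure for an authored exit path: waypoints traced against the
// spatial-mapping mesh, plus where along the path each skill prefab is placed.
[CreateAssetMenu(menuName = "FollowMe/Path Definition")]
public class PathDefinition : ScriptableObject
{
    [Serializable]
    public struct SkillPlacement
    {
        public SkillDefinition skill;   // which skill to spawn
        public Vector3 position;        // position relative to the path root
        public float yawDegrees;        // facing direction of the prefab
    }

    public Vector3[] waypoints;             // spline control points, relative to the path root
    public SkillPlacement[] skillPlacements;
}
```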

Using speech input to make users verbally acknowledge their understanding of a skill, which improves the likelihood that they retain the information.
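
On HoloLens this can be done with Unity's built-in KeywordRecognizer (UnityEngine.Windows.Speech). A small sketch, assuming the SkillModule shown earlier; class and field names are ours for illustration.

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

// Sketch of the verbal-acknowledgement step using Unity's KeywordRecognizer
// (Windows/HoloLens only). Names are illustrative, not the hack's actual code.
public class AcknowledgementListener : MonoBehaviour
{
    public SkillModule skillModule;               // module to notify on success
    public string phrase = "I will stay low";     // phrase the user must repeat

    KeywordRecognizer recognizer;

    public void StartListening()
    {
        recognizer = new KeywordRecognizer(new[] { phrase });
        recognizer.OnPhraseRecognized += OnPhraseRecognized;
        recognizer.Start();
    }

    void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        recognizer.Stop();
        recognizer.Dispose();
        skillModule.Acknowledge();   // unblocks the sequence's WaitUntil
    }
}
```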

Using visual and audible teaching to cater to individual learning strengths.

Making a ho-hum subject fun and engaging.

What we learned

The limited FOV of the HoloLens is a fun design challenge. We had to design the UX so that the user has constant cues about where to look and can keep track of the path they are following.
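
One simple pattern for this is a head-locked arrow that appears whenever the next target is outside the view and rotates to point toward it. A sketch of that idea (illustrative, not our exact hack code):

```csharp
using UnityEngine;

// Illustrative FOV-cue sketch: a screen-locked arrow that rotates to point toward
// the next waypoint whenever it falls outside the HoloLens's narrow field of view.
public class DirectionIndicator : MonoBehaviour
{
    public Transform target;        // next waypoint or skill the user should look at
    public RectTransform arrow;     // arrow element on a head-locked canvas (sprite points up)

    void Update()
    {
        Camera cam = Camera.main;
        Vector3 viewportPos = cam.WorldToViewportPoint(target.position);

        // Target is in front of the user and inside the viewport: hide the cue.
        bool onScreen = viewportPos.z > 0f &&
                        viewportPos.x > 0f && viewportPos.x < 1f &&
                        viewportPos.y > 0f && viewportPos.y < 1f;
        arrow.gameObject.SetActive(!onScreen);
        if (onScreen) return;

        // Rotate the arrow along the screen-space direction toward the target.
        Vector3 toTarget = cam.transform.InverseTransformPoint(target.position);
        float angle = Mathf.Atan2(toTarget.y, toTarget.x) * Mathf.Rad2Deg;
        arrow.localRotation = Quaternion.Euler(0f, 0f, angle - 90f);
    }
}
```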

Indoor location and navigation is a completely different problem than outdoor navigation.

What's next for Follow Me

For Follow Me to be viable in the real world, we need to solve calibration: using the HoloLens's spatial awareness to align to a path in a space upon launch. To make implementation practical, we also need the easy authoring mode for non-developers, and an API that links Follow Me to various people management systems for scheduling, equipment checkout, and logging training results.
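
As a rough idea of what such an integration might exchange, here is a hypothetical training-result record serialized with Unity's JsonUtility; every field name is an assumption rather than an existing API.

```csharp
using System;
using UnityEngine;

// Hypothetical record Follow Me could post to a people management system
// once a training module is completed. Field names are assumptions.
[Serializable]
public class TrainingResult
{
    public string userId;
    public string buildingId;
    public string moduleName;        // e.g. "Assembly space: east exit"
    public int skillsCompleted;
    public int skillsTotal;
    public string completedAtUtc;    // ISO-8601 timestamp
    public string recertifyByUtc;    // e.g. one year after completion
}

// Example serialization with Unity's built-in JsonUtility:
// string payload = JsonUtility.ToJson(result);
```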

We're also interested in exploring whether mobile AR can achieve a similar level of beacon-less indoor tracking and accuracy as tools like ARKit and ARCore improve, and in seeing whether the visual impact of mobile AR is as effective for digesting the information.

Lastly, we're excited about applying the same learnings and capabilities to other educational and entertainment applications for indoor, location-specific content.

Attributions:

Star icons: Created by Gregor Cresnar on the Noun Project (https://thenounproject.com/grega.cresnar/)

Sounds (from freesound): Alarm sound by user Kinoton (https://freesound.org/people/Kinoton/sounds/420661/); Star/success sound by user Gabriel Araujo (https://freesound.org/people/GabrielAraujo/sounds/242501/); Bird sound adapted from user juskiddink (https://freesound.org/people/juskiddink/sounds/56301/)

3D model: Low Poly Hand Model by Sketchfab user scribbletoad, Creative Commons 4.0 (https://sketchfab.com/models/d6c802a74a174c8c805deb20186d1877?). Edits: 1) a phong tag was added to create smoothing groups, 2) the fingers were rigged for animation, 3) three animations were created: pointing, pulling, and cupping.
