• 2 existing production pilot solutions we've developed: "AR CheatSheet" and "Inspect & Collect", which contain fully functional, workplace-integrated foundational building blocks for AXAPMAV.

What it does

AR CheatSheet - A HoloLens solution that leverages computer vision and AR to present an operator with real-time contextual guide overlays, allowing anyone to become an expert in an instant.

Inspect & Collect - An AR/IoT tablet-based assisted-inspection app that standardizes workflow, captures and archives data for each executable task, and intelligently parses that data to create predictive reports for upstream machine maintenance and human-resource optimization.
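To make the "parse data into predictive reports" idea concrete, here is a minimal toy sketch in Python. The record shape, field names, and the defect-rate threshold are all hypothetical assumptions for illustration, not the app's actual data model:

```python
from collections import defaultdict

def predict_maintenance(records, defect_threshold=0.2):
    """Flag machines whose defect rate suggests upstream maintenance.

    `records` is a hypothetical list of inspection-task dicts, e.g.
    {"machine_id": "M1", "defect": True}.
    """
    totals = defaultdict(int)
    defects = defaultdict(int)
    for r in records:
        totals[r["machine_id"]] += 1
        if r["defect"]:
            defects[r["machine_id"]] += 1

    report = {}
    for machine, n in totals.items():
        rate = defects[machine] / n
        report[machine] = {
            "tasks": n,
            "defect_rate": round(rate, 3),
            "flag_maintenance": rate >= defect_threshold,  # escalate upstream
        }
    return report
```

In the real app this aggregation would run server-side over the archived task data, but the core idea is the same: every executed task becomes a data point that feeds the report.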

How we built it

Note: the process described below applies generally to both "AR CheatSheet" and "Inspect & Collect".


  • Step 1 - capture expert operator's POV via head-mounted 360 video camera
  • Step 2 - immersively study the expert's workflow from his or her direct POV, as if we were executing the task ourselves
  • Step 3 - document and vet workflow with SMEs; define and finalize project requirements


  • Step 4 - create instructional design storyboard adding graphics and animations to steps
  • Step 5 - create and test UX/UI options to optimize the placement and interactability of virtual assets layered into the real world


  • Step 6 - decide on the best hardware, software, and AI development criteria for the specific work environment's requirements - i.e. wearable vs. tablet vs. handheld, X SDK vs. Y SDK, local vs. networked, etc.
  • Step 7 - program/test (alpha, beta, gold)

Deploy [Phased release cycle]:

  • Step 8 - UX/UI + Functionality 'release & refine' sprint cycles based on closed feedback loops
  • Step 9 - Progressively evolve the human/AI balance of roles, i.e.: human only --> AI learns from human-gathered data & analytics --> AI validates the human in real time --> AI validation refined by the human via overrides --> AI validation automated --> analytics + prediction engines automated
  • Step 10 - QA
  • Step 11 - release
  • Step 12 - measure
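The role progression in Step 9 can be sketched as a tiny state machine. The phase names, sample counts, and accuracy gate below are illustrative assumptions; the real promotion criteria would be client- and workflow-specific:

```python
from enum import IntEnum

class Phase(IntEnum):
    HUMAN_ONLY = 0       # human executes; data is only gathered
    AI_LEARNS = 1        # AI trains on human-gathered data & analytics
    AI_VALIDATES = 2     # AI checks the human in real time
    HUMAN_OVERRIDES = 3  # human overrides refine the AI
    AI_AUTOMATED = 4     # AI validation runs unattended
    FULL_AUTOMATION = 5  # analytics + prediction engines automated

def next_phase(phase, samples, accuracy, min_samples=1000, min_accuracy=0.95):
    """Advance one phase only once enough data and accuracy have accrued."""
    if phase == Phase.FULL_AUTOMATION:
        return phase  # terminal phase
    if samples >= min_samples and accuracy >= min_accuracy:
        return Phase(phase + 1)
    return phase
```

The key design point is that each promotion is gated on evidence from the previous phase, so the AI never takes on a role it has not yet earned with data.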

Challenges I ran into

  • AR CheatSheet - There were many... but the biggest challenge actually led to the biggest breakthrough. Originally we expected the machine CAD to be available, but the manufacturer would not share the files due to IP concerns. Without access to the 3D data, we didn't have an efficient way to map AR assets to real-world objects... we could use image or QR-style triggers, but then we'd be backpedaling toward the sticky-note problem we were trying to get away from. Ultimately, our lead developer (Garrett Hoofman, a team member here) had the epiphany that HoloLens includes computer vision capabilities via an Xbox Kinect-derived camera system, which unlocks environment depth-scanning functions. Testing confirmed the theory: using the device to scan the room, including the machine, let us map AR assets directly to specific parts with the least amount of friction. Moreover, it also allowed us to create a "Builder Mode," which can be used to program/customize the app to any machine or workflow on the fly, directly in any environment.

  • Inspect & Collect - There were many here too... mainly surrounding current limitations of computer vision under varying environmental factors, as well as user adoption and training hurdles for tool integration. Moreover, deeper insight is required regarding AI/IoT needs as we push beyond AR visualization into IoT/smart connected operations and AI predictive analytics.

Accomplishments that I'm proud of

  • Our ability to work nimbly as a team to define, design, develop, and deliver innovative emerging-tech products for our pioneering clients, who welcome change and commit risk budgets in order to ultimately increase overall operational value and ROI.

What I learned

  • That we're just scratching the surface...

What's next for AI/XR Automated Predictive Maintenance Assist & Validation

  • To integrate the "AR CheatSheet" and "Inspect & Collect" features into one master, robust yet easily customizable AI/XR/IoT solution.
  • Focus specifically on refining and developing the AI journey established with both clients currently using these solutions. In this specific aircraft-maintenance use case, the general development path would:

  • Start by developing an AR assist tool to aid and train workers on the fly; capture, collect, archive, and analyze workflow data
  • Use that data to inform and create the AI
  • Integrate the AI into the workflow to validate human choices
  • Automate analytic report building
  • Flip the AI/human balance so the AI makes choices and the human validates, adding a new layer of refinement to the AI dataset
  • Automate the entire process and reallocate the human

For example:

1) HoloLens 2 - provide a UX/UI with guides and inputs like the current "AR CheatSheet" + "Inspect & Collect"; take photos when a decision is made and archive them for analytics.

2) Introduce AI: after we gather X amount of data, use that data to train the AI so it knows when a job is completed successfully and can validate the human's choices throughout the process.

3) Take the AI up a notch: once it has enough use cases, we flip roles with the worker, and as he or she goes through the process the AI is watching and doing the verification... at this phase, the human still has the ability to validate and override the AI's decisions. All the while, the process is still being captured and archived, providing rich and targeted fodder for the algorithms, because now we have the human expert validating the machine's decisions.

4) Install a system that overlays the "how-to" information so the worker is an expert from day one (AR CheatSheet) as they execute their job function. The main difference now is that the user doesn't have to do paperwork at the end of a shift. Rather, while they are doing their work, they wear a device that toll-gates the process and measures success in real time. The process is recorded (what was done? was it done correctly?), cycle time is measured, and any information regarding maintenance defects reaches a higher standard while process efficiency increases. Moreover, human performance data on the worker is measured and recorded, so we know where gaps in training are or whether that resource needs to be reassigned.

5) Finally, an automated AI-driven robot approach is implemented, where the human monitors the machine's performance and/or can take on other job functions...
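The validate-with-override phase above can be sketched as a single decision function. The function name, confidence threshold, and verdict strings are hypothetical; the point is how overrides get logged as labeled training data that refines the AI:

```python
def validate_step(ai_confidence, ai_verdict, human_override=None,
                  threshold=0.9, override_log=None):
    """Toy sketch of phased AI validation with a human override channel.

    - Human override always wins, and the disagreement is logged as a
      labeled example for refining the model.
    - Otherwise the AI verdict stands only above a confidence threshold;
      low-confidence cases escalate back to the worker.
    """
    if human_override is not None:
        if override_log is not None:
            override_log.append({          # expert-labeled fodder for the AI
                "ai": ai_verdict,
                "human": human_override,
                "confidence": ai_confidence,
            })
        return human_override
    if ai_confidence >= threshold:
        return ai_verdict
    return "escalate_to_human"  # AI unsure: fall back to the worker
```

As overrides accumulate, the logged disagreements are exactly the "rich and targeted fodder" described above: every override is an expert correction the next model iteration can train on.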

Tools needed:

  • HoloLens 2
  • Vuforia suite
  • Studio for contextual AR how-tos
  • Expert capture for photo/video archiving
  • Chalk for remote assist (if needed)
  • ThingWorx/Windchill for analytics and automation
  • Microsoft plus customized AI and computer-vision SDKs
  • other as needed 

BONUS: We will build on our already existing user-centered design mantra to develop and hone best practices specifically relating to UX/UI design for 3D/Spatial computing environments via a recently inked innovation lab partnership with Kendall College of Art and Design of Ferris State University. 
