The COVID-19 pandemic has accelerated digital transformation across numerous industries in pursuit of business agility. With 5G on the horizon to revolutionize and disrupt further, an unprecedented opportunity for unique XR solutions has presented itself: to improve the end-user experience of the applications people rely on to drive outcomes, while lowering effort and cost and improving personnel productivity.


We wanted to take advantage of the unique interconnectivity provided by AWS Wavelength and the blazing-fast speeds offered by Verizon 5G to address a real problem and build a solution that helps enterprises function efficiently. Remote technicians, whether in healthcare or energy, need reliable connectivity to do their jobs and collaborate. Current connectivity solutions for remote technicians are plagued with pixelated video streams, choppy voice, and inadequate (often still analogue) access to the enterprise knowledge base. We set out to remedy that with our submission.

What it does

  • AR Remote Assist, an Augmented Reality-powered iPad application, helps engineers collaborate remotely in real time and enables interactive troubleshooting to work past sticking points.
  • On-demand download of complex, bulky 3D models lets technicians troubleshoot standalone using Augmented Reality.
  • A unidirectional "see what I see" video call with live multidirectional annotations.
  • An ML-powered knowledge-base recommendation system improves AR troubleshooting.

How we built it

With COVID raging, the team was dispersed throughout the country. We conducted lengthy online workshops to set up the infrastructure supporting the application, and then the application itself. It was a tiring yet fun experience collaborating on technologies we had not used before: AWS Wavelength, AWS Device Farm, Verizon Nova Testing, OpenVPN, and Python APIs.

Bringing digital concurrency into the real world, we executed all work streams in parallel: setting up the AWS infrastructure, creating the 3D models, building the app, investigating the test strategy and, of course, designing the overall application architecture.

Challenges we ran into & how we resolved them

Building the solution was a truly enlightening experience. The team was exposed to a host of new technologies we had no prior experience with.

  • We were able to establish ICMP connectivity between the parent EC2 instance and the T3 Wavelength instance. However, connecting the front-end application to the T3 instance took considerable time; the trick was to use NAT routing in our OpenVPN implementation.
  • Testing of the Python APIs was first executed on the parent EC2 instance and then migrated to the T3 Wavelength instance. After the migration, we were unsure whether access issues stemmed from the Python server environment or from connectivity.
  • iOS devices were unavailable on the Verizon Nova testing platform, and since our application only supports iOS, we could not run the tests as planned. To overcome this hurdle, we used Android devices instead and tested download speeds with wget commands in a terminal emulator, Termux. To further validate our findings, we invested considerable time comparing real-world alternatives: devices on 4G accessing the same data from other services. Verizon 5G with AWS Wavelength produced the lowest download times every time.
  • We were unable to configure a suitable service to act as the core of our messaging/calling requirements, so, given the time constraints, we used Twilio instead.
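Our Nova speed tests used wget inside Termux; purely as an illustration, a minimal Python equivalent of the same measurement could look like the sketch below (the endpoint URL and file name are placeholders, not our real infrastructure):

```python
import time
import urllib.request


def throughput_mbps(num_bytes: int, seconds: float) -> float:
    """Convert a measured download (bytes over seconds) to megabits per second."""
    return (num_bytes * 8) / (seconds * 1_000_000)


def timed_download(url: str) -> tuple[int, float]:
    """Fetch url once and return (bytes received, elapsed seconds)."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    return len(data), time.monotonic() - start


# Example usage against a hypothetical model endpoint:
# size, elapsed = timed_download("https://example.com/pump-model.usdz")
# print(f"{size} bytes in {elapsed:.2f}s -> {throughput_mbps(size, elapsed):.1f} Mbps")
```

wget reports the same rate in its progress output; this sketch just makes the arithmetic explicit.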

Accomplishments that we're proud of

  • The level of detail a 3D model demands for real-world relatability is quite high, and we invested an immense amount of time crafting ours to get the details just right. An added hurdle was the unavailability of those details: our 3D model designer took on the painstaking task of combing through numerous documents for image references and building the model from those images.
  • The successful interconnection of the T3 Wavelength instance and the parent EC2 instance, giving the client access to the Python APIs, was a moment of exuberance.
  • In our local tests over WiFi and 4G, model download times ran to 30-40 seconds (highlighted in our benchmark test results; refer to the attached document). We were thrilled to see the same models download in roughly 5 seconds.
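As a quick sanity check on the benchmark figures above (30-40 seconds locally versus roughly 5 seconds over Wavelength), the implied improvement works out as follows:

```python
# Back-of-the-envelope check of the improvement implied by our benchmarks:
# local Wi-Fi/4G downloads took 30-40 s; Wavelength over Verizon 5G took ~5 s.
baseline_seconds = (30, 40)
wavelength_seconds = 5
speedup = tuple(t / wavelength_seconds for t in baseline_seconds)
print(speedup)  # (6.0, 8.0), i.e. roughly a 6x-8x reduction in download time
```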

What's next for Augmented Reality Remote Assist

  • To augment our video-calling feature with 4K, we would like to build prototypes with a Matrix server running within Wavelength.
  • Extending support to other devices: Android, wearables and more.
  • Because the current application is a tightly controlled PoC, the bot does not cover many scenarios. We would like to integrate it with Amazon Lex for NLP.
