Inspiration
This started out as a personal project for learning more about the Internet of Things and building mixed reality apps. The challenge happened to be running at just the right time, and it was a great opportunity to kick-start the project.
What it does
This app provides a visualization solution for monitoring and analyzing sensor data from your AWS IoT-connected boards. It features basic charts for monitoring your data in real time or over a period of time, a dynamic data flow diagram for seeing where your data is going, and a voice bot interface for interacting with your data in near real time. You can ask the bot to create bar graphs and time graphs of your sensor data, check your streaming data for anomalies, and store the data in a database for later use, such as visualizing the collected data over time.
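The anomaly check the bot runs on streaming data can be sketched as a simple rolling z-score test. This is a hypothetical minimal version, not the app's actual implementation; the function names, window size, and threshold are all my own assumptions:

```javascript
// Minimal rolling z-score anomaly detector, similar in spirit to what an
// "analyze for anomalies" voice command could run over a sensor stream.
// Window size and threshold are illustrative assumptions.
function makeAnomalyDetector(windowSize = 20, threshold = 3) {
  const window = [];
  return function check(value) {
    if (window.length >= windowSize) window.shift(); // drop oldest reading
    window.push(value);
    if (window.length < 5) return false; // not enough history to judge yet
    const mean = window.reduce((a, b) => a + b, 0) / window.length;
    const variance =
      window.reduce((a, b) => a + (b - mean) ** 2, 0) / window.length;
    const std = Math.sqrt(variance) || 1; // avoid divide-by-zero on flat data
    return Math.abs(value - mean) / std > threshold;
  };
}

const check = makeAnomalyDetector();
// Steady readings, then a spike at the end:
const readings = [20, 21, 20, 22, 21, 20, 21, 22, 20, 21, 80];
const flags = readings.map(check);
```

Only the final spike gets flagged; the small fluctuations around 20–22 stay well under the threshold.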
How I built it
- Unity3D (C#)
- Dialogflow (formerly API.AI)
- Node.js (backend logic)
- Heroku (backend hosting)
- AWS SDK for Node.js
AWS Services Used:
- Amazon DynamoDB
- Amazon S3
- AWS IoT
- Amazon Kinesis Streams
- Amazon Kinesis Analytics
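As a sketch of the storage path, this is roughly how the Node.js backend could shape a DynamoDB write for one sensor reading. The table name, key names, and helper function are assumptions for illustration, not the app's actual schema:

```javascript
// Hypothetical shape of a DynamoDB put request for one sensor reading.
// With the AWS SDK for Node.js, this params object would be passed to
// new AWS.DynamoDB.DocumentClient().put(params).promise().
function buildPutParams(reading) {
  return {
    TableName: 'SensorReadings',    // assumed table name
    Item: {
      deviceId: reading.deviceId,   // assumed partition key
      timestamp: reading.timestamp, // assumed sort key (epoch ms)
      temperature: reading.temperature,
      humidity: reading.humidity,
    },
  };
}

const params = buildPutParams({
  deviceId: 'board-01',
  timestamp: 1520000000000,
  temperature: 22.5,
  humidity: 41,
});
```

Keeping the request construction in a small pure function like this makes the storage path easy to unit-test without touching AWS.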
Challenges I ran into
The biggest challenge was keeping the Unity client app and the web server in sync, especially with the near real-time data streaming. Amazon does offer an AWS SDK for Unity, but it supports only a small subset of their services, so I had to use a web server to communicate with AWS on the app's behalf.
Accomplishments that I'm proud of
I'm particularly proud of creating my first augmented reality app and combining it with IoT.
What I learned
Developing augmented reality apps can be very rewarding and fun.
What's next for BigDOT
- Adding support for more sensor boards
- Removing the web server from the equation
- Connecting more AWS services, such as AWS Lambda and Amazon Machine Learning