We aimed to improve understanding of and engagement with sustainability issues by letting users physically interact with relevant data. By making data more accessible to a wider audience, particularly younger children who learn through tactile feedback, we bring more voices into the sustainability dialogue. We also wanted to channel the excitement and fun of the AR technology seen in movies like Iron Man to get people informed and excited about sustainability.
What it does
Our core goal was to make learning fun and intuitive. With EcoTouch, users can interact with a map of the United States to explore data on water usage, renewable energy, and air quality. Motion commands make playing with the map feel natural: users pan by grabbing the map and zoom by pulling their hands apart. Users can also dive deep into a chosen location, pulling up in-depth visualizations built in MicroStrategy Workstation with a simple pinch and drag. A Leap Motion sensor captures and interprets the hand gestures.
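The grab-to-pan and stretch-to-zoom mapping can be sketched as a simple classifier. This is a hypothetical illustration, not our production code: it assumes we receive the per-hand `grabStrength` value (0.0 to 1.0) that the Leap Motion SDK reports, plus the frame-to-frame change in distance between the two hands; the threshold values are made up for the example.

```java
// Hypothetical sketch of mapping Leap Motion readings to map actions.
// grabStrength: 0.0 (open hand) to 1.0 (closed fist), as reported per hand.
// handSeparationDelta: change in distance (mm) between the two hands
// since the last frame; positive means the hands are moving apart.
public class GestureClassifier {
    public static String classify(double grabStrength, double handSeparationDelta) {
        if (grabStrength > 0.8) {
            return "PAN"; // a closed fist drags the map around
        }
        if (Math.abs(handSeparationDelta) > 10.0) {
            // "lengthening": pulling hands apart zooms in, together zooms out
            return handSeparationDelta > 0 ? "ZOOM_IN" : "ZOOM_OUT";
        }
        return "IDLE"; // no confident gesture this frame
    }
}
```

In practice we smooth these readings over several frames so a momentary flicker in the sensor data does not trigger a spurious zoom.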
How we built it
We created our interactive map using JavaFX and OpenGL, alongside a visualizer that shows your hand placement over the Leap Motion sensor. The map was read in from an SVG file and converted into 3D models representing each US state. The on-screen hands were designed and rendered from scratch. MicroStrategy visualizations were included through the Embedded SDK.
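The SVG-to-3D step starts by turning each state's path data into polygon vertices that can later be triangulated and extruded into a mesh. The snippet below is a minimal sketch of that first stage, under the simplifying assumption that the path uses only absolute `M`/`L` commands with comma-separated coordinates; real state outlines also contain curve commands, which this toy parser ignores.

```java
import java.util.ArrayList;
import java.util.List;

// Hedged sketch: parse a simplified SVG path string into polygon
// vertices. A later (omitted) step would triangulate these points
// and extrude them into a 3D mesh for the state.
public class SvgOutline {
    public static List<double[]> parsePath(String d) {
        List<double[]> points = new ArrayList<>();
        for (String token : d.trim().split("\\s+")) {
            if (token.equals("M") || token.equals("L") || token.equals("Z")) {
                continue; // command letters carry no coordinates here
            }
            String[] xy = token.split(",");
            points.add(new double[] { Double.parseDouble(xy[0]),
                                      Double.parseDouble(xy[1]) });
        }
        return points;
    }
}
```

For example, `parsePath("M 0,0 L 10,0 L 10,5 Z")` yields the three vertices of a triangle.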
Challenges we ran into
We ran into challenges learning MicroStrategy Workstation and the Embedded SDK from scratch. Additionally, translating established functions such as zooming, panning, and selecting into intuitive, readable hand motions through the Leap Motion sensor took a lot of pre-planning and adjustment along the way. Lastly, building an interactive map from scratch in Java and populating it with relevant data added another level of challenge.
Accomplishments that we're proud of
The map graphic, interactive elements, and Leap Motion hand sprite were coded from scratch by our team. We are also very proud of our choices regarding intuitive motion controls that anyone can pick up quickly. Lastly, we're proud of gaining a better grasp of MicroStrategy Workstation and the Embedded SDK, both of which proved to be rewarding learning experiences.
What we learned
We learned how to quickly generate informative visualizations through MicroStrategy's Workstation application. We also learned how to translate Leap Motion hand gestures into panning, zooming, and selecting functions within the map we created.
What's next for EcoTouch
We plan to add voice commands processed by MicroStrategy's built-in Natural Language Processing module, allowing even more intuitive exploration of sustainability data. For example, verbally asking "What was the air quality like in Wyoming in 2004?" would translate into the relevant visualization being displayed.
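The intent we would want extracted from such a question is a (metric, state, year) triple that selects a visualization. MicroStrategy's NLP module would do the real parsing; the regex below is only a hypothetical sketch of the extraction, and the question phrasing it handles is an assumption for illustration.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: extract (metric, state, year) from a spoken
// question. In production, MicroStrategy's NLP module would replace
// this; the regex only illustrates the intent triple we plan to map
// to a visualization.
public class VoiceQuery {
    private static final Pattern QUESTION = Pattern.compile(
        "what was the (.+?) like in (.+?) in (\\d{4})",
        Pattern.CASE_INSENSITIVE);

    public static String[] parse(String question) {
        Matcher m = QUESTION.matcher(question);
        if (!m.find()) {
            return null; // unsupported phrasing
        }
        return new String[] { m.group(1), m.group(2), m.group(3) };
    }
}
```

Parsing "What was the air quality like in Wyoming in 2004?" would yield `{"air quality", "Wyoming", "2004"}`, which we could then feed into the same embedded-visualization lookup the pinch-and-drag gesture uses.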