We hope this app encourages users to invest in stocks and grow their personal portfolios.
What it does
When a user sees a logo in everyday life and wants to learn how that company's stock is performing, they can simply perform the air-tap gesture and the HoloLens takes a snapshot of the current view. The image is then sent to Google's Cloud Vision API and analyzed for logos. If a logo is detected, we identify the company behind it and use the NASDAQ API to look up the recent performance of its stock. Finally, the financial data is visualized on the HoloLens through Unity.
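The logo-detection step above can be sketched in Python. This is a minimal illustration of the Cloud Vision `images:annotate` request body for `LOGO_DETECTION` and of pulling the best match out of the response; the helper names and the fallback behavior are our own, and authentication/transport are omitted.

```python
import base64

# Endpoint for Google's Cloud Vision annotate call (API key handling omitted).
VISION_ENDPOINT = "https://vision.googleapis.com/v1/images:annotate"

def build_logo_request(image_bytes):
    """Build the JSON body for a LOGO_DETECTION annotate request."""
    return {
        "requests": [{
            "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
            "features": [{"type": "LOGO_DETECTION", "maxResults": 3}],
        }]
    }

def best_logo(response):
    """Return the highest-confidence logo description, or None if no logo."""
    annotations = response.get("responses", [{}])[0].get("logoAnnotations", [])
    if not annotations:
        return None
    return max(annotations, key=lambda a: a.get("score", 0))["description"]

# Example with a response shaped like the Vision API's documented output:
sample = {"responses": [{"logoAnnotations": [
    {"description": "Starbucks", "score": 0.92},
    {"description": "Shell", "score": 0.41},
]}]}
print(best_logo(sample))  # -> Starbucks
```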
How we built it
After receiving an air-tap gesture, the HoloLens takes a picture. The picture is then run through a chain of APIs: Google Cloud Vision (to detect logos), a company-name CSV (to map company names to ticker symbols), and the NASDAQ API (to check stock prices).
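The middle link in that chain, mapping a detected company name to a ticker symbol, can be sketched as below. The CSV column names (`Name`, `Symbol`) and the substring-fallback matching are assumptions for illustration, not the exact file we shipped.

```python
import csv
import io

# Tiny stand-in for the bundled company-name CSV (columns are assumed).
CSV_TEXT = """Name,Symbol
Starbucks Corporation,SBUX
Microsoft Corporation,MSFT
Nike Inc.,NKE
"""

def load_tickers(csv_text):
    """Build a lowercase company-name -> ticker lookup table."""
    table = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        table[row["Name"].lower()] = row["Symbol"]
    return table

def lookup_ticker(table, detected_name):
    """Resolve a Vision logo description to a ticker symbol."""
    # Vision tends to return short brand names ("Starbucks") while the CSV
    # lists full legal names, so fall back to a substring match.
    name = detected_name.lower()
    if name in table:
        return table[name]
    for company, symbol in table.items():
        if name in company:
            return symbol
    return None

tickers = load_tickers(CSV_TEXT)
print(lookup_ticker(tickers, "Starbucks"))  # -> SBUX
```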
Challenges we ran into
We ran into challenges implementing a gesture to capture photos, capturing the photos themselves, sending a JSON request from Unity to the Cloud Vision API, parsing XML from the NASDAQ API, and scripting in Unity. Because the HoloLens is still a young platform, there were few examples and little documentation to learn from.
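The XML-parsing step was one of the trickier ones. A minimal sketch of the approach, using Python's `xml.etree.ElementTree`, is below; the element names in the sample document are illustrative assumptions, not the real NASDAQ response schema.

```python
import xml.etree.ElementTree as ET

# Illustrative quote document; the actual NASDAQ API schema differed.
SAMPLE_XML = """<quote>
  <symbol>SBUX</symbol>
  <last>98.45</last>
  <change>-0.32</change>
</quote>"""

def parse_quote(xml_text):
    """Extract the fields we display from an XML quote document."""
    root = ET.fromstring(xml_text)
    return {
        "symbol": root.findtext("symbol"),
        "last": float(root.findtext("last")),
        "change": float(root.findtext("change")),
    }

quote = parse_quote(SAMPLE_XML)
print(quote["symbol"], quote["last"])  # -> SBUX 98.45
```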
Accomplishments that we're proud of
Becoming familiar with developing for an AR environment using Unity and the Windows Holographic platform, and figuring out how to use the Cloud Vision image-recognition API to detect logos in the snapshots the user takes on the HoloLens.
What we learned
AR/mixed reality is an emerging field, and devices like the HoloLens have incredible potential. It was a great learning experience to work with the HoloLens and figure out how to use Unity.
What's next for Logo Lens
1) Incorporating the Capital One API so users can simulate buying and selling stocks on the go.
2) Selecting the pieces of data most helpful to the user and presenting them with clean, easy-to-understand visualizations.