Inspiration
I wanted to get the t-shirt my friend was wearing but didn't feel like searching for the product on the web, so we came up with the idea of using augmented reality: just point your phone's camera at the product and run multiple applications on that data.
What it does
Seeget lets users scan products by their logos and text. We then use the NEC image-recognition API to determine what the item is. Once we have a match, we display information about it. This is where AR comes into play: the app projects the item onto the user in augmented reality so they get a better sense of the fit, and lets them touch the projection to feel the product's material through the Tanvas hardware. The user can then take a picture with the item to get a better feel for how it would look on them, both physically and visually.
How we built it
We built the application on the Android Java SDK and embedded two different APIs: Tanvas and NEC. We use image recognition to identify different types of clothing and brands, then use the Tanvas API to add touch functionality at the end, just before the user purchases the item.
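The recognize-then-touch flow can be sketched roughly as below. This is a minimal plain-Java sketch with stubbed-out calls: `recognizeProduct` stands in for the NEC image-recognition request and `hapticTextureFor` for selecting a Tanvas texture, and both names, the feature strings, and the catalog entries are hypothetical, not the real SDK APIs.

```java
import java.util.Map;

public class SeegetPipeline {
    // Stub for the NEC image-recognition call: maps features extracted
    // from a camera frame (logo/text) to a product label.
    // Hypothetical; the real API takes image data over the network.
    static String recognizeProduct(String frameFeatures) {
        Map<String, String> catalog = Map.of(
                "swoosh-logo", "Nike T-Shirt",
                "three-stripes", "Adidas Hoodie");
        return catalog.getOrDefault(frameFeatures, "unknown");
    }

    // Stub for picking the haptic texture the Tanvas layer should render
    // for a matched product. Hypothetical; the real SDK exposes its own
    // texture objects bound to on-screen regions.
    static String hapticTextureFor(String product) {
        return product.equals("unknown") ? "none" : "cotton-weave";
    }

    public static void main(String[] args) {
        // Frame features -> product match -> haptic texture for the AR view.
        String product = recognizeProduct("swoosh-logo");
        System.out.println(product + " / " + hapticTextureFor(product));
    }
}
```

In the app itself, the recognition step runs on camera frames and the texture is applied to the AR projection of the product, so the user feels the material while seeing the fit.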
Challenges we ran into
The hardest part was learning the new APIs for pixel recognition in augmented reality and combining them with the Tanvas hardware so users could feel the texture of the product's material.
Accomplishments that we're proud of
Learning a new API in 13 hours and then ending up developing our own API on a different platform.
What we learned
What's next for SEEGET
We plan to turn this into a business. Our monetization strategies include a B2B subscription-based service and an in-app purchase fee, and in the future we will partner with Tanvas to bring haptic screens to phones. We also plan to incorporate machine-learning algorithms to predict future trends.