We believe today's technology is pulling people into the virtual world. We no longer appreciate what is around us. As a result, technology is costing people their lives in the car, on the street, and in many other places. Instead of going into the virtual world, we want to bring technology out of the virtual world and into our reality.

What it does

It is a pair of glasses with an intelligent support system. It uses a voice interface to communicate with the user, and since the glasses emit only audio, the device can be used anywhere. The intelligent support system is powered by our own self-built A.I. and by Amazon Alexa. Features A.I. Tap can perform include taking pictures, posting text to Facebook, checking people's emotional state, and reporting the weather. These are just some of its main features; it can do a lot more.

How we built it

We use a Raspberry Pi as the device's main computer. We created the A.I. using the Microsoft API, which transcribes the user's speech to text; the A.I. then matches the user's input against the glasses' skill set. If the glasses do not have a matching skill, the A.I. passes the input to Amazon Alexa, which tries to handle the command. Users can give input to A.I. Tap in two different ways: speech, and a push button located on the left side of the glasses.

We used the Microsoft, Twitter, and Facebook APIs to build the device's main skills. The Microsoft API enables our device to read people's emotions, the Twitter API gives A.I. Tap the ability to read recent tweets, and the Facebook API lets A.I. Tap post pictures and text to our Facebook page. The Twitter skill is programmed in the Amazon Alexa console, which demonstrates that our glasses can use Alexa skills we program ourselves. We used AWS Lambda and Node.js to program the Alexa skills.

Challenges we ran into

Figuring out how to program Amazon Alexa and how to integrate Alexa into the Raspberry Pi. Understanding how the Facebook, Twitter, and Microsoft APIs work. Getting all these different technologies to work together was our biggest challenge.

Accomplishments that we're proud of

Finishing both the software and the hardware within 24 hours and getting it all to work.

What we learned

We learned how to integrate different technologies, how to use different APIs, and more about Node.js.

What's next for A.I. Tap

We want to expand the device's feature set. We believe this device could eventually replace the smartphone.
