[Screenshots: a search for a specific object in your photos (here, "laptop"); the camera interface; the main grid of all your photos.]
Located at table B1
Inspiration
The inspiration behind AISee was to combine the skills of everyone on the team while also learning platforms we had never developed on before.
What it does
We think data security is more important now than ever, so we wanted to create a self-hosted, self-managed alternative to a very popular platform (Google Photos), so that YOU manage your personal data. We also wanted to make sure the features of the popular platforms didn't get left out when you use AISee. We combined Amazon Alexa and IBM Watson Visual Recognition to let you voice-search your photos hands-free: Watson tags your photos with relevant data, so the next time you go looking for those dog pictures you can say "Alexa, show me my dog pictures" and they'll be on your screen before you know it.
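The tag lookup behind that voice search can be sketched roughly like this (a simplified standalone sketch, not our actual code; the `TagSearch` class and its method names are hypothetical):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: match a spoken query word against the tags
// Watson assigned to each photo, returning the matching filenames.
public class TagSearch {
    // photoTags maps a photo filename to its Watson-assigned labels.
    public static List<String> search(Map<String, List<String>> photoTags, String query) {
        List<String> matches = new ArrayList<>();
        String needle = query.toLowerCase();
        for (Map.Entry<String, List<String>> entry : photoTags.entrySet()) {
            for (String tag : entry.getValue()) {
                if (tag.toLowerCase().contains(needle)) {
                    matches.add(entry.getKey());
                    break; // one matching tag is enough for this photo
                }
            }
        }
        return matches;
    }
}
```

So a query like "dog" surfaces every photo that carries a dog-related tag, no matter what the file is named.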
How we built it
As mentioned above, we built this with IBM Watson and Amazon Alexa; for the web and mobile apps we used HTML5, PHP, Swift, Java, and JSON. A Raspberry Pi serves as our web server, hosting all the pictures and tag data that make searching effortless.
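One simple way to keep tag data next to the pictures on the server, sketched here as a hypothetical sidecar-file scheme (not necessarily how our PHP backend stores it):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Hypothetical sketch: store each photo's tags in a plain-text sidecar
// file next to the image, one tag per line, so any part of the stack
// (PHP, Swift, Java) can read it back without a database.
public class TagStore {
    public static void saveTags(Path photo, List<String> tags) throws Exception {
        Path sidecar = photo.resolveSibling(photo.getFileName() + ".tags");
        Files.write(sidecar, tags);
    }

    public static List<String> loadTags(Path photo) throws Exception {
        Path sidecar = photo.resolveSibling(photo.getFileName() + ".tags");
        return Files.readAllLines(sidecar);
    }
}
```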
Challenges we ran into
IBM Watson has a broken Java SDK, so I spent about three hours making an app that just didn't work because of an HTTP 500 error from the Watson servers, but I didn't let that stop me. We kept looking at other platforms and found the Watson iOS SDK, which is written in Swift and was much more up to date than the Java SDK. Unfortunately we still ran into issues with Watson: the returned image classifier data was NOT in JSON format, just one big block of text, which made it very difficult to parse out the data we needed to pass along with the picture.
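Parsing that kind of unstructured block came down to splitting it apart by hand. A hypothetical sketch, assuming the text was roughly alternating label/score pairs (the actual Watson output format we saw may have differed):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: turn a whitespace-separated block like
// "dog 0.93 laptop 0.41" into label -> confidence pairs.
public class ClassifierBlockParser {
    public static Map<String, Double> parse(String block) {
        Map<String, Double> scores = new LinkedHashMap<>();
        String[] tokens = block.trim().split("\\s+");
        for (int i = 0; i + 1 < tokens.length; i += 2) {
            scores.put(tokens[i], Double.parseDouble(tokens[i + 1]));
        }
        return scores;
    }
}
```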
The Amazon Alexa SDK has been very temperamental on UC's network during RevolutionUC: it works one minute, and the next minute it won't. We are trying to iron out the stability of the Alexa API so we can provide a consistently great experience.
We also struggled with styling the web platform, as the CSS files were basically doing the opposite of what we needed them to do. With the help of mentors we were able to straighten out all the bugs we encountered on the website!
Accomplishments that we're proud of
Despite all the setbacks with the platforms we were working with, we managed to get everything up and working as expected. The thing we are proudest of is that we all learned new platforms during this event.
What we learned
I (Skyler Martin) learned how to work with the IBM Watson SDK and further developed my skills with Swift. Aidan Hembree got a lot more experience with PHP scripting and HTML5. Henry Griffiths set up the web server we all worked off of during the event and got our Raspberry Pi talking to the Amazon Alexa servers for voice searches within the AISee application. Eli Garcia focused on creating and managing the database where we store all our pictures and contextual data.
What's next for AISee
A better user interface, the ability to edit tags on photos, and search by geolocation.