After seeing the variety of services Microsoft provides, we wanted to use a couple of them in our project, and came up with JamCam. We believe JamCam could help people traveling in other countries: instead of needing to ask what something is, they can just take a picture of it and show the translation to a non-English speaker. It could also help with practicing other languages.

What it does

Take a photo, let the app analyze what you're seeing, and let the Translator API translate the result into a language of your choice.
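That two-step flow (describe the photo, then translate the caption) can be sketched as two HTTP requests. The endpoint URLs, header names, and body shapes below are assumptions based on how Azure Cognitive Services requests are typically formed, not necessarily the exact ones we used:

```javascript
// Sketch of the photo → caption → translation pipeline.
// Endpoints and key header are assumptions, not our exact configuration.
const VISION_URL = "https://api.cognitive.microsoft.com/vision/v2.0/describe";
const TRANSLATE_URL = "https://api.cognitive.microsofttranslator.com/translate";

// Build the Computer Vision "describe" request for a raw image.
function buildDescribeRequest(imageBytes, apiKey) {
  return {
    url: VISION_URL,
    options: {
      method: "POST",
      headers: {
        "Ocp-Apim-Subscription-Key": apiKey,
        "Content-Type": "application/octet-stream",
      },
      body: imageBytes,
    },
  };
}

// Build the Translator request; the target language goes in the query string.
function buildTranslateRequest(text, toLang, apiKey) {
  return {
    url: `${TRANSLATE_URL}?api-version=3.0&to=${encodeURIComponent(toLang)}`,
    options: {
      method: "POST",
      headers: {
        "Ocp-Apim-Subscription-Key": apiKey,
        "Content-Type": "application/json",
      },
      body: JSON.stringify([{ Text: text }]),
    },
  };
}
```

Each request object can then be passed to `fetch(req.url, req.options)` from the app.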

How we built it

We started with a React Native boilerplate and got the React Native camera API to function properly, integrated the Microsoft APIs, and played with a new technology called Lottie by Airbnb.
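The glue between the camera output and the Vision API mostly comes down to pulling the best caption out of the analysis response. A sketch of that step (the response shape here follows the Computer Vision "describe" format, but field names should be checked against the docs):

```javascript
// Pick the highest-confidence caption out of a Computer Vision
// "describe"-style response; returns null if no caption is present.
function topCaption(analysis) {
  const captions =
    (analysis && analysis.description && analysis.description.captions) || [];
  let best = null;
  for (const c of captions) {
    if (!best || c.confidence > best.confidence) best = c;
  }
  return best ? best.text : null;
}

// Trimmed example of the response shape this assumes:
const sample = {
  description: {
    captions: [
      { text: "a red apple on a table", confidence: 0.92 },
      { text: "a piece of fruit", confidence: 0.61 },
    ],
  },
};
console.log(topCaption(sample)); // → "a red apple on a table"
```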

Challenges we ran into

A few of us are familiar with React, but React Native still had some differences. Integrating Redux was quick but hard to understand. Also, the documentation on the Microsoft APIs wasn't great, so we had to fiddle around to get them working with our app.
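The part of Redux that took time to click is that it boils down to a reducer folding actions into state. A minimal plain-JavaScript version of the pattern (the action types and state fields here are illustrative, not the ones from our actual store):

```javascript
// Minimal Redux-style reducer for photo/translation state.
// Action types are illustrative, not our real ones.
const initialState = { caption: null, translation: null, loading: false };

function reducer(state = initialState, action) {
  switch (action.type) {
    case "ANALYZE_START":
      return { ...state, loading: true };
    case "ANALYZE_DONE":
      return { ...state, loading: false, caption: action.caption };
    case "TRANSLATE_DONE":
      return { ...state, translation: action.translation };
    default:
      return state;
  }
}

// Dispatching is just folding actions through the reducer:
let state = reducer(undefined, { type: "@@INIT" });
state = reducer(state, { type: "ANALYZE_START" });
state = reducer(state, { type: "ANALYZE_DONE", caption: "a red apple" });
console.log(state.caption, state.loading); // → "a red apple" false
```

Redux's `createStore` wraps exactly this fold and adds subscriptions on top.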

Accomplishments that we're proud of

We built a fully functional, cross-platform app that uses low-level native functionality as well as external APIs, all in JavaScript.

What we learned

Computer vision is real, and it has tons of potential. React Native is a great technology for quickly whipping up great-looking apps while still retaining tons of functionality.

What's next for JamCam

Right now, JamCam only translates from English into other languages, but if we want to target non-English speakers, we should localize the entire app as well.
