After seeing the capabilities of the new Microsoft Cognitive Services API, we thought that building an app with this technology could be a good idea, so we decided to build a React Native app that uses the Emotion API exposed by Microsoft.

What it does

The app lets the user easily log in, take a selfie, receive a score for their smile with the help of the Microsoft API, and share it with other users.

How we built it

We built the app with Exponent, a set of tools on top of React Native that allows native development (iOS/Android) using JavaScript.
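The smile-scoring flow sends the selfie to the Emotion API and reads the happiness score from the response. Below is a minimal sketch of that idea; the endpoint URL and header follow the public Emotion API v1.0 docs, while `scoreSelfie`, `smileScore`, `apiKey`, and `imageBytes` are our own illustrative names, not the project's actual code.

```javascript
// Endpoint and auth header as documented for the Emotion API v1.0.
const EMOTION_URL =
  'https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize';

// The API returns an array of detected faces, each with a `scores` object
// of emotions in the 0..1 range. Turn that into a 0-100 smile score.
function smileScore(faces) {
  if (!faces || faces.length === 0) return 0; // no face detected
  return Math.round(faces[0].scores.happiness * 100);
}

// Post the raw image bytes and score the first face found.
async function scoreSelfie(imageBytes, apiKey) {
  const res = await fetch(EMOTION_URL, {
    method: 'POST',
    headers: {
      'Ocp-Apim-Subscription-Key': apiKey,        // Cognitive Services key
      'Content-Type': 'application/octet-stream', // raw image body
    },
    body: imageBytes,
  });
  return smileScore(await res.json());
}
```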

Challenges we ran into

  • Organizing all the tasks we needed to do as a team to finish the app
  • Using technologies that were new to us, such as Firebase Storage
  • Organizing the code in a short amount of time
  • Combining the different APIs

Accomplishments that we're proud of

Having a functional application that does what we intended in the mockups.

What we learned

We learnt:

  • Firebase Storage
  • The React Navigation library
  • Working as a team to accomplish the goal
  • Combining the multiple APIs used by the project
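For sharing, a scored selfie has to be stored somewhere the other users can see it, which is where Firebase Storage comes in. Here is a small sketch of how that could look with the Firebase Web SDK of the time (v3-style API); the storage path, the `feed` node, and the helper names are our own assumptions, not taken from the project's code.

```javascript
// Build a unique storage path per user and capture time, e.g.
// "selfies/alice/1500000000000.jpg".
function selfiePath(userId, takenAt) {
  return `selfies/${userId}/${takenAt.getTime()}.jpg`;
}

// Upload the image blob to Firebase Storage, then record the selfie and
// its smile score in the Realtime Database so other users can see it.
// `firebase` is an initialized Firebase app instance.
async function shareSelfie(firebase, userId, blob, score) {
  const path = selfiePath(userId, new Date());
  const snapshot = await firebase.storage().ref(path).put(blob);
  await firebase.database().ref('feed').push({
    user: userId,
    score,                     // 0-100 smile score from the Emotion API
    url: snapshot.downloadURL, // URL of the uploaded selfie
  });
}
```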

What's next for FeelShare

  • Refactor all the code
  • Add security in the client & server
  • Make the app offline-first
  • Add unit & integration tests (Jest & Appium)
  • Add type checking with Flow

We'd also love to make the app multiplatform (it currently runs only on iOS), add social features like "share on Facebook", and publish it in the different app stores.

Built With

  • React Native (Exponent)
  • JavaScript
  • Firebase Storage
  • Microsoft Emotion API
