I really liked the idea behind GenomeLink: it can teach you about yourself. But as I read through all of the traits it can determine, I realized I am more interested in how they apply to my world. Drawing a connection between the things around me and my own genome is much more powerful than just reading about traits in isolation.

What it does

This app is meant to be as simple as possible from the user's perspective. On first launch, it asks the user for permission to access the traits it needs. After that, the user can take a picture of any object around them. When they tap the "Genomify" button, the image is uploaded to Google Cloud Vision to determine what objects are present, the recognized objects are matched to related traits, and those traits are fetched from GenomeLink to show how the objects connect with the person's genome. The app then displays all of the relevant information.
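At a high level, the flow above can be sketched as follows. This is a minimal illustration, not the app's actual code: the `vision_client`, `genome_client`, and `trait_map` objects are hypothetical stand-ins for the real Google Cloud Vision client, the GenomeLink-backed API, and the label-to-trait table.

```python
def genomify(image_bytes, vision_client, genome_client, trait_map):
    """Run one photo through the Genomify pipeline described above.

    vision_client and genome_client are hypothetical stand-ins for the
    real Google Cloud Vision client and the Flask API wrapping GenomeLink.
    """
    # 1. Ask Cloud Vision what objects appear in the photo.
    labels = vision_client.detect_labels(image_bytes)

    # 2. Map recognized objects to the GenomeLink traits they relate to.
    traits = {trait_map[label] for label in labels if label in trait_map}

    # 3. Fetch each relevant trait report so the app can display it.
    return {trait: genome_client.get_trait(trait) for trait in traits}
```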

How I built it

I used two main services, GenomeLink and Google Cloud Vision. The app was built natively with Android Studio. The GenomeLink integration, with OAuth support, was built using Flask and hosted on Heroku. The Android app would load the URL for the GenomeLink OAuth flow to get the user logged in, then make GET requests to the Flask API to fetch genome information. The Cloud Vision API was used by uploading an image taken by the device camera and receiving the contents of the image, along with the confidence that those items were correctly identified.
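Since Cloud Vision returns each label together with a confidence score, the app only needs to keep the labels it can trust. A small helper sketching that filter (the `(description, score)` pair shape and the 0.7 threshold are illustrative; the real API returns annotation objects with `.description` and `.score` fields):

```python
def confident_labels(annotations, min_score=0.7):
    """Keep only the labels Cloud Vision is reasonably sure about.

    `annotations` mimics Vision's label annotations as
    (description, score) pairs; the threshold is illustrative.
    """
    return [description for description, score in annotations
            if score >= min_score]
```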

Unfortunately, there are sensitive API keys (Google Cloud) in the source files, so I have made it a private repository and have only allowed GenomeLink to access it :(.

Challenges I ran into

I felt that a native Android app would be much more powerful than a web app for my idea, so the biggest challenge was integrating GenomeLink, which was meant for web apps, into a native app. I ended up modifying the GenomeLink OAuth Flask app to support a new kind of workflow. Essentially it worked like this:

1) The Genomify app would launch a WebView that opens the GenomeLink OAuth page. The Flask app using GenomeLink was hosted on Heroku.

2) The WebView would then track URL changes. Once the URL in the WebView changed to the callback URL, the app would know the user was authenticated and close the WebView.

3) Whenever the app required new information about the genome, such as the "anger" trait, it would make a GET request to the Flask API and use the response to display the relevant information.
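The original snippet for step 3 didn't survive in this writeup; as a hedged reconstruction, the request might have looked like the Python-equivalent sketch below. The Heroku hostname, the `/traits/<name>` endpoint path, and the JSON field names are all assumptions, not the app's actual code.

```python
import json
from urllib.request import urlopen

API_BASE = "https://genomify-oauth.herokuapp.com"  # hypothetical host name

def parse_trait(payload):
    """Extract the fields the app displays; the field names are assumed."""
    data = json.loads(payload)
    return {"trait": data["trait"], "score": data["score"]}

def fetch_trait(trait_name):
    """GET one trait report (e.g. "anger") from the Flask API on Heroku."""
    with urlopen(f"{API_BASE}/traits/{trait_name}") as response:
        return parse_trait(response.read())
```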

Another challenge was getting the Google Cloud Vision API working. It was my first time using it, and it was challenging to understand at first.

The last challenge I ran into was creating the library of items that would connect to a relevant trait. For example, a cake would relate to Blood-Glucose, or eggs would relate to egg-allergy. This simply took a lot of time and testing to get right!
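A minimal version of that library is just a dictionary from Cloud Vision labels to GenomeLink trait names. Only the two pairings mentioned above are shown here (the exact trait identifiers are assumptions); the real table was larger and took the trial and error described:

```python
# Maps object labels from Cloud Vision to GenomeLink trait names.
# Only the two examples from the text are shown; the real table was larger.
LABEL_TO_TRAIT = {
    "cake": "blood-glucose",
    "egg": "egg-allergy",
}

def traits_for(labels):
    """Return the sorted set of traits relevant to the recognized labels."""
    return sorted({LABEL_TO_TRAIT[label] for label in labels
                   if label in LABEL_TO_TRAIT})
```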

Accomplishments that I'm proud of

The final app really got me excited. Creating something that I believe I would actually use in my life is a great feeling.

I am also really proud of getting the GenomeLink API working well with a native Android app; I was unsure how it would turn out, since there were no existing examples of that setup. Seeing the flow I created work reliably was awesome!

What I learned

I learned how to use GenomeLink as well as Google Cloud Vision, as both were firsts for me. I also learned how to edit a video with much higher production value than I usually manage, by using After Effects for the first time.

What's next for Genomify

Fixing the remaining small bugs in the application, then building finished Android and iOS apps to make it available to the general public.

Built With

android-studio, flask, genomelink, google-cloud-vision, heroku
