After talking with others at the Cal State University Fullerton LHD hackathon and attending the Introduction to Machine Learning workshop, I was inspired to build an image recognition app powered by machine learning.
What it does
AnimalRecognition is an image-based animal recognition app. It uses the CoreML framework to identify the animal in a photo and displays the result on an MKMapView.
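The classification step described above can be sketched with CoreML via the Vision framework. This is a minimal, hypothetical example, not the app's actual code: the model name (MobileNetV2) and the function name are stand-ins for whatever the app really ships.

```swift
import UIKit
import CoreML
import Vision

// Hypothetical helper: classifies a UIImage and reports the top label.
// MobileNetV2 is an assumed bundled model; substitute the app's own model.
func classifyAnimal(in image: UIImage, completion: @escaping (String?) -> Void) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MobileNetV2().model) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // Vision returns results as a loosely typed array;
        // downcast to classification observations before use.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier)
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

The returned identifier (e.g. "golden retriever") could then be shown as an annotation title on the MKMapView.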
How I built it
I built my iOS app with Xcode 9.1 and Swift 4.
Challenges I ran into
I ran into various problems with type downcasting: the app handles several different data types, switching between them was a hassle, and each conversion usually took me 10-15 minutes to get right.
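The downcasting friction above comes from APIs that hand back loosely typed values. A small sketch of the pattern, using a made-up `Classification` type in place of the framework's observation classes:

```swift
import Foundation

// Stand-in for a framework result type (hypothetical, for illustration).
struct Classification {
    let identifier: String
    let confidence: Double
}

// Results often arrive as [Any]?, so a conditional downcast with `as?`
// is needed before the concrete type's properties can be used.
func topLabel(from results: [Any]?) -> String? {
    // `as?` returns nil instead of crashing when the cast fails,
    // and guard-let keeps the failure path explicit.
    guard let observations = results as? [Classification],
          let best = observations.max(by: { $0.confidence < $1.confidence }) else {
        return nil
    }
    return best.identifier
}

let results: [Any] = [
    Classification(identifier: "dog", confidence: 0.92),
    Classification(identifier: "cat", confidence: 0.07),
]
print(topLabel(from: results) ?? "unknown")  // prints "dog"
```

Using `as?` with guard-let, rather than a force cast with `as!`, turns a bad cast into a recoverable nil instead of a runtime crash.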
Accomplishments that I'm proud of
After scouring various Apple Developer Documentation sites, I am proud to have picked up a new skill in the CoreML iOS framework.
What I learned
I learned that it takes many trials and brainstorming sessions to create a functioning app. After spending a good four hours brainstorming and revising, I came to appreciate how much time an app like this takes to build.
What's next for AnimalRecognition
I plan to add a list of previous searches and a description of each recognized animal, so the app is more user-oriented. I also plan to improve the app's design to make it more visually appealing.