What it does
Provides a summary of your day (connecting with native iOS features like Reminders and HealthKit) through a basic natural language processing model on iOS. Uses iBeacons to mark your home.
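A daily summary along those lines could be assembled roughly like this (a minimal pure-Swift sketch; the `Reminder` and `HealthSnapshot` types are hypothetical stand-ins for what the Reminders and HealthKit frameworks would actually supply):

```swift
import Foundation

// Hypothetical stand-ins for data pulled from Reminders and HealthKit.
struct Reminder {
    let title: String
    let isCompleted: Bool
}

struct HealthSnapshot {
    let steps: Int
}

// Combine the two sources into the kind of one-line summary
// Arthur might read back to the user.
func dailySummary(reminders: [Reminder], health: HealthSnapshot) -> String {
    let pending = reminders.filter { !$0.isCompleted }.count
    return "You have \(pending) reminder(s) left today and you've taken \(health.steps) steps."
}
```

The real app would feed this from EventKit and HealthKit queries rather than plain structs, but the shape of the summary step is the same.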
How we built it
Both of us were pretty familiar with iOS. We used Carthage for dependency management and RxSwift to handle asynchronous calls in an MVVM architecture. We leaned heavily on native iOS frameworks, along with OpenEars for processing microphone input. A bit of persistence runs in the background, and since it's built on Realm it acts as a stand-in for a fast RESTful API. At the moment, Arthur only accepts variations of "yes" and "no" in his language model, but it wouldn't be too difficult to make him more sophisticated given the time. I'd like to think this app is a solid mixture of a robust networking layer (or in this case, a persistence layer that pretends to be one), good native framework usage, beautiful frontend design, and clean, professional code. Go ahead and browse the repo!
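Arthur's yes/no language model can be thought of as a small intent matcher over the phrases OpenEars recognizes. Here's a minimal pure-Swift sketch of that idea (the word lists and function name are illustrative, not the actual OpenEars integration):

```swift
import Foundation

// The intents Arthur currently understands.
enum Intent {
    case yes, no, unknown
}

// Hypothetical matcher: OpenEars hands us a recognized phrase,
// and we map variations of "yes"/"no" onto an intent.
func intent(from recognizedPhrase: String) -> Intent {
    let yesWords: Set<String> = ["yes", "yeah", "yep", "sure", "okay"]
    let noWords: Set<String> = ["no", "nope", "nah", "negative"]

    let normalized = recognizedPhrase
        .lowercased()
        .trimmingCharacters(in: .whitespacesAndNewlines)

    if yesWords.contains(normalized) { return .yes }
    if noWords.contains(normalized) { return .no }
    return .unknown
}
```

Making Arthur "more sophisticated" is then mostly a matter of growing the vocabulary and the set of intents.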
Challenges we ran into
Dependency management. We switched between Carthage and CocoaPods for a while before settling on one, which left the project stunted until we refactored.
Accomplishments that we're proud of
The app itself is gorgeous, and frankly fairly sophisticated under the hood. The codebase is clean aside from a few rushed features, and although we didn't get to everything we wanted to, it would be easy to pick this app back up and keep building.
What we learned
Carthage or CocoaPods: just pick one and go.
What's next for Arthur
It's very easy to integrate almost any third-party API into this app, and many of them would make sense here. We never reached the point where it was worth bypassing our core features to chase an API prize, but I think we can keep expanding Arthur until he reaches the point we originally hoped he'd hit by the end of the hackathon. Definitely App Store material.