Inspiration
As busy college students, we consistently struggled to manage our time and build habits that would benefit us in the long run. For example, one of us had trouble studying efficiently, one failed to keep a balanced diet, and one regularly overslept and missed class. Brought together from across the globe by a shared desire to accomplish more with less, we combine diverse backgrounds with a unified goal: unlocking people's potential by providing external motivation through our iOS application.
What it does
We believe people find fulfillment in achieving the goals they set. Accomplishing 100% of your goals is hard; however, there is a way: putting down a deposit on your goal. Staking money on your dream carries real weight. To avoid losing the deposit, users are incentivized to finish what they started, with the prospect of a small bonus for a 100% completion rate. Lack of motivation is a crucial problem that nobody is tackling effectively. By building up the experience of achieving small goals through deposits with Everest, users become ready to reach the unreachable.
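The deposit incentive described above can be sketched in a few lines of Swift. This is a minimal illustration, not our actual implementation: the type name `GoalDeposit`, the `payout` function, and the 5% bonus rate are all hypothetical choices made for the example.

```swift
// Hypothetical sketch of the deposit incentive: a user stakes money on a
// goal, gets it back in proportion to tasks completed, and earns a small
// bonus only at a 100% completion rate. Names and rates are illustrative.
struct GoalDeposit {
    let amount: Double     // money staked on the goal
    let bonusRate: Double  // e.g. 0.05 = 5% bonus for full completion

    /// Payout when the goal period ends.
    /// - Parameters:
    ///   - completed: tasks the user finished
    ///   - total: tasks the goal required
    func payout(completed: Int, total: Int) -> Double {
        guard total > 0 else { return amount }
        let rate = Double(completed) / Double(total)
        // Full completion returns the deposit plus the bonus;
        // partial completion forfeits the unfinished share.
        return rate >= 1.0 ? amount * (1 + bonusRate) : amount * rate
    }
}

let deposit = GoalDeposit(amount: 20, bonusRate: 0.05)
print(deposit.payout(completed: 10, total: 10))  // deposit plus bonus
print(deposit.payout(completed: 7, total: 10))   // partial refund only
```

The asymmetry is the point: partial completion only recovers part of the stake, while the bonus is reserved for a perfect record, so the marginal incentive is strongest near the finish line.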
How we built it
We built the iOS application with Swift and Xcode 11, integrating the Mapbox API, optical character recognition, and machine learning.
Challenges we ran into
Since we were learning Swift and Xcode 11 from scratch, we ran into multiple issues when implementing the various APIs, optical character recognition, and convolutional neural networks. We leaned heavily on developer documentation, which gave us an opportunity to grow as developers.
Accomplishments that we're proud of
Everest lets you reach the unreachable through its core functionality. Its private and public competitions let you set and accomplish goals with your friends and the world through a peer-based verification system. Using machine learning (a convolutional neural network) and computer vision techniques, competitions can be verified automatically from uploaded photos or videos, such as checking a wake-up time. Additionally, we implemented optical character recognition (OCR) to verify screen-time limits by extracting text from user screenshots. Lastly, we integrated the Mapbox API to verify that the user reached the task's target location.
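To make the screen-time verification concrete, here is a sketch of the step that follows OCR: parsing a duration such as "3h 24m" out of the recognized screenshot text and checking it against the user's limit. The OCR itself (done with platform frameworks) is omitted, and the function names and regex are hypothetical, not taken from our codebase.

```swift
import Foundation

// Hypothetical sketch: after OCR extracts text from a Screen Time
// screenshot, parse the reported duration and compare it to the goal.
func screenTimeMinutes(from ocrText: String) -> Int? {
    // Find the first integer immediately followed by the given unit
    // letter, e.g. "3h" or "24m", anywhere in the recognized text.
    func firstNumber(before unit: String) -> Int? {
        let pattern = "(\\d+)\\s*" + unit
        guard let regex = try? NSRegularExpression(pattern: pattern),
              let match = regex.firstMatch(
                  in: ocrText,
                  range: NSRange(ocrText.startIndex..., in: ocrText)),
              let range = Range(match.range(at: 1), in: ocrText)
        else { return nil }
        return Int(ocrText[range])
    }
    let hours = firstNumber(before: "h")
    let minutes = firstNumber(before: "m")
    if hours == nil && minutes == nil { return nil }  // nothing recognized
    return (hours ?? 0) * 60 + (minutes ?? 0)
}

// Goal verified if the recorded screen time is at or below the limit;
// unreadable screenshots fail verification rather than pass silently.
func goalMet(ocrText: String, limitMinutes: Int) -> Bool {
    guard let used = screenTimeMinutes(from: ocrText) else { return false }
    return used <= limitMinutes
}
```

For example, `goalMet(ocrText: "Daily Average 3h 24m", limitMinutes: 240)` succeeds (204 minutes used), while a screenshot reading "5h" against the same limit fails. Failing closed on unparseable text is a deliberate choice: a noisy OCR result should never count as proof of success.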
What we learned
We learned how to use Swift and Xcode 11 from the ground up. We also learned how to apply technologies currently used in industry, namely optical character recognition, convolutional neural networks, and third-party API integration.
What's next for Everest
We'll develop a fully functioning application and launch it on the Apple App Store. We are motivated to scale our product.
Built With
- api
- computer-vision
- google-cloud
- machine-learning
- ocr
- swift
- xcode