Inspiration

Young girls often have fewer opportunities in science, technology, engineering, and math (STEM). As a young woman in high school myself, I feel the need to empower other young women to pursue, or discover, their interests in STEM, because diversity matters in the field. The persistent negative stereotypes about women in STEM, and the need to disprove them, inspired me to create my iOS app, Inspiration. Through the app, I wanted young girls to be able to explore STEM careers in an interactive way.

What it does

To combat the negative stereotypes about women in STEM, my app showcases influential women in STEM alongside engaging activities that let girls follow in their footsteps. Inspiration uses machine learning to turn everyday objects into an entry point: by identifying a day-to-day object, the app surfaces information about related STEM fields and some of the famous women pioneers in those fields, letting young girls explore different STEM careers through simple objects.

To use the app, girls point their phone's camera at an object and take a picture of it. Using machine learning (object detection/image labeling), the app identifies the object in the photo. It then displays relevant STEM careers involving that object and prompts the user to view an influential woman in one of those careers. The app shows the woman's accomplishments as well as interactive activities that young girls can pursue to spark their interest in the same career. For example, a girl might take a picture of a computer; the app would suggest careers such as web developer or software engineer, highlight Grace Hopper as the influential woman, and recommend coding activities such as an Hour of Code. Girls can save an influential woman's profile along with its activities so they can come back to them later, and the home page shows the profiles they have saved. Every day, the home page also features a new influential woman to learn about. (A rough sketch of this flow is shown below.)
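As a rough illustration of the flow described above (all type names and data here are hypothetical placeholders, not the app's actual code), a detected label can be mapped to careers, an influential woman, and suggested activities like this:

```swift
import Foundation

// Hypothetical data model: a profile suggestion tied to a detected object label.
struct InfluentialWoman {
    let name: String
    let career: String
    let accomplishments: [String]
    let activities: [String]
}

// Placeholder lookup table; the real app keeps this data in Firebase.
let suggestions: [String: InfluentialWoman] = [
    "computer": InfluentialWoman(
        name: "Grace Hopper",
        career: "Software Engineer",
        accomplishments: ["Built one of the first compilers", "Pioneered COBOL"],
        activities: ["Try an Hour of Code tutorial"]
    )
]

// Given a label produced by the image-labeling step, surface a profile to show.
func profile(forDetectedLabel label: String) -> InfluentialWoman? {
    suggestions[label.lowercased()]
}

// Saved profiles shown on the home page.
var savedProfiles: [InfluentialWoman] = []
if let match = profile(forDetectedLabel: "Computer") {
    savedProfiles.append(match)   // "save" the profile to revisit later
}
```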

How I built it

The iOS app is built with Xcode and SwiftUI. For the front end, I designed all of the UI in Sketch. For authentication, data persistence, and the machine learning API, I used Firebase as the back end. Image recognition relies on ML Kit Image Labeling's base model to predict which objects appear in the photos; a sketch of that call is below.
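As an approximation of what the labeling call looks like with the Firebase ML Kit SDK of that era (exact class names vary across SDK versions, so treat this as a sketch rather than the app's exact code):

```swift
import FirebaseMLVision
import UIKit

// Run ML Kit's base image-labeling model on a photo and hand back the top label.
// The completion handler receives the best label text (e.g. "Computer") or nil.
func detectObject(in uiImage: UIImage, completion: @escaping (String?) -> Void) {
    let labeler = Vision.vision().onDeviceImageLabeler()
    let visionImage = VisionImage(image: uiImage)

    labeler.process(visionImage) { labels, error in
        guard error == nil, let labels = labels, !labels.isEmpty else {
            completion(nil)
            return
        }
        // Each label carries a confidence score; take the highest-confidence one.
        let best = labels.max { ($0.confidence?.floatValue ?? 0) < ($1.confidence?.floatValue ?? 0) }
        completion(best?.text)
    }
}
```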

Challenges I ran into

Because SwiftUI is relatively new, it was hard to find documentation when I ran into problems. The first problem was switching between views in SwiftUI. I had used Swift and UIKit view controllers before, but I wasn't well versed in SwiftUI. To switch between views, I found a tutorial that used an ObservableObject (observed with @ObservedObject). While that worked for switching views, I couldn't figure out how to pass data between them; the resources I found on passing data didn't use this approach, so combining the two took some trial and error. After playing around for a bit, I ended up using the same ObservableObject both to switch views and to carry data between them, along the lines of the sketch below.
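A minimal sketch of that pattern, with hypothetical names (AppRouter, ProfileView) rather than the app's real types:

```swift
import SwiftUI

// A shared ObservableObject drives which view is shown and carries the data
// passed between views.
final class AppRouter: ObservableObject {
    @Published var showProfile = false      // which view to display
    @Published var selectedCareer = ""      // data shared between views
}

struct RootView: View {
    @ObservedObject var router = AppRouter()

    var body: some View {
        Group {
            if router.showProfile {
                ProfileView(router: router)
            } else {
                Button("Explore careers") {
                    router.selectedCareer = "Software Engineer"
                    router.showProfile = true   // switch views by mutating shared state
                }
            }
        }
    }
}

struct ProfileView: View {
    @ObservedObject var router: AppRouter

    var body: some View {
        Text("Career: \(router.selectedCareer)")   // data arrives via the same object
    }
}
```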

My main problem in this project was Xcode itself. I wanted to run the demo on my phone, but my phone's iOS version and my Xcode version were incompatible: my phone was on iOS 13.5, while my copy of Xcode only supported devices up to iOS 13.4. To run the app on my phone, I either had to downgrade the phone or update Xcode on my Mac. Unfortunately, I couldn't downgrade my phone, and I had run out of storage on my Mac, so I couldn't update Xcode either. This was a learning experience, and I'm disappointed that I couldn't run Inspiration on my phone. It also meant tailoring the project to the Mac's iPhone simulator. The simulator doesn't have a camera, but the main point of my app is for girls to take pictures of objects to explore different STEM careers. Since I couldn't use the iPhone's camera to take a picture, I adapted by pulling photos from the Photo Library instead (a sketch of that workaround is below). While this was a setback, I still integrated the machine learning image recognition into Inspiration, which ultimately worked out.
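Roughly, the Photo Library workaround wraps UIKit's UIImagePickerController for SwiftUI; the names below (PhotoLibraryPicker, selectedImage) are illustrative, not the app's actual code:

```swift
import SwiftUI
import UIKit

// Presents the Photo Library and hands the chosen image back to SwiftUI,
// standing in for the camera, which the simulator doesn't have.
struct PhotoLibraryPicker: UIViewControllerRepresentable {
    @Binding var selectedImage: UIImage?
    @Environment(\.presentationMode) private var presentationMode

    func makeUIViewController(context: Context) -> UIImagePickerController {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary   // .camera is unavailable in the simulator
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: UIImagePickerController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    final class Coordinator: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
        let parent: PhotoLibraryPicker
        init(_ parent: PhotoLibraryPicker) { self.parent = parent }

        func imagePickerController(_ picker: UIImagePickerController,
                                   didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
            parent.selectedImage = info[.originalImage] as? UIImage
            parent.presentationMode.wrappedValue.dismiss()
        }
    }
}
```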

Accomplishments that I'm proud of

I'm really proud that I finished this app alone and made it user friendly with a clean and intuitive UI. Two people reached out to me at the beginning of the hackathon about being my teammates, but they ended up joining a different team, so I was left to complete the app by myself. I wasn't sure I could do it in the limited time, so I'm really proud that the app works and that I persevered through the challenges I encountered. This was also my first time using ML Kit in Firebase, and I'm really glad it worked out.

What I learned

I learned how to develop an app using SwiftUI, Firebase, and the ML Kit API. I also learned how to fall back on the Photo Library instead of the camera when running in the simulator rather than on a real phone.

What's next for Inspiration

I hope to add more influential women to the database, and to implement web scraping to gather each woman's accomplishments automatically. Instead of relying on the ML Kit API's base model, I would also like to build my own machine learning model tailored to STEM objects. Since the base model only recognizes common, everyday items, I could use transfer learning: drop the model's last layer and train my own classification layer to detect specific STEM objects such as lab equipment and animals. A rough sketch of that idea follows.
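One way to prototype that kind of transfer learning on a Mac is Create ML, which keeps Apple's pretrained feature extractor and trains only the final classification layer on custom labeled folders. This is a different toolchain from the TensorFlow Lite path that ML Kit's custom models expect, so the sketch below (with made-up paths) is only meant to illustrate the idea:

```swift
import CreateML
import Foundation

// Train an image classifier on folders of STEM objects (one folder per label,
// e.g. "microscope/", "beaker/", "frog/"). Create ML reuses a pretrained
// feature extractor and fits only the final classification layer -- the
// "drop the last layer" idea described above. Paths are placeholders.
let trainingDir = URL(fileURLWithPath: "/path/to/stem-objects/train")
let trainingData = MLImageClassifier.DataSource.labeledDirectories(at: trainingDir)

let classifier = try MLImageClassifier(trainingData: trainingData)

// Save the trained model; it would still need conversion to TensorFlow Lite
// before it could be served through ML Kit as a custom model.
try classifier.write(to: URL(fileURLWithPath: "/path/to/STEMObjectClassifier.mlmodel"))
```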

Built With

Firebase, ML Kit, Sketch, Swift, SwiftUI, Xcode
