We took inspiration from the various "Wear a mask" posters posted across the UB campus. We believed that, to promote a COVID-safe environment, whether on the UB campus or in an office building, all occupants should wear a mask and remain socially distant. However, the last thing anybody wants to do is patrol a building looking for mask-code violators, or, even worse, get into a confrontation with a customer who refuses to wear a mask. Our mission was to develop a smart ecosystem to combat this problem.

What it does

EnterFace is a software ecosystem aimed at creating a COVID-safe environment. With our ecosystem, each room of a building, along with its number of occupants, is registered and connected to a central database.

Room access is granted only to guests who are wearing masks, and only while the room is below capacity. Using our iOS app and website, users can check the current occupancy of any room they would like to enter.

At the door to each room there will be a QR code that the user needs to scan. Scanning it opens the interface for our image classification integration. To gain access, the user scans their face with our app to show that they are wearing a mask; the app uses a machine learning model to classify the scan as masked or unmasked. (And, of course, no scans are saved, for your privacy :]). If the user is wearing a mask, the door is unlocked for them and they are tallied as an occupant in the database.

If the room is already at max occupancy, no new guests will be allowed in until someone inside the room leaves.
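The entry rule described above can be sketched in a few lines of Swift. This is a minimal illustration of the logic, not our actual code; names like `Room` and `canEnter` are hypothetical:

```swift
// Minimal sketch of the door-entry rule: admit only masked guests
// while the room is below its maximum occupancy.
struct Room {
    var current: Int   // current number of occupants
    let max: Int       // maximum allowed occupancy
}

func canEnter(_ room: Room, wearingMask: Bool) -> Bool {
    // Both conditions must hold: mask detected AND room not full.
    return wearingMask && room.current < room.max
}

var lab = Room(current: 3, max: 4)
if canEnter(lab, wearingMask: true) {
    lab.current += 1   // tally the new occupant (mirrored to Firebase in the real app)
}
```

In the real system, the occupancy check and the tally would of course run against the central Firebase database rather than a local struct.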

How we built it

Our ecosystem has three main components: the central database, a web-hosted map, and an iOS app with an image classification model.

The database runs on Google's Firebase platform. It contains an entry for each room in our building, keeping track of its current and maximum occupancy. Our web-hosted map and iOS app both interface with Firebase to update their client-side data in real time.
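As a rough illustration of that per-room structure (the room IDs and field names here are hypothetical, not our actual schema), the Firebase data might be shaped like this:

```json
{
  "rooms": {
    "davis-101": { "name": "Davis 101", "current": 3, "max": 4 },
    "davis-113": { "name": "Davis 113", "current": 0, "max": 6 }
  }
}
```

Both clients subscribe to the `rooms` node, so any change to a `current` count propagates to the map and the app immediately.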

The web-hosted app uses the Google Maps JavaScript API to display the floorplan of the building. A marker is placed at each room, displaying the room name, its current capacity, and a picture of the room. The occupancy of each room is updated in real time by pulling data from Firebase. The website is hosted using Firebase's hosting service.

The iOS app is written entirely in Swift. For the QR code reader, we used AVFoundation to capture the metadata and instantaneously convert it into a readable code. Next, we used Create ML and MaskedFace-Net, an open-source dataset of images of people wearing and not wearing masks, to train our classification model. Finally, we integrated Firebase to connect our app with the web app and the internet, which will help with the future scalability of our ecosystem.
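The AVFoundation side of the QR scan follows the standard capture-session pattern. This is a hedged sketch of that pattern, not our exact implementation; the `onCode` callback and class name are illustrative:

```swift
import AVFoundation

// Sketch of QR detection with AVFoundation: a capture session feeds
// camera frames to an AVCaptureMetadataOutput restricted to QR codes.
final class QRScanner: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    private let session = AVCaptureSession()
    var onCode: ((String) -> Void)?   // called with the decoded string

    func start() {
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.qr]   // only detect QR codes
        session.startRunning()
    }

    // Delegate callback: fired whenever a QR code is recognized in frame.
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        guard let obj = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
              let value = obj.stringValue else { return }
        onCode?(value)   // e.g. the room ID encoded in the door's QR code
    }
}
```

The decoded string can then be used to look up the corresponding room record in Firebase before the mask check runs.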

Challenges we ran into

By default, the Google Maps platform repeatedly tiled the image of our floorplan along the x-axis. While this is useful for geographic maps, it was confusing to look at for a floorplan. We solved this by increasing the default zoom of the map; the image is still tiled, but it is not noticeable unless the user deliberately looks for it.

We had some issues with embedding the website in the iOS app, as some sprites appeared corrupted. The website worked fine in a browser, but not in the app. After some frustration, we realized that we needed to roll back the website on Firebase Hosting and manually clear the cache on the iPhone.

Furthermore, for our machine learning model, we had to resize images taken by the phone and convert them into a CVPixelBuffer before they could be analyzed. This was a little difficult, as iOS's native libraries do not support this conversion very well.
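The conversion step can be sketched with the common Core Video + Core Graphics pattern below. This is an illustrative helper under our own assumptions (a 32BGRA buffer at the model's input size), not necessarily the exact code we shipped:

```swift
import UIKit
import CoreVideo

// Sketch: scale a UIImage to the model's expected input size and
// draw it into a freshly allocated 32BGRA CVPixelBuffer.
func makePixelBuffer(from image: UIImage, width: Int, height: Int) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: true,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: true] as CFDictionary
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32BGRA, attrs, &buffer) == kCVReturnSuccess,
          let pb = buffer else { return nil }

    CVPixelBufferLockBaseAddress(pb, [])
    defer { CVPixelBufferUnlockBaseAddress(pb, []) }

    // Wrap the buffer's memory in a CGContext and draw the (resized) image into it.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pb),
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pb),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                      | CGBitmapInfo.byteOrder32Little.rawValue),
          let cgImage = image.cgImage else { return nil }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    return pb
}
```

The resulting buffer can then be passed to the Core ML model's prediction call for mask classification.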

Accomplishments that we're proud of

We are proud of our ability to interface multiple different platforms into one coherent ecosystem: our iOS app has our web floorplan embedded in it, and both apps are connected through Firebase. We are also very proud of training our first image classification model and fully utilizing it in an app.

What we learned

We learned how to use the Google Maps platform in a web app via its JavaScript API. We improved our ability to efficiently manage our GitHub repository and development branches. We learned how to use Create ML to create Core ML models, and how to use AVFoundation to run capture sessions and detect QR codes.

What's next for EnterFace

Our last step in completing our IoT ecosystem is to interface our iOS app with smart locks on each door, so that a door automatically unlocks after the user shows that they are wearing a mask. Unfortunately, we did not have access to any smart locks during this hackathon, so we were not able to develop this feature this time :(. In the future, we could incorporate many more subsystems into our EnterFace (interface), like smart speakers, sensors, and other smart technologies.
