Inspiration
Virtual instruction for K-12 students quickly became the standard format for education as a result of the COVID-19 pandemic last year. While many students, teachers, and families struggled to transition to this new way of learning, students with attentional disorders such as ADHD and students with learning disabilities have been negatively affected to an even greater degree, because teachers cannot easily see when they are off task or distracted. At the same time, parents have been expected to become classroom monitors while also working from home. The result is students with disabilities missing out on instruction, putting them at risk of falling even farther behind their peers.

Eyes Up Here is an app designed to prompt students when they are not looking at the screen during virtual instruction, providing the guidance and reminders that classroom staff would typically give in a traditional classroom. The idea came from one of our team members whose son, who has ADHD, has had difficulty staying focused and maintaining attention in the virtual classroom. The parent was checking in as often as possible, but with meetings and other work commitments the student was still missing important information, and both were frustrated: they were doing the best they could, and it wasn't enough.
What it does
Eyes Up Here uses facial recognition to detect when a student has not been looking at the computer screen for an extended period of time. When triggered, an alert sound plays to draw the student's attention back to the screen. The sound stops as soon as facial recognition again detects the student's face within an acceptable range of head-pose parameters.
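The checks described above can be sketched as two small functions. This is a minimal sketch: the write-up does not give the exact angle or time thresholds the app uses, so the default values below are assumptions.

```javascript
// The Azure Face API reports headPose angles in degrees (pitch, roll, yaw).
// Treat the student as "looking at the screen" when yaw and pitch fall
// within an acceptable range; the limits here are assumed defaults.
function isFacingScreen(headPose, maxYaw = 30, maxPitch = 20) {
  return Math.abs(headPose.yaw) <= maxYaw && Math.abs(headPose.pitch) <= maxPitch;
}

// Sound the alert once the student has not faced the screen for longer
// than thresholdMs; stop it as soon as a facing frame is seen again.
function shouldAlert(lastFacingMs, nowMs, thresholdMs = 5000) {
  return nowMs - lastFacingMs > thresholdMs;
}

module.exports = { isFacingScreen, shouldAlert };
```

On each detection pass, the app would update `lastFacingMs` whenever `isFacingScreen` is true, and start or stop the alert sound based on `shouldAlert`.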
How we built it
We built the app in React and used the Azure Face API for the facial recognition AI.
Challenges we ran into
Initially we planned to create Eyes Up Here as a Chrome extension, but learned that we could not access the video camera that way, so we decided to build it as a standalone app instead. In the process of developing the app, we had difficulty working out how to send frames from the video feed to the Azure Face API. We also realized that, while we had a great idea, we did not have enough time to complete every component we had originally intended, which forced us to decide what to keep in the current prototype and what to defer to future development.
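One way past the video-feed hurdle described above is to draw the current frame of the live `<video>` element onto an offscreen canvas and export it as a JPEG blob that can be posted to the API. Treat this as a browser-only sketch under that assumption, not necessarily the app's final approach.

```javascript
// Copy the current frame of a live <video> element onto a canvas and
// export it as a JPEG Blob suitable for posting to the Face API.
function captureFrame(videoEl) {
  const canvas = document.createElement("canvas");
  canvas.width = videoEl.videoWidth;
  canvas.height = videoEl.videoHeight;
  canvas.getContext("2d").drawImage(videoEl, 0, 0);
  return new Promise((resolve) => canvas.toBlob(resolve, "image/jpeg"));
}

module.exports = { captureFrame };
```

Calling this on a timer (for example once a second) yields a steady stream of frames to feed the detection step.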
Accomplishments that we're proud of
We are all proud of completing our first hackathon. We picked up new technical skills we likely would not have learned otherwise, including how to use facial recognition data to trigger a function and how to use that function to play an audio file. We persevered through frustrating challenges to build an initial prototype of an app that could help many students with disabilities during virtual instruction.
What's next for Eyes Up Here
There were several ideas we could not include in the current version of Eyes Up Here due to time constraints that we would like to see added going forward. First, we would like to expand the app so that parents, caretakers, and teachers can be notified when a student is repeatedly inattentive or off task for an extended period of time. We also think a system in which a teacher or parent could monitor multiple students through Eyes Up Here would be valuable. Another idea is to let parents and caretakers open the video feed when they receive a notification, so they can see what the student is doing without interrupting the student unnecessarily. Finally, we would like to give the adults supporting a student an easy way to communicate with each other about a notification, for example letting one parent respond to it so that the other parent knows the student's inattention has already been addressed.