We were inspired by the hundreds of millions of visually impaired people who face enormous challenges using elevators and navigating public spaces. We wanted to minimize the risk of contracting COVID-19 among this group using computer vision and Arduino.

What it does

The UV fixture detects whether there is a person inside the elevator (via a motion sensor); if there is not, it slides out and disinfects the button panel for the safety of the next rider.

How I built it

We used 2 cm wide aluminium construction corners for the frame. We cut four pieces of each of three lengths (15, 10, and 5 cm), then drilled holes in each. Using those holes, nuts, and screws, we bolted the pieces together, forming a parallelepiped frame. The sliding shelf to which the light emitters are attached is made from two layers of plywood. Once the sliding element and frame were done, the sensors and other components were soldered together and connected to the Arduino and a power relay. Everything was powered by a common 5 V power bank.

Using a motion sensor, we wrote a function that records changes in infrared emission within the radius of a common elevator cabin. If it does not record, or "see", significant change and movement of infrared light within that radius, the Arduino sends a signal to the servomotor, which moves the sliding element out; the Arduino then sends a HIGH signal to the power relay, which activates the NO (normally open) channel, so the light emitters turn on. If movement is recorded, the opposite happens and the device turns off.

We used C and the Arduino IDE to write the code for the project.

Challenges I ran into

Computer vision was tough to implement within the duration of the Hack3 event, so we used a motion sensor instead, which was also cheaper for a prototype. However, in the future we would like to use cameras to reliably verify that the elevator is empty.

Accomplishments that I'm proud of

We fully developed the prototype, it works, and working as a team was fun. We are proud of making a product that will make life easier for visually impaired people by helping them navigate.

What I learned

We learned to think more deeply about the people who are part of our society, about inclusion, and about how to achieve it to a greater extent.

What's next for R-UV

Implementing cameras, making the product thinner, and collaborating with NGOs (AFB, ACB, Lighthouse International).



Update

R-UV evolved as we were eager to find out how to help our local community, so we aligned our work with current research at Nazarbayev University and targeted visually impaired people for proper social impact, as they are particularly vulnerable to the spread of disease. Our thinking led us to the conclusion that our society is not inclusive enough for such people, which can have negative consequences, especially for our population. Then, by looking at current solutions and applying our knowledge of robotics, we learned that we could help by modifying existing approaches to the problem. That's how R-UV emerged as a project for this hackathon.
