Our computer's interaction with the RPi (not including the microthreading system, which is still volatile)
Inspiration
We were inspired to build an accessibility system for the blind by something we noticed across every piece of tech out there. There are accessibility systems in place for the hearing impaired, the deaf, and even people who cannot move... but there is nothing that allows those without the precious gift that is sight to utilize the extremely valuable resource that is the internet and computers.
What it does
iSight uses computer vision via Tesseract, which takes any image file and extracts the text from it. By replacing this text with string arrays mapped to the motors in the ReadBox, we can take a seemingly simple picture that was until now inaccessible to the visually impaired and turn the text within it into a format that brings it within their reach: braille. This allows people who cannot see, including students, to educate themselves with a plethora of new resources literally right at their fingertips.
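The text-to-motor mapping described above can be sketched roughly as follows. The dot numbering, the handful of letters shown, and the 6-element 0/1 "motor array" format are assumptions for illustration only; the real ReadBox may encode its cells differently.

```python
# Standard 6-dot braille patterns for a few letters: each tuple lists
# the raised dots (numbered 1-6) in a single braille cell.
BRAILLE_DOTS = {
    "a": (1,),
    "b": (1, 2),
    "c": (1, 4),
    "d": (1, 4, 5),
    "e": (1, 5),
    " ": (),        # blank cell
}

def char_to_motor_array(ch):
    """Return a 6-element list of 0/1 flags, one per motor in a cell."""
    dots = BRAILLE_DOTS.get(ch.lower(), ())
    return [1 if d in dots else 0 for d in range(1, 7)]

def text_to_motor_arrays(text):
    """Convert OCR'd text into the per-cell arrays the motors consume."""
    return [char_to_motor_array(ch) for ch in text]

print(text_to_motor_arrays("bad"))
# → [[1, 1, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0], [1, 0, 0, 1, 1, 0]]
```

Each inner list tells the downstream hardware which pins of a cell to raise; unknown characters fall back to a blank cell.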
How we built it
We used Tesseract as our CV platform, a terminal interface on the computer, a Raspberry Pi as a signal hopper via SSH, and Arduinos to process the data sent to them and convert it into stepper-motor movement. The housing for the BlindBox was 3D printed, along with the rack-and-pinion systems within it; together these produce a very simple yet effective transduction from image text to braille. Additionally, we have a volatile version that uses IBM Watson as a voice-recognition utility, letting each user control iSight's output to their own preferences. That version also creates the potential for a system that reads text not only from previously saved image files but from live images of the computer screen, allowing users to navigate the computer and, potentially, the internet.
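The software leg of that chain (image → OCR → bytes over the Arduino link) might look like the sketch below. The `pytesseract` and `pyserial` libraries stand in for our actual tooling, and the port name and one-byte-per-cell framing are hypothetical; only the framing helpers are meant to be taken literally.

```python
def pack_cell(flags):
    """Pack a 6-element 0/1 dot array into one byte (dot 1 = bit 0)."""
    byte = 0
    for i, f in enumerate(flags):
        byte |= (f & 1) << i
    return byte

def frame_text(flag_arrays):
    """Serialize a list of cell arrays into the bytes sent to the Arduino."""
    return bytes(pack_cell(a) for a in flag_arrays)

def run_pipeline(image_path, port="/dev/ttyACM0"):
    """End-to-end leg: OCR an image and ship framed bytes to the Arduino.
    Not exercised here; requires pytesseract, Pillow, and pyserial,
    and the port name is an assumption."""
    import pytesseract
    import serial
    from PIL import Image

    text = pytesseract.image_to_string(Image.open(image_path))
    arrays = ...  # per-cell dot arrays derived from `text` (not shown)
    serial.Serial(port, 9600).write(frame_text(arrays))
```

Packing each cell into a single byte keeps the serial protocol trivial for the Arduino to decode: one read per braille cell, bits masked out per pin.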
Challenges we ran into
Some of the challenges we ran into were the vast number of hardware issues caused by faulty motors and Arduino pins, and more than a few wire-management headaches because we needed more wires than an Arduino Uno has pins (we ended up using two).
Accomplishments that we're proud of
One of the things we are very proud of is our back end, along with the fairly effective hardware system we designed despite having no prior experience (AT ALL) working with hardware. Our back end works quite well: it utilizes Tesseract (slightly modified to suit the purposes of the product) and the logic circuits we worked through, and it consistently, accurately, and efficiently transmits data between the computer, Raspberry Pi, Arduinos, and stepper motors. The hardware system, despite its lack of wire management, is also something we are very proud of given our inexperience and all the malfunctioning materials we came across; we still pulled through and created a working product.
What we learned
We learned that hardware fails... a lot... and when we say a lot, we mean a lot. We also learned how to interface different systems with one another, and how to micro-thread processes in a way that maintains the efficiency of the hardware output without straining any system in the chain too much. One of the most important lessons we learned, however, was that when it comes down to it, even when you are about to pass out because it is 5:30 a.m., your code won't compile, your motors won't turn, and your body has more energy drink in it than water, if you work hard and keep at it, you can do anything.
What's next for iSight
What's next for iSight? We plan to develop this product further into a better-polished, open-source system that will enable blind people across the world to access a technology they had previously been excluded from, as our way of giving back to those who need us most: our community, and the world.