TABLE 15: Due to the nature of our code, we are unable to submit it through GitHub, so we would like to request that the judges come to our table.

Inspiration

As a team of two Mechanical Engineers, a Biomedical/Electrical Engineer, and a Computer Scientist, we came to the hackathon driven to explore an interdisciplinary problem. After learning about the OpenBCI platform from the sponsor representative, we were excited by the range of applications a device capable of running a complete EEG exam could enable. This led us to the idea of an interface for mentally controlled limbs, creating a more natural user experience and increasing mobility for all of society.

What it does

When a spike in brain activity is detected, the robotic arm activates and rotates. The user can trigger motion in a prosthetic arm solely through brain waves. Currently, the software tracks spikes in EMG activity caused by muscular movement of the head, but it could be extended to detect changes in EEG waves.

How we built it

To build the arm we used two plastic "lightsabers" provided as a sponsorship freebie at the event. Each lightsaber was made of four separate pieces: three shaped as frusta and one as a cone. We used the frustum with the second-largest diameter as the main arm. We attached the motor to one end of this arm so that its gears sit through a keyhole cut into the side of the main arm. The second arm consists of a frustum and the cone, and connects to the motor on the main arm via a servo horn. To increase the arm's structural integrity, we reinforced it with other pieces of the lightsaber and placed it in an outer casing.

We used the OpenBCI headgear and took its measurements of muscle activity as the control signal for the arm. An Arduino DUE drives the servo motor on the arm: when the control signal is detected, the Arduino runs the motor, sweeping the arm between 20 and 110 degrees (a minimal sketch of this Arduino-side logic is shown below).
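The Arduino side is small enough to summarize in a short sketch. The one below is illustrative rather than our exact code: the pin number, baud rate, and hold time are placeholders, and only the 20 and 110 degree endpoints come from the description above. A byte arriving over the serial connection from the host (the glue for which is described further down) acts as the control signal.

```cpp
// Minimal Arduino sketch of the kind we ran on the DUE (placeholders noted above).
#include <Servo.h>

Servo elbow;                    // the servo joining the two arm segments
const int SERVO_PIN    = 9;     // pin the servo signal wire is on (assumed)
const int REST_ANGLE   = 20;    // resting position, degrees
const int RAISED_ANGLE = 110;   // raised position, degrees

void setup() {
  Serial.begin(9600);           // must match the host-side baud rate
  elbow.attach(SERVO_PIN);
  elbow.write(REST_ANGLE);
}

void loop() {
  // Any byte from the host is treated as the "spike detected" control signal.
  if (Serial.available() > 0) {
    Serial.read();              // consume the trigger byte
    elbow.write(RAISED_ANGLE);  // raise the arm
    delay(1000);                // hold briefly (placeholder duration)
    elbow.write(REST_ANGLE);    // return to rest
  }
}
```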

Connecting the OpenBCI to the robotic arm was a lengthy process. Just setting up the GUI in the Processing environment took a very long time on Ubuntu. We faced several obstacles, all of which we overcame except for a NullPointerException that we simply couldn't solve. Eventually we gave up and moved to Windows, only to hit the same NullPointerException there as well. After several more hours of struggling, we downloaded an older version of the GUI (which also didn't work on its own) and finally got it running by combining it with the libraries from the newest version.

Once the OpenBCI GUI was set up, we had to figure out how to make it drive the robotic arm. We noticed that head movements caused the FFT plot for node 7 to jump in EEG amplitude. We scanned through roughly 6000 lines of GUI code to find where the FFT plot was drawn, then added code to check whether the plot jumped above a certain threshold; if it did, we wrote data to a file. A bash script watched for changes to that file, and whenever it changed, the script called a Python program that established a serial connection to the Arduino. The Arduino program, written in C, moves the robotic arm's motor whenever the Python program sends a bit over the serial connection (a condensed sketch of this host-side glue follows).
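To make the file-watch-and-serial-write glue concrete, here is a condensed, Linux-only stand-in written as a single C++ program. Our actual pipeline split this logic between a bash script (watching the file) and a Python program (opening the serial connection); the trigger file name and serial device path below are assumptions rather than our exact values.

```cpp
// Condensed stand-in for the bash + Python glue described above (assumed paths).
#include <fcntl.h>
#include <sys/stat.h>
#include <termios.h>
#include <unistd.h>

#include <cstdio>
#include <ctime>

int main() {
    const char* trigger_path = "spike_trigger.txt";  // file the patched GUI writes to (assumed name)
    const char* serial_path  = "/dev/ttyACM0";       // Arduino DUE programming port (assumed)

    // Open the serial port and configure 9600 baud, 8 data bits (must match the sketch).
    int fd = open(serial_path, O_WRONLY | O_NOCTTY);
    if (fd < 0) {
        std::perror("open serial");
        return 1;
    }
    termios tio{};
    tcgetattr(fd, &tio);
    cfsetospeed(&tio, B9600);
    cfsetispeed(&tio, B9600);
    tio.c_cflag = (tio.c_cflag & ~CSIZE) | CS8;
    tio.c_cflag |= (CLOCAL | CREAD);
    tcsetattr(fd, TCSANOW, &tio);

    // Poll the trigger file; a new modification time means the GUI saw the FFT
    // amplitude cross the threshold, so send one byte to make the arm move.
    time_t last_mtime = 0;
    while (true) {
        struct stat st;
        if (stat(trigger_path, &st) == 0 && st.st_mtime != last_mtime) {
            if (last_mtime != 0) {  // skip the file's initial state at startup
                const char go = '1';
                if (write(fd, &go, 1) == 1) {
                    std::printf("spike detected, trigger byte sent\n");
                }
            }
            last_mtime = st.st_mtime;
        }
        usleep(200 * 1000);  // check roughly five times per second
    }
}
```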

Challenges we ran into

The primary challenge we faced was working with the materials we were given and making the most of limited supplies while maximizing the use of the OpenBCI headgear. The motor's output torque produced a large angular impulse that the flimsy plastic arm could not withstand on its own, so we added rocks to the base to increase the mass of the system. When we ran out of lithium-ion batteries during testing, we switched to a battery pack, fastening it to the Ultracortex headset with zip ties.

Accomplishments that we're proud of

Despite the lack of access to hardware equipment, we were able to capitalize on our collective ingenuity to find innovative solutions. Of these, the one we are proudest of is creating our model prosthetic by deconstructing the lightsabers provided by one of the sponsors. To hold the motor in place inside the main arm, we used a zip tie to constrain the arm's diameter (the material is a flexible plastic), preventing the motor from slipping.

What we learned

We learned a ton about OpenBCI. We also learned a great deal about collaboration across disciplines: because we all came from different majors, we didn't step on each other's toes and managed to work well together.

What's next for NARVes

Although we are extremely proud of how far we have come, NARVes needs a more nuanced mental trigger, as opposed to a motion-based one, to become a real interface for prosthetic limbs. Further enhancements would require in-depth conversations with people who face mobility issues and use prosthetic limbs, to understand the true user needs.

Built With

arduino, bash, c, openbci, processing, python