Inspiration

Two months ago, my dad's former boss - Mr. Mittal - came to Bangkok, Thailand for medical treatment. I soon found out that he suffered from motor neurone disease - specifically, a mild case of ALS. I was struck by his condition: just 20 years ago he ran large textile factories in Nairobi, Kenya alongside Indian businessmen. One morning, he and his wife came over to our house for breakfast. I witnessed how much assistance he required from his wife to perform basic tasks such as getting up, eating, drinking water, and even using the restroom. A few hours later, his wife went to the market with my mother, leaving me alone with Mr. Mittal. Though he was able to partially move his hands and speak, most of his words came out as gibberish. He was asking for something, but I couldn't understand him. I couldn't ask his wife either, because over the phone she would have had to guess and check what he wanted just as I was. After a few minutes of guessing, I finally brought him what he desired - a mere glass of water. Despite his medical condition, I was the one who felt disabled. I felt incompetent and powerless. At that moment, I realized that this was a problem worth tackling: bridging the gap between ALS patients and their caretakers is a paramount issue. I soon contacted my tech buddy Mantej - who is extremely interested in biology and robotics - and we decided to fix this problem!

What it does

The device is really quite simple. An ALS patient rests the Arduino board on their hand, and a photoresistor acts as the trigger: whenever the patient covers it, the accelerometer is activated. The patient then moves their hand in a specified direction depending on what they need - for example, moving the hand a little to the right translates to wanting water. Once triggered, the accelerometer records the (x, y, z) coordinates, waits an additional 6 seconds, and remeasures them. It then computes the delta between the respective coordinates (x2 - x1, y2 - y1, z2 - z1). All of this data is sent to Python scripts that analyze the motion and conditionally match the delta values to pre-defined gestures. From there, the matched gesture is published to a web page, so that both a caretaker on site and one in a remote location (e.g., Mr. Mittal's wife at the market) know exactly what the patient needs.
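To make the matching concrete, here is a minimal sketch of the delta-to-gesture logic. The threshold value and the exact direction-to-need mapping are illustrative assumptions, not the values tuned in our device:

```python
THRESHOLD = 50  # assumed minimum delta that counts as a deliberate movement

def classify_gesture(dx, dy, dz):
    """Map the deltas (x2-x1, y2-y1, z2-z1) to a pre-defined need."""
    if dx > THRESHOLD:        # hand moved a little to the right -> water
        return "water"
    if dx < -THRESHOLD:       # hand moved to the left -> food (assumed mapping)
        return "food"
    if dy > THRESHOLD:        # hand moved forward -> restroom (assumed mapping)
        return "restroom"
    return "unknown"          # movement too small to match any gesture

print(classify_gesture(72, 3, -5))  # -> water
```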

How we built it

The device consists of two parts - hardware and software. For the hardware, we wired up an Arduino 101 with a basic breadboard holding a photoresistor, a 10k ohm resistor, and an ADXL345 accelerometer. Whenever the patient covered the photoresistor, its voltage would change and the reading would be sent to the serial monitor. Whenever the reading dropped below a threshold (850 in the case of our photoresistor), the accelerometer was triggered. On the software side, the Arduino ran a simple .ino file that recorded the (x, y, z) coordinates, waited 6 seconds, remeasured them, and sent everything to the serial monitor. From there, a Python script using the serial library connected to the serial port at a baud rate of 9600, saved the six data points (two sets of (x, y, z) coordinates) in an array, and wrote them to a .txt file (called 'output.txt' in this case). It would then automatically run another Python script that analyzed the delta change in the coordinates (see above) and, based on the changes, matched the closest gesture/desire. This gesture/desire was printed to another .txt file (called 'input.txt' in this case). From there, PHP code transferred the .txt data to a static web page, which we styled a little with HTML5. The patient's desire was then printed to this web page and would be available for a person to view from anywhere (ours is limited to localhost for the demo).
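For reference, here is a minimal sketch of the serial-bridge script described above. The port name, the one-value-per-line serial format, and the analysis script's file name are assumptions for illustration; 'output.txt' and the 9600 baud rate match the write-up:

```python
import serial       # pyserial
import subprocess

ser = serial.Serial('/dev/ttyACM0', 9600)   # port name varies by machine/OS

readings = []
while len(readings) < 6:                    # two sets of (x, y, z) coordinates
    line = ser.readline().decode('ascii', errors='ignore').strip()
    if line:
        readings.append(float(line))        # assumes one value per serial line

with open('output.txt', 'w') as f:          # hand the raw coordinates off
    f.write(' '.join(str(v) for v in readings))

subprocess.run(['python', 'analyze_gestures.py'])  # hypothetical script name
```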

Challenges we ran into

Some of the challenges we ran into included patient research, gesture-movement accuracy, and connecting scripts. Motor neurone disease is rare and takes multiple forms (ALS being one of them). Each patient can experience varying degrees of the disease and thus retains a different amount of hand mobility, so we had to settle on a standard amount of hand-muscle movement to design around. We also had to work out how to translate simple hand movements into the desires/needs of the patients. We boiled them down to the bare basics to keep it simple: food, water, and restroom. Different movements map to one of these categories (we certainly plan on adding more when we receive more data). Perhaps the hardest challenge we ran into was connecting the Arduino, Python, and PHP scripts into a single project; we had to figure out how to use libraries efficiently to create one unified product, as sketched below.
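As an illustration of that glue, a pipeline like ours can be chained from one top-level Python script. The script names here are hypothetical, and using PHP's built-in development server for the localhost demo is an assumption rather than a description of our exact setup:

```python
import subprocess

subprocess.run(['python', 'read_serial.py'])       # Arduino serial -> output.txt
subprocess.run(['python', 'analyze_gestures.py'])  # output.txt -> gesture -> input.txt

# Serve the PHP page that reads input.txt (localhost demo only).
subprocess.run(['php', '-S', 'localhost:8000'])
```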

Accomplishments that we're proud of

We are extremely proud to have created such an inexpensive method of communication for these patients! The fact that we made even a small group of people's lives easier makes us overjoyed. Though our technology is not complex, it certainly proves how each one of us can help those in need.

What we learned

We learned how to transfer data from a microcontroller board like the Arduino to the web, giving people around the world access to it. We also learned how to convert gestures into meaningful and accurate data. Most importantly, we learned how easy it can be to make a difference in the world!

What's next for Kinetica

If we win a cash prize, we plan on using it to make the device more compact, perform more market research, and apply machine learning to analyze gestures.

Built With

Arduino 101, ADXL345, Python, PHP, HTML5