Inspiration
In the US, 1 in 4 people will have a stroke in their lifetime. Working in healthcare, we have seen the devastating effects of strokes on our patients - lifelong, irreversible disability and death. The average time to critical, life-saving care in the event of a stroke is 3.8 hours, but treatment is most effective when started within 3 hours. Delayed treatment leads to long-term disability, decreased functionality, and worsened quality of life for the patient and their families. These effects are further felt financially, as costs of long-term care increase exponentially with permanent disability. This increased burden weighs on individuals and the US healthcare system as a whole.
We all know individuals who received care too late and were left to cope with their new normal. These consequences affect everyone - families, friends, caretakers, even employers. We aim to shorten that window and help people recognize strokes faster.
What it does
Our solution is an app that helps users recognize the symptoms of a stroke and shorten the time to critical care. It helps the user determine whether something is wrong and gives them clear next steps, streamlining the handoff to the next level of care and getting them to a hospital faster.
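As a rough illustration only, the guided check could work something like the sketch below. The question set, function name, and recommendations are assumptions for this sketch, not our final implementation; any real version would need clinical review.

```python
# Illustrative FAST-style (Face, Arm, Speech, Time) guided check.
# The symptom keys and messages are assumptions for this sketch.

def guided_stroke_check(answers: dict) -> str:
    """Return a next-step recommendation from yes/no symptom answers."""
    red_flags = ("face_droop", "arm_weakness", "speech_difficulty")
    if any(answers.get(flag) for flag in red_flags):
        # Any single FAST sign warrants emergency care immediately.
        return "Call 911 now and note the time symptoms started."
    if answers.get("other_sudden_symptoms"):
        return "Contact a medical professional right away."
    return "No red flags detected. Re-check if symptoms appear."

# Example: a user reports sudden facial droop.
print(guided_stroke_check({"face_droop": True, "arm_weakness": False}))
```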
How we built it
With rapid technological advancement, more and more people have access to smartphones with facial recognition. We see industry trends moving toward using smartphone technology in healthcare through Apple's Health app, Fitbits, smartwatches, etc. The proportion of elderly individuals using smartphones is also increasing, allowing us to build on technology that folks already possess.
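One plausible way to use on-device face tracking to flag facial droop is to compare mouth-corner positions from a face-landmark model. The sketch below assumes MediaPipe Face Mesh; the specific landmark indices and any asymmetry threshold are assumptions and would need clinical validation before use.

```python
import cv2
import mediapipe as mp

def mouth_corner_asymmetry(image_path):
    """Return the vertical offset between mouth corners, normalized by
    inter-eye distance, or None if no face is detected."""
    image = cv2.imread(image_path)
    rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
    with mp.solutions.face_mesh.FaceMesh(static_image_mode=True) as face_mesh:
        results = face_mesh.process(rgb)
    if not results.multi_face_landmarks:
        return None
    lm = results.multi_face_landmarks[0].landmark
    # Landmark indices are assumptions: 61/291 approximate the mouth
    # corners and 33/263 the outer eye corners in the Face Mesh model.
    left_mouth, right_mouth = lm[61], lm[291]
    left_eye, right_eye = lm[33], lm[263]
    eye_span = abs(right_eye.x - left_eye.x) or 1e-6
    return abs(left_mouth.y - right_mouth.y) / eye_span

# A value well above a user's calibrated baseline could prompt further checks.
```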
Challenges we ran into
In a survey of family and friends, most indicated that they would be hesitant to pay out of pocket for such an application. This posed the challenge of how to finance such an endeavor. We realized there are many stakeholders in reducing morbidity from stroke complications. Improved outcomes and preserved long-term functionality reduce the costs of nursing care and of treating downstream complications such as aspiration pneumonia from dysphagia, falls due to weakness, etc.
Additionally, while anyone can experience a stroke, there are well-described risk factors. We struggled with how to appropriately incorporate this knowledge into our application. We decided to create a "demographics" section to better understand each user and how to help them.
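A minimal sketch of what that demographics record might capture is below. The fields and the age cutoff are assumptions for illustration, not validated clinical risk criteria.

```python
from dataclasses import dataclass

# Illustrative demographics record; fields are assumptions for this sketch.
@dataclass
class Demographics:
    age: int
    hypertension: bool = False
    atrial_fibrillation: bool = False
    diabetes: bool = False
    smoker: bool = False
    prior_stroke_or_tia: bool = False

def baseline_risk_flags(d: Demographics) -> list:
    """Collect self-reported risk factors to contextualize alerts."""
    flags = []
    if d.age >= 65:  # illustrative cutoff, not a clinical threshold
        flags.append("age >= 65")
    for name in ("hypertension", "atrial_fibrillation", "diabetes",
                 "smoker", "prior_stroke_or_tia"):
        if getattr(d, name):
            flags.append(name)
    return flags
```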
What's next for Scan&Sound
We hope to expand this technology to recognize other conditions with varying levels of urgency. For example, recognizing jaundice or scleral icterus may trigger an alert to make an appointment with a primary care physician.
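To suggest how varying urgency levels could drive different prompts, here is a hypothetical mapping of findings to next steps; the tier names and wording are assumptions sketched from the ideas above.

```python
# Illustrative finding -> urgency mapping; entries are assumptions.
URGENCY_TIERS = {
    "suspected_stroke": "Emergency: call 911 now",
    "scleral_icterus": "Routine: schedule a primary care appointment",
    "jaundice": "Routine: schedule a primary care appointment",
}

def next_step(finding: str) -> str:
    """Map a detected finding to a recommended level of care."""
    return URGENCY_TIERS.get(finding, "Monitor and re-check if symptoms change")
```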