Inspiration
In 2004, I was born. Five years later, my sister came into the world — and with her, a challenge our family never expected. She was deaf. Growing up, I didn’t immediately understand the weight of it, but as I got older, I realized communication for her wasn’t natural — it was a constant barrier.
She understood the world through American Sign Language (ASL), while most of the world spoke in ways she couldn’t hear. Simple moments — calling her name, sharing a joke, helping her learn — required effort that others often take for granted.
This wasn’t just her struggle; it became deeply personal to me. I didn’t want accessibility to be a privilege. I wanted it to be a standard.
That is why we built ELYRA — to ensure no one feels disconnected from the world simply because technology wasn’t designed for them. Working on this project carries real emotion for me.
What it does
ELYRA is an AI-powered accessibility platform designed to bridge communication and learning gaps for individuals with sensory and cognitive challenges.
Our smart haptic device converts digital content into tactile feedback such as Braille, enabling visually impaired users to read independently. For the deaf community, the platform translates audio into understandable formats, while assistive tracing features support people with dyslexia and ADHD.
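To make the Braille conversion concrete, here is a minimal sketch of how text could be mapped to 6-dot braille cell patterns. The dot numbering and letter table follow standard braille; the function name and representation are illustrative assumptions, not our actual firmware.

```python
# Hypothetical sketch: mapping characters to 6-dot braille cells.
# Standard braille dot numbering: dots 1-3 are the left column
# (top to bottom), dots 4-6 the right column.
BRAILLE_DOTS = {
    "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
    "f": (1, 2, 4), "g": (1, 2, 4, 5), "h": (1, 2, 5),
    "i": (2, 4), "j": (2, 4, 5),
}

def char_to_cell(ch):
    """Return a 6-tuple of 0/1 states (dots 1..6) for one braille cell.

    Unknown characters map to a blank cell in this sketch.
    """
    dots = BRAILLE_DOTS.get(ch.lower(), ())
    return tuple(1 if d in dots else 0 for d in range(1, 7))
```

Each cell state can then be pushed to the actuators that raise or lower the physical pins.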
ELYRA transforms technology into an inclusive experience — making interaction, education, and communication accessible to all.
How we built it
We built ELYRA by combining artificial intelligence with embedded hardware to create a real-world accessibility solution.
At the core of our system is a continuously improving LLM trained with curated datasets to better interpret language and contextual cues. This powers our ASL intelligence layer, helping deaf users understand words, conversations, and environmental context more intuitively.
For visually impaired users, we engineered a smart haptic device capable of translating digital information into tactile output. The device integrates seven servo motors, an ESP32 microcontroller, motor drivers, three touch sensors for interaction, and an onboard speaker for multimodal feedback.
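As an illustration of how a dot pattern could drive the servos, here is a small sketch converting a 6-dot cell state into servo target angles. The specific angles (90° raised, 0° lowered) and the assignment of six servos to dots are assumptions for the example; the actual calibration of our device may differ.

```python
# Hypothetical sketch: turning a 6-dot braille cell state into
# target angles for six pin servos. Raised dot -> DOT_UP_ANGLE,
# lowered dot -> DOT_DOWN_ANGLE (assumed values, not calibrated).
DOT_UP_ANGLE = 90
DOT_DOWN_ANGLE = 0

def cell_to_angles(cell):
    """Map six 0/1 dot states to six servo angles in degrees."""
    return [DOT_UP_ANGLE if state else DOT_DOWN_ANGLE for state in cell]
```

On the real device, these angles would be written to the servo driver by the ESP32 firmware; the seventh servo and the touch sensors handle interaction beyond the cell itself.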
On the software side, we developed a robust pipeline that manages interruptions across the AI layer, backend services, and embedded hardware — ensuring low latency and reliable communication between cloud intelligence and physical response.
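One way to picture the interruption handling described above is a "latest wins" channel between the AI layer and the device: a newer message replaces an unconsumed stale one, so the haptic output never lags behind the conversation. This is a simplified sketch of that idea, not our production pipeline.

```python
# Hypothetical sketch: a single-slot channel where a newer message
# preempts a stale, unconsumed one ("latest wins"). This models how
# an interruption-aware pipeline can keep haptic output current.
import queue

class LatestOnlyChannel:
    """Single-slot channel: publishing replaces any unconsumed message."""

    def __init__(self):
        self._q = queue.Queue(maxsize=1)

    def publish(self, msg):
        try:
            self._q.put_nowait(msg)
        except queue.Full:
            self._q.get_nowait()   # drop the stale message
            self._q.put_nowait(msg)

    def consume(self, timeout=None):
        """Block until a message is available, then return it."""
        return self._q.get(timeout=timeout)
```

Bounding the queue to one slot trades completeness for freshness, which matters when accessibility depends on immediacy.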
By merging generative AI, real-time processing, and embedded computing, we created a system designed not just to assist — but to empower independent interaction with the world.
Challenges we ran into
Building ELYRA meant solving problems across both software intelligence and physical hardware — and making them work together seamlessly.
One of our biggest challenges was training and refining our LLM to accurately interpret language and contextual cues for ASL assistance while ensuring the responses remained fast enough for real-time use. Balancing model capability with latency was critical, especially when accessibility depends on immediacy.
On the hardware side, developing a reliable haptic device required precise coordination between seven servo motors, touch sensors, motor drivers, and the ESP32 microcontroller. Even minor timing issues could disrupt tactile feedback, so we had to carefully optimize signal flow and responsiveness.
Another major hurdle was handling communication between the AI layer, backend infrastructure, and embedded system. Preventing interruptions, reducing delays, and ensuring consistent data transfer demanded thoughtful system design.
Beyond the technical challenges, we carried an emotional responsibility — building for people like my sister meant the solution had to be dependable, intuitive, and truly helpful. That expectation pushed us to design with empathy while engineering with precision.
Accomplishments that we're proud of
Transforming a deeply personal experience into a purposeful technology that addresses real accessibility gaps.
Successfully building a working prototype that combines AI intelligence with a functional haptic device — proving that inclusive innovation is achievable.
Engineering a system that bridges cloud-based AI with embedded hardware, enabling real-time tactile interaction.
Designing for multiple accessibility needs within a single platform without compromising usability.
Moving beyond the idea stage to create something practical, scalable, and capable of making independent interaction a reality.
Most importantly, we’re proud of building technology that doesn’t just assist people — it empowers them to engage with the world more confidently and without barriers.
The main aspect we fixed: cost. We brought the price of braille pads down from around $2,000 to just $8, a dramatic reduction achieved without sacrificing capability. Beyond cost, our device and ASL models address the needs of people who are deaf, blind, or living with ADHD or dyslexia. In our view, 90% of the accessibility problem is financial: people with hyperactivity often have strong intelligence but limited physical means, so by fixing affordability we can bring our software and hardware to market. The touch pads also give each user direct control over reading speed.
What we learned
This journey began with something deeply personal — learning American Sign Language to better understand my sister and communicate in the way she experiences the world. Through that process, I realized that accessibility is not just about technology; it is about understanding people, their challenges, and the silent barriers they face every day.
Working on ELYRA taught us how empathy can drive innovation. What started as an emotional problem evolved into a practical, reliable solution capable of improving independence and communication.
Technically, we gained hands-on experience across multiple domains — from training and refining AI models to managing backend systems and building embedded hardware. We learned how to bridge the gap between intelligent software and responsive physical devices, while designing for real-world usability.
Most importantly, we learned that meaningful technology is built when compassion meets engineering.
Today, our haptic hardware is already patented, and we are actively progressing toward securing intellectual property for our LLM-driven system — reinforcing our commitment to building original, impactful accessibility technology for the future.
What's next for ELYRA
This is just the beginning. Our next goal is to extend ELYRA across a series of platforms, including an NLP-based model that understands incoming data and sends it back to the haptic device, making the experience work well on every platform out there. :)