Inspiration
I think the world forgets there are 500,000 or more people who cannot speak or move around. As an IT Support Specialist, I became an expert at setting up their complicated, overbearing, unfriendly equipment, and I thought this hackathon was the perfect place to showcase a solution.
What it does
ASL Command lets you use hand signals to control anything and communicate with AI and the world. We use Next.js, Vapi, Claude, and Gemini to provide this service and the connection to the world, including phone calls, SMS, lights, and a robotic arm that can bring you things.
How we built it
I used lerobot, printed and assembled the arms, then trained them on datasets to recognize items and perform certain tasks. Then I wrote software that reads American Sign Language and fires tools (Vapi, Claude, or lerobot) based on those hand signals.
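The recognize-then-dispatch loop described above might look roughly like this sketch. Everything here is a hypothetical stand-in, not the project's actual code: `classify_sign()` stubs out the trained sign-recognition model, and the sign names and `TOOL_MAP` entries are invented for illustration.

```python
# Hypothetical sketch of an ASL-to-tool dispatch loop.
# The real system feeds camera frames to a trained sign-language model;
# classify_sign() below is a stub standing in for that model.

# Invented mapping from recognized signs to the tool that handles them.
TOOL_MAP = {
    "call": "vapi",      # place a phone call via Vapi
    "ask": "claude",     # send a question to Claude
    "fetch": "lerobot",  # tell the robotic arm to bring an item
    "lights": "home",    # toggle the smart lights
}

def classify_sign(frame: str) -> str:
    """Stub classifier: a real version would run a vision model on a frame."""
    return frame.strip().lower()

def dispatch(frame: str) -> str:
    """Route one recognized sign to its tool, or report it as unknown."""
    sign = classify_sign(frame)
    tool = TOOL_MAP.get(sign)
    if tool is None:
        return f"unrecognized sign: {sign}"
    return f"routing '{sign}' to {tool}"
```

For example, `dispatch("fetch")` would route to the arm, while an unmapped gesture falls through to the "unrecognized" branch instead of firing anything.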
Challenges we ran into
Training is huge and crashed the repo; it took me three hours to recover. Other than that, it was just a matter of time... it takes time to train in all the hand signals and actions.
Accomplishments that we're proud of
It freaking works, man. I don't care if we win, as long as we are the inception of something that helps someone without arms or a voice interact with the world and be happy. I am proud of that all day.
What we learned
That I should have started two days early and found teammates to do the training while I was debugging. It's hard for an autistic synesthete to work with others... better to watch. :D
What's next for ASL Commander
I want this for me... full house control by hand clapping, signals, and other inputs that are FAR more effective than voice! (Plus they don't interfere with my voice conversations or listening to music.)