Inspiration
My grandma, Wai-Yul Kwan, immigrated to Canada from Hoiping, China in 1981. She raised four children and worked several jobs in the Hamilton area, all while unable to understand or speak English. The barriers that made learning English a struggle included time and financial constraints, age-related learning difficulty, a lack of contextual and culturally relevant education, fear of losing her native identity, and the pressure to navigate essential systems without accessible language support. Importantly, proficiency in the host country's dominant language is a critical gateway to immigrants' socioeconomic mobility, enhancing both economic productivity and integration into society (Chiswick, Barry R., and Paul W. Miller. The International Transferability of Immigrants' Human Capital. IZA Institute of Labor Economics, May 2018. https://docs.iza.org/dp11485.pdf).
What it does
immi is an assistive technology and personal education platform that supports non-native speakers as they adapt to and learn about their new environment, powered by ElevenLabs and a facial API. By translating and speaking in the user's native language, then progressing toward more familiar accents, immi naturally comforts the user and builds familiarity with the new language.
The full program is presented as a web page that explains immi's process, example uses, and frequently asked questions. Buttons direct you to a dashboard featuring immi's "Smart Lens," which combines ElevenLabs' conversational agent with real-time biometrics collected by a camera to act as the user's life aid. The role-play feature is immi's primary education source: it uses pre-loaded situations or prompts to create practice for real-world scenarios, essential preparation for users' interactions with the world. It is intended to run on simple mobile devices, keeping it broadly accessible.
The hardware piece runs live translation between two pre-loaded languages: it auto-detects speech input from a microphone and outputs the same message in the other language through a speaker. This enables real, human engagement with others despite incompatible languages. It is intended to run on immi's own device, which provides a simpler, limited interface with only the essential functions.
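The two-language routing described above can be sketched as a small pure function. This is an illustrative simplification, not immi's actual code: the function name `routeTranslation` is hypothetical, and the real speech detection and ElevenLabs calls are left out.

```javascript
// Illustrative sketch of the device's routing logic: given the detected
// language of a speech input, choose the other pre-loaded language for output.
// The real device would feed speech-to-text and ElevenLabs audio around this.
const PRELOADED = ["en", "zh"]; // the two pre-loaded languages

function routeTranslation(detectedLang) {
  if (!PRELOADED.includes(detectedLang)) {
    throw new Error(`Unsupported language: ${detectedLang}`);
  }
  // Output in whichever pre-loaded language was NOT spoken.
  return detectedLang === PRELOADED[0] ? PRELOADED[1] : PRELOADED[0];
}
```

With only two pre-loaded languages, the routing stays trivial, which is what lets the device keep a simple, limited interface.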
How we built it
The team is made up of four second-generation immigrants to whom the immigrant integration experience is familiar through our parents' stories. Despite our parents coming from different parts of the world, we found commonalities in how they all faced difficulties adapting to Canada, and in how their experience could have been better. We wanted to create something for change, to improve personal lives and, in turn, the community.
We brainstormed from these personal experiences, asked other hackers at DeltaHacks12 to share their stories, and landed on our key functions. A GitHub repository was set up and structured to accommodate our requirements. An index.html and style.css formed the landing page, and we used the Live Server extension from the GitHub marketplace to view the page as we updated it throughout the design process. The facial API was modelled on online resources and updated for modern facial tracking, webcam compatibility, and integration into the .html page. An ElevenLabs agent was created in page.html and page2.html and customized through rigorous prompt training; it was tested with multiple voices before being integrated into the .html page. immi currently accommodates English and Mandarin Chinese, and the website can be viewed in English, Mandarin Chinese, Hindi, and French via lang.js. The hardware came together after flashing 64-bit Debian Trixie onto a Raspberry Pi 4, which calls the ElevenLabs API and uses a connected microphone and speaker. script.js acts as the "spine" of our platform.
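One plausible shape for lang.js's site-language switching is a lookup table keyed by language code with an English fallback. This is a hedged sketch: the real lang.js is not shown here, and the `translations` keys and strings below are illustrative placeholders.

```javascript
// Sketch of how lang.js might map page text across the four supported
// site languages. Keys and strings are illustrative placeholders only.
const translations = {
  en: { welcome: "Welcome to immi" },
  zh: { welcome: "欢迎来到 immi" },
  hi: { welcome: "immi में आपका स्वागत है" },
  fr: { welcome: "Bienvenue sur immi" },
};

function translate(key, lang) {
  // Fall back to English when the language or key is missing.
  const table = translations[lang] || translations.en;
  return table[key] ?? translations.en[key];
}
```

In the browser, a script like this would walk tagged elements and replace their text with `translate(key, selectedLang)` when the user picks a language.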
Challenges we ran into
Working through outdated tutorial videos and software, as well as repurposing semi-related tutorials, brought several challenges, but it led us in the right directions. It required us to diagnose and understand our agents more deeply than is typical.
Accomplishments that we're proud of
DeltaHacks12 is the first hackathon for two of our members, and we are very proud of how much we experienced and completed in the sprint. This is also the first time that all four of us have worked together in this manner. Though we grew up in the same high school with a shared interest in engineering and computer science, we have since gone to different universities and paths and are all studying different disciplines. Still, we were able to connect our learning from personal life to produce a single product that helps the world, one that brought us closer together through learning, diagnosing, debugging, trials and tribulations, and finally getting it all to work.
What we learned
Building off the challenges we ran into, agent training helped us better understand how the API "thinks," and integrating the agent into the .html page taught us the extent and possibilities of these technologies, even while using the APIs for the first time.
What's next for immi
immi looks forward to integrating more languages, providing a world map that shows the exact languages and dialects spoken in each area, and further expanding our audience to the familial dynamics of immigrant families, people living with disabilities, and people wanting to learn a new language.
Built With
- canva
- css
- elevenlabs
- faceapi
- github
- html
- javascript
- machine-learning
- python
- raspberrypi
- terminal
