Inspiration

The idea came from the realization that deaf people are able to speak; however, because of their condition, doing so is extremely difficult, and they cannot receive auditory feedback on their pronunciation.

What it does

“Ah” is an iOS application that uses the iPhone X’s TrueDepth camera to map the shape of the mouth and the extent to which it opens and stretches in various directions and orientations. The iPhone’s microphone listens to the user’s pronunciation and provides feedback on it.
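As a rough sketch of that idea (not the project’s actual code), ARKit’s face tracking on a TrueDepth device exposes mouth-related blend-shape coefficients that could be read roughly like this; the blend-shape names are real ARKit coefficients, while the thresholds and the “ah” check are illustrative assumptions:

```swift
import ARKit

// Sketch: read mouth-related blend-shape coefficients from an ARKit face anchor.
// Requires a TrueDepth-equipped device (e.g. iPhone X).
final class MouthTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only supported on TrueDepth hardware.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let shapes = face.blendShapes

        // Each coefficient runs from 0.0 (neutral) to 1.0 (fully expressed).
        let jawOpen      = shapes[.jawOpen]?.floatValue ?? 0
        let stretchLeft  = shapes[.mouthStretchLeft]?.floatValue ?? 0
        let stretchRight = shapes[.mouthStretchRight]?.floatValue ?? 0
        let pucker       = shapes[.mouthPucker]?.floatValue ?? 0

        // Illustrative check for an open, unrounded "ah" mouth shape.
        if jawOpen > 0.5 && pucker < 0.2 {
            print("Mouth shape looks like 'ah' (jawOpen: \(jawOpen))")
        } else {
            print("jawOpen: \(jawOpen), stretch L/R: \(stretchLeft)/\(stretchRight)")
        }
    }
}
```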

How we built it

The group built the project in Swift using Apple’s Xcode IDE. Apple had recently released ARKit to developers, and with it and the iPhone X’s TrueDepth camera the group was able to determine the physical position and orientation of the mouth. The user’s phonemes (their pronunciation of various words) were detected and quantified using the OpenEars library and the iPhone’s microphone. AVKit was used to present instructional videos that show the correct mouth shape for a given word or phoneme.
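A minimal sketch of the AVKit piece, assuming the instructional clips are bundled with the app under names like "ah_demo.mp4" (the asset naming and presentation flow here are assumptions for illustration):

```swift
import AVKit
import UIKit

// Sketch: present an instructional mouth-shape video for a phoneme
// using AVKit's built-in player view controller.
func playInstructionVideo(for phoneme: String, from presenter: UIViewController) {
    // Assumed bundled asset name, e.g. "ah_demo.mp4".
    guard let url = Bundle.main.url(forResource: "\(phoneme)_demo", withExtension: "mp4") else { return }

    let playerVC = AVPlayerViewController()
    playerVC.player = AVPlayer(url: url)
    presenter.present(playerVC, animated: true) {
        playerVC.player?.play()
    }
}
```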

Challenges we ran into

The first challenge the group faced was detecting and quantifying the user’s mouth, lip, and jaw movement. The next, and hardest, challenge was phoneme recognition: with pure digital-signal-processing methods it was hard to distinguish certain phonemes, since they produce very similar signals.

Accomplishments that we're proud of

The group is proud that we were able to determine the shape and orientation of a user’s mouth by quantifying their mouth opening, jaw drop, and lip movement with the iPhone X’s TrueDepth camera. We are also proud of using the OpenEars API to quantify the user’s pronunciation of various phonemes.

What we learned

The group learned the potential of depth cameras and speech recognition.

What's next for Ah

E, I, O, U!

Built With

Swift, ARKit, AVKit, OpenEars, iPhone X TrueDepth camera
