With so many different cultures, peoples, and languages on this planet, it can be hard to understand one another.

What it does

With a simple double tap on the back of their phone, users can translate the text on their phone's screen into a language of their choice.

How it works

First, the app uses an audio recognition algorithm to detect when a user has double-tapped their phone. The app then takes a screenshot and sends it to an AWS Elastic Beanstalk application, which uploads the image file to an S3 bucket. Once the image has been uploaded, the Elastic Beanstalk application calls the AWS Lambda function that does all the processing.
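
The upload-then-invoke step could be sketched roughly as below. The bucket and function names are made up (the write-up doesn't give them), and the boto3 clients are passed in as parameters so the flow can be exercised without real AWS credentials:

```python
import json

# Hypothetical names -- the write-up does not state the real ones.
BUCKET = "trans-screenshots"
LAMBDA_FUNCTION = "trans-process-image"

def upload_and_process(s3_client, lambda_client, image_path, key):
    """Upload a screenshot to S3, then invoke the processing Lambda.

    In production `s3_client` and `lambda_client` would be
    boto3.client("s3") and boto3.client("lambda"); they are injected
    here so the flow can be tested with stubs.
    """
    # Put the screenshot into the bucket first...
    s3_client.upload_file(image_path, BUCKET, key)
    # ...then tell the Lambda function where to find it.
    payload = {"bucket": BUCKET, "key": key}
    return lambda_client.invoke(
        FunctionName=LAMBDA_FUNCTION,
        Payload=json.dumps(payload).encode("utf-8"),
    )
```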

In AWS Lambda, the image is first split into lines before being sent to Amazon Rekognition, since Rekognition can only recognize up to 50 words per image.

For each line, Amazon Rekognition is called to detect the text in that line. At this point the Lambda function also detects the background color, text color, and font, and saves the positions of the text.
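
The per-line detection step might look like the sketch below, which calls Rekognition's DetectText API and keeps each LINE-level result together with its bounding box. The color and font detection mentioned above is not part of Rekognition's response and is presumably done separately by sampling pixels, so it is not shown here; the client is again injected so the parsing can be tested with a canned response:

```python
def detect_line_text(rekognition, image_bytes):
    """Run DetectText on one image strip and collect each LINE-level
    detection with its bounding box (Rekognition returns coordinates
    as fractions of the image's width and height).
    """
    response = rekognition.detect_text(Image={"Bytes": image_bytes})
    results = []
    for det in response["TextDetections"]:
        # Rekognition returns both WORD and LINE entries; keep lines.
        if det["Type"] == "LINE":
            box = det["Geometry"]["BoundingBox"]
            results.append({
                "text": det["DetectedText"],
                "left": box["Left"], "top": box["Top"],
                "width": box["Width"], "height": box["Height"],
            })
    return results
```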

After all the text detection is done, AWS Translate is called to translate the text, and the fully translated image is returned.
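
Assuming each detected line is kept as a dict with its text and position (as in the detection step), the translation pass might be sketched like this, with the boto3 Translate client injected so it can be stubbed:

```python
def translate_lines(translate, lines, target_lang):
    """Translate each detected line with AWS Translate, preserving the
    saved position info so the translated text can be drawn back onto
    the original image in the right place.
    """
    out = []
    for line in lines:
        resp = translate.translate_text(
            Text=line["text"],
            SourceLanguageCode="auto",   # let Translate detect the source
            TargetLanguageCode=target_lang,
        )
        # Keep position/color metadata, swap in the translated text.
        out.append({**line, "text": resp["TranslatedText"]})
    return out
```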

What's next for Trans

We'll see after the hackathon!
