Coming from a family where nobody can see well without prescription glasses, I have always been interested in making technology more accessible. For me, it is very important to be able to "see" without using my eyes. Voice user interfaces such as Amazon Alexa and Google Assistant are an excellent accessibility option for the visually impaired. Combine the problem with the tools at hand, and a solution emerges.

What it does

Simply put, Alien Browser lets users browse Reddit using Amazon Alexa. Users can "read" subreddits, posts, and comments - all with just their voice.

Users can even hear information about images!

How I built it

Alien Browser has a number of moving parts. An Alexa Skill serves as the main interaction point for users. The skill, in turn, is powered by an AWS Lambda function at the back end, which talks to the Reddit API to fetch data.
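As a minimal sketch of that back end, the Lambda below fetches a subreddit's hot posts and returns them as an Alexa plain-text response. It assumes Reddit's public JSON listing endpoint; the function names and the choice of subreddit are illustrative, not the skill's actual code:

```python
import json
import urllib.request


def extract_posts(listing):
    """Pull title and self-text out of a Reddit listing payload."""
    posts = []
    for child in listing["data"]["children"]:
        data = child["data"]
        posts.append({"title": data["title"], "text": data.get("selftext", "")})
    return posts


def fetch_hot_posts(subreddit, limit=5):
    """Fetch hot posts from Reddit's public JSON listing endpoint."""
    url = f"https://www.reddit.com/r/{subreddit}/hot.json?limit={limit}"
    # Reddit rejects requests without a descriptive User-Agent.
    req = urllib.request.Request(url, headers={"User-Agent": "AlienBrowser/1.0"})
    with urllib.request.urlopen(req) as resp:
        return extract_posts(json.load(resp))


def lambda_handler(event, context):
    """Minimal Alexa-style response carrying the narration text."""
    posts = fetch_hot_posts("news")
    speech = ". ".join(p["title"] for p in posts)
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": False,
        },
    }
```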

The skill makes use of the Alexa Presentation Language (APL) to display images and text to the user. With the power of APL, users can see images and long texts wherever they are - be it an Echo Show, an Echo Spot, or a 52-inch Fire TV.
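For illustration, a RenderDocument directive for an image post could be assembled like this. The layout, token name, and datasource shape are my assumptions for a sketch, not the skill's actual APL template:

```python
def build_apl_directive(image_url, caption):
    """Build an Alexa.Presentation.APL.RenderDocument directive that
    shows an image with a caption on screen-equipped devices."""
    document = {
        "type": "APL",
        "version": "1.4",
        "mainTemplate": {
            "parameters": ["payload"],
            "items": [{
                "type": "Container",
                "items": [
                    # Data-binding expressions pull values from the datasource.
                    {"type": "Image", "source": "${payload.post.imageUrl}",
                     "width": "100vw", "height": "70vh", "scale": "best-fit"},
                    {"type": "Text", "text": "${payload.post.caption}",
                     "textAlign": "center"},
                ],
            }],
        },
    }
    return {
        "type": "Alexa.Presentation.APL.RenderDocument",
        "token": "postToken",
        "document": document,
        "datasources": {"payload": {"post": {"imageUrl": image_url,
                                             "caption": caption}}},
    }
```

The directive goes into the `directives` list of the skill response, alongside the spoken output.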

In addition, Alien Browser uses AI services such as Amazon Polly, Amazon Rekognition, and Amazon Comprehend to provide more advanced features.

Challenges I ran into

Reddit features a plethora of content: news articles, jokes, memes, stories, and more. While voice narration works well for textual content, it doesn't extend well to non-textual content - especially images. With subreddits like r/AdviceAnimals and r/DailyMotivation, images make up a significant part of the Reddit experience. Handling them was a real challenge.

Fortunately, Amazon Rekognition provides handy APIs to extract text and other information from images. With them, Alien Browser can narrate quotes and memes just as well.
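As a sketch, pulling narratable text out of a meme with Rekognition's DetectText API might look like this. The S3 bucket/key convention and the confidence cutoff are illustrative assumptions:

```python
def lines_from_detections(detections, min_confidence=80.0):
    """Keep LINE-level detections above a confidence cutoff.
    WORD-level detections duplicate the lines, so they are skipped."""
    return [d["DetectedText"] for d in detections
            if d["Type"] == "LINE" and d["Confidence"] >= min_confidence]


def detect_image_text(bucket, key):
    """Run Amazon Rekognition DetectText on an image stored in S3."""
    import boto3  # ships with the AWS Lambda Python runtime
    client = boto3.client("rekognition")
    resp = client.detect_text(Image={"S3Object": {"Bucket": bucket, "Name": key}})
    return lines_from_detections(resp["TextDetections"])
```

The joined lines can then be handed to the speech output, so a motivational poster becomes a spoken quote.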

This, coupled with APL, makes for an excellent user experience.

Accomplishments that I'm proud of

Making images accessible was a great challenge to overcome. Providing an inclusive user experience is an accomplishment in itself.

However, Alien Browser doesn't stop there. Many people use Reddit as a news source. Alien Browser offers a novel feature wherein users can ask to hear only positive (or happy) news. The skill filters out the negativity - thanks to sentiment analysis provided by Amazon Comprehend.
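A hedged sketch of that filter, using Comprehend's DetectSentiment API and assuming English-language headlines (the helper names are mine, not the skill's):

```python
def is_positive(text, client=None):
    """Return True when Amazon Comprehend scores the text as POSITIVE."""
    if client is None:
        import boto3  # ships with the AWS Lambda Python runtime
        client = boto3.client("comprehend")
    resp = client.detect_sentiment(Text=text, LanguageCode="en")
    # Sentiment is one of POSITIVE, NEGATIVE, NEUTRAL, or MIXED.
    return resp["Sentiment"] == "POSITIVE"


def happy_news(titles, client=None):
    """Keep only the headlines Comprehend labels as POSITIVE."""
    return [t for t in titles if is_positive(t, client)]
```

Passing the client in explicitly keeps the filter easy to test and reuse outside Lambda.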

What I learned

Providing a good user experience can be challenging. Making that experience accessibility-friendly, however, comes with an entirely new set of unforeseen challenges, and overcoming them was a great learning experience.

What's next for Alien Browser for Reddit

There are a bunch of usability improvements in the pipeline. For starters, Reddit content is written in markdown. Alien Browser will soon handle this, extracting only the text for narration instead of reading the raw markup aloud.
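One way this could be sketched with a few regular expressions - a rough approximation of common Reddit markdown, not a full parser:

```python
import re


def strip_markdown(md):
    """Strip the most common Reddit markdown down to plain narration text."""
    text = re.sub(r"```.*?```", "", md, flags=re.DOTALL)       # drop fenced code
    text = re.sub(r"\[([^\]]*)\]\([^)]*\)", r"\1", text)       # links -> label only
    text = re.sub(r"(\*{1,3}|_{1,3}|~~)(.+?)\1", r"\2", text)  # bold/italic/strike
    text = re.sub(r"^#{1,6}\s*", "", text, flags=re.MULTILINE) # heading markers
    text = re.sub(r"^>\s?", "", text, flags=re.MULTILINE)      # blockquote markers
    return re.sub(r"\n{3,}", "\n\n", text).strip()             # collapse blank runs
```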

Alien Browser also needs to come to more platforms, starting with Google Home.
