- 2nd Place Independent Entrepreneurship Plan - $300 Scholarship
- Synopsys Science Fair Winner - $80 + Certificate
Crimson was born from a simple question: how do the blind and visually impaired read online? After some research, I realized that online content is read aloud through VoiceOver, a slow, robotic text-to-speech feature that reads everything: the text, the menus, the ads, and more. All a visually impaired reader wants is the main content of the article, and that's the problem Crimson solves.
What it does
The concept of Crimson is simple: revolutionize online reading through audio and voice control. The glaring screens, minuscule text, and headaches and back strain caused by online reading go away, replaced by an app that is entirely voice controlled and reads out the article content you want to listen to. The app listens to your commands and plays articles without any manual interaction with the device.
The voice-control implementation is easy to use. You can navigate the entire app with simple commands such as "Open CNN", "Select Tech", and "Play Article". By incorporating IBM Bluemix, the voice is no longer robotic; instead, a friendly human voice reads out the article. The audio is fully customizable, meaning you can change the pitch, volume, tone, and even the accent.
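The command flow above can be sketched roughly as follows. This is a simplified illustration in Python, not the actual app code (the app itself is Swift with OpenEars), and every name here is hypothetical:

```python
# Hypothetical sketch of Crimson-style command routing: map a recognized
# spoken phrase to an app action. The real app uses OpenEars in Swift;
# this stand-in just shows the verb-then-argument pattern.

def route_command(phrase):
    """Return an (action, argument) pair for a recognized voice phrase."""
    words = phrase.lower().split()
    if not words:
        return ("noop", None)
    verb, rest = words[0], " ".join(words[1:])
    if verb == "open":           # "Open CNN" -> load a news source
        return ("open_source", rest)
    if verb == "select":         # "Select Tech" -> pick a category
        return ("select_category", rest)
    if verb == "play":
        if rest == "all":        # "Play All" -> queue every article
            return ("play_all", None)
        return ("play_article", rest or None)
    return ("unknown", phrase)

print(route_command("Open CNN"))   # ('open_source', 'cnn')
print(route_command("Play All"))   # ('play_all', None)
```

The key design point is that every screen responds to the same small verb vocabulary, so the user never needs to touch the device.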
Crimson also packs in a number of features beyond voice control and full audio customization: playlists, a play-all button, and a backend database. Users can add articles they want to read later to their playlist, and play every article in a category with the command "Play All". Added articles are stored in the Parse database until they are needed again.
How I built it
The iOS app is built entirely in Swift 2.0 and incorporates three main frameworks: IBM Bluemix for the human-sounding text-to-speech of the article content, OpenEars for voice control and recognition, and Parse for the backend database that stores the user's articles. The app strives to be simple, with a clear UI that's easily navigable. The project also uses a Google App Engine server that scrapes the selected content from websites using Python's BeautifulSoup.
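The server-side extraction can be sketched along these lines. This is a simplified illustration, not the actual server code; the function name and the keep-only-paragraphs heuristic are assumptions:

```python
# Simplified sketch of server-side article extraction with BeautifulSoup:
# strip menus, ads, and scripts, keeping only the main article text.
from bs4 import BeautifulSoup

def extract_article_text(html):
    """Return the readable paragraph text of a page, minus chrome."""
    soup = BeautifulSoup(html, "html.parser")
    # Drop elements that VoiceOver would otherwise read aloud as noise.
    for tag in soup(["script", "style", "nav", "aside", "footer"]):
        tag.decompose()
    paragraphs = [p.get_text(" ", strip=True) for p in soup.find_all("p")]
    return "\n\n".join(p for p in paragraphs if p)

html = "<html><nav>Menu</nav><p>Hello world.</p><script>x()</script></html>"
print(extract_article_text(html))  # Hello world.
```

Doing this on the server keeps the iOS app thin: it only ever receives clean article text, ready to hand to the text-to-speech engine.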
Challenges I ran into
The biggest challenge I ran into was incorporating IBM Bluemix into the application. Previously, Crimson used AVFoundation, which had a robotic voice but was relatively simple to integrate. IBM Bluemix's framework relies heavily on Objective-C, and it took me two weeks to add it to my project and resolve all the errors. There isn't much documentation on using IBM Bluemix in iOS apps, and the process was definitely time-consuming.
Another major challenge was passing data from the settings pane back to the play controller and saving it for future use. The user can change the audio settings at any time while listening to an article, so updating the playing audio with the new settings was difficult, as was persisting the new preferences for the next session.
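The persistence half of that problem can be illustrated like this. Again in Python purely as a stand-in for NSUserDefaults-style storage on iOS; the file name, keys, and default values are all hypothetical:

```python
# Hypothetical illustration of persisting audio settings between sessions,
# standing in for NSUserDefaults on iOS.
import json
import os
import tempfile

DEFAULTS = {"pitch": 1.0, "volume": 0.8, "accent": "en-US"}

def load_settings(path):
    """Load saved audio settings, falling back to defaults for missing keys."""
    if os.path.exists(path):
        with open(path) as f:
            saved = json.load(f)
        return {**DEFAULTS, **saved}   # saved values override defaults
    return dict(DEFAULTS)

def save_settings(path, settings):
    """Write the current settings so the next session picks them up."""
    with open(path, "w") as f:
        json.dump(settings, f)

path = os.path.join(tempfile.gettempdir(), "crimson_settings.json")
settings = load_settings(path)
settings["pitch"] = 1.2                # user changed pitch mid-playback
save_settings(path, settings)
print(load_settings(path)["pitch"])    # 1.2
```

Merging saved values over defaults means new settings keys added later still get sensible values for existing users.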
What's next for Crimson
Crimson isn't going to stop here. I want Crimson to convert all sorts of text to audio, including your own PDFs or audiobooks. Incorporating IBM Bluemix's language translation would also let non-native English speakers enjoy the content in the language of their preference.