Life’s emotional roller coaster rides through its highest pleasures and its lowest tragedies. In those lowest times, we often want a story we can relate to. Heartbroken on similar occasions ourselves, our team wanted to help those in pain by creating a custom story generator to help you recover from your current aches. Yes, you could go to a random blog and slog through pounds of other people's problems, and maybe find a kindred soul in the rough. Instead, we've pulled the best stories from around the web, and we use advanced machine learning combined with bleeding-edge semantic context-awareness algorithms to tailor the perfect story just for you.
What it does
You tell a story about yourself. Maybe you had a bad day, maybe you share your worst day ever, maybe you just rant. We analyze the key emotions and concepts in your life, and design a custom book of short stories to help you feel better. It's like a therapist, a favourite book, a good friend, and a nice bowl of chicken soup all wrapped into one. Oh yeah, and it's free.
Who we are
Three Canadian high schoolers who make really neat things and sometimes funny jokes.
How we built it
The whole ordeal has a few different sections. It's written (almost) entirely in [Python 3]. The public-facing server and website are written with Flask, a micro-framework for web development, so we have total control over what goes on. That makes it fast, light, and secure. The server connects to a machine learning framework, TensorFlow. We use a Recurrent Neural Network (RNN) to write sentences based on the input provided. Keywords are identified using the MeaningCloud API. Once the RNN has generated the beginning, middle, and end of the story multiple times over, we pick the best sentences in each category and assemble them into a work of art for the user. Each and every part of the system was carefully crafted, and each of us knows our own sub-system, and how it integrates with everything else, like we know ourselves.
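The final assembly step described above (generate several candidate beginnings, middles, and ends, then keep the best of each) could look roughly like this sketch. The scoring heuristic and all function names here are our own illustration, not the project's actual code:

```python
# Hypothetical sketch of the story-assembly step: the RNN produces several
# candidate sentences per section, and we score each candidate against the
# keywords extracted from the user's story, keeping the best of each section.

def score(sentence, keywords):
    """Toy relevance score: count how many keywords appear in the sentence."""
    words = set(sentence.lower().replace(".", "").replace(",", "").split())
    return sum(1 for kw in keywords if kw.lower() in words)

def assemble_story(candidates, keywords):
    """candidates: dict mapping 'beginning'/'middle'/'end' to sentence lists."""
    parts = []
    for section in ("beginning", "middle", "end"):
        best = max(candidates[section], key=lambda s: score(s, keywords))
        parts.append(best)
    return " ".join(parts)

candidates = {
    "beginning": ["It was a rainy Monday.", "The exam loomed over her."],
    "middle": ["She studied all night.", "Rain fell harder."],
    "end": ["Morning came, and she passed.", "The rain stopped."],
}
print(assemble_story(candidates, ["exam", "studied", "passed"]))
# → The exam loomed over her. She studied all night. Morning came, and she passed.
```

In the real system the score would come from the model and the keyword extraction from MeaningCloud; the structure of the pipeline is the same.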
Challenges we ran into
So many challenges. Just all of the challenges. Not a single one of us had ever used ML before, and between the three of us, none had much experience with Python. We had to learn about all that and more, like SSL (our site uses valid HTTPS certificates), foreign APIs, setting up servers, plus a hundred other things that can't justly be put into words. We'd never worked together before this, yet by the end we were an undeniable team, eating, sleeping, and breathing together.
Time was also working against us. For an ideal training run, we needed about 200 epochs. That would've required
People are telling us their stories, and we take that seriously. We can't let those personal details leak onto the internet, so we don't let it happen. Every request to and from the website happens over HTTPS, with real signed SSL certificates. No user data is stored on the server either, so in the event our server were compromised, there would be nothing of value to take (except our tears). So you can check that little green lock in the corner of your screen and know that your data is safe.
Accomplishments that we're proud of
When most people see a beautiful diamond, they're utterly stunned by its radiant beauty. When the jeweller sees their diamond on display, they see not only the beauty but the delicate precision of every cut and facet. This project is our diamond; we are proud of every minute detail of this brilliant gem of a website. Each of us poured ourselves into the work, and we think it really shows through.
Jack: I'm most proud of the RNN. By feeding it word by word, instead of character by character, it feels more like a human writing, as it uses natural contractions and still manages to maintain proper grammar. It's actually three totally separate models that get threaded together at the end. TensorFlow wasn't designed to run three at a time (its documentation assumes that one will always be enough), so I actually had to run them as Unix sub-processes. I'd also never touched ML before, so I learned a lot at every step.
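Running each model as its own sub-process sidesteps the one-graph-per-process assumption, since every script gets a clean interpreter. A minimal sketch of that workaround, with purely illustrative script names (the actual scripts and arguments aren't ours to guess):

```python
# Hypothetical sketch: launch each generator script in its own process and
# collect its printed output, so the three TensorFlow graphs never share state.
import subprocess
import sys

def generate_part(script, prompt):
    """Run one model script as a sub-process and return what it prints."""
    result = subprocess.run(
        [sys.executable, script, prompt],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

# Illustrative usage, one script per story section:
# parts = [generate_part(s, "my bad day") for s in
#          ("beginning_model.py", "middle_model.py", "end_model.py")]
```

The cost is a process launch per generation, but in exchange each model is fully isolated from the others.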
Carol: I learned so much, and I didn't get distracted! I wrote my first ever web-crawler, which was definitely an ordeal. Fun fact: people who make websites don't like bots. That means that when you make a web-crawler, you have to make it look like it's not a bot. In this case, that meant changing the user-agent to something that wasn't clearly a programming language. So a couple of websites out there got visited by some H4X0R user-agents this weekend.
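The user-agent trick amounts to one extra header on each request. A minimal stdlib sketch (the URL and header string are illustrative, not the crawler's actual values):

```python
# Sketch of fetching a page with a browser-like User-Agent so the request
# doesn't advertise itself as a script.
from urllib.request import Request, urlopen

def fetch(url):
    req = Request(url, headers={
        # The default would be something like "Python-urllib/3.x", which many
        # sites block on sight; a browser-style string blends in.
        "User-Agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36",
    })
    with urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

The same idea applies with any HTTP client: set the `User-Agent` header explicitly instead of sending the library's default.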
Curtis: (He was sleeping, but just imagine he said some deep and moving stuff. He dealt with a lot of issues, like strange encodings and undocumented APIs.)
What's next for AlphabetSoup
Sell out to Google for $30M.