Inspiration

We really connected with the mission of the charity and liked the challenge of building a solution that delivered a personal experience while being scalable from the outset.

What it does

Chat sessions are created between a sponsor and a child; each conversation is transcribed and then analysed for potentially harmful words or phrases. The framework is built to combine several filtering methods to get the best result: keyword searching, sentiment analysis and manual intervention.
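A rough Python sketch of how the first two filters could look as small Lambda-friendly functions. The blocked-term list, the negativity threshold and the use of Amazon Comprehend for sentiment scoring are illustrative assumptions, not our exact implementation:

```python
import boto3

# Illustrative blocklist only; the real list would be curated with the charity.
BLOCKED_TERMS = {"address", "phone number", "meet me"}

comprehend = boto3.client("comprehend")

def keyword_filter(text: str) -> bool:
    """Flag a snippet if it contains any blocked term."""
    lowered = text.lower()
    return any(term in lowered for term in BLOCKED_TERMS)

def sentiment_filter(text: str, threshold: float = 0.8) -> bool:
    """Flag a snippet that Comprehend scores as strongly negative (threshold is a guess)."""
    result = comprehend.detect_sentiment(Text=text, LanguageCode="en")
    return (result["Sentiment"] == "NEGATIVE"
            and result["SentimentScore"]["Negative"] >= threshold)

def is_harmful(text: str) -> bool:
    """Anything flagged by either automated filter is routed to manual review."""
    return keyword_filter(text) or sentiment_filter(text)
```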

How we built it

S3 sits at the core of the application because of the amount of file processing involved. Audio files are recorded and saved to S3, and the resulting transcriptions are saved to S3 as well, at which point AWS Step Functions kicks off the Lambdas running the various filters. If a snippet in a given session passes the filtering, the text is translated and saved back to S3, where it is picked up and converted back to audio.
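As a sketch, the glue between S3 and Step Functions could be a small Lambda subscribed to the transcription bucket's object-created events; the state machine ARN below is a placeholder, not our actual resource:

```python
import json
import boto3

sfn = boto3.client("stepfunctions")

# Placeholder ARN; in practice this would come from an environment variable.
STATE_MACHINE_ARN = "arn:aws:states:eu-west-1:123456789012:stateMachine:elliot-filtering"

def handler(event, context):
    """S3-triggered Lambda: starts the filtering state machine for each new transcription."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"bucket": bucket, "key": key}),
        )
```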

S3 hosts both the recording application and the playback application.
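Both front ends are plain static sites, so hosting is just a bucket with website hosting switched on. A minimal boto3 sketch (bucket name hypothetical):

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket for the recording app; the playback app is set up the same way.
s3.put_bucket_website(
    Bucket="elliot-recorder-app",
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)
```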

Challenges we ran into

Amazon Lex was unable to parse free-text (paragraph-length) speech. We also ran into permissions issues with AWS CodeCommit and CodeBuild. We attempted to reuse the Lex JavaScript voice recorder to upload audio to S3 and then use Amazon Transcribe to extract the text; however, the Lex recorder stored the files in an awkward format, so we ended up ripping it out.
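For reference, the Transcribe side of that flow is straightforward once the audio lands in S3 as a plain WAV or MP3; the bucket names, media format and job-naming scheme here are illustrative:

```python
import boto3

transcribe = boto3.client("transcribe")

def start_transcription(bucket: str, key: str) -> None:
    """Kick off an Amazon Transcribe job for an uploaded audio snippet."""
    transcribe.start_transcription_job(
        TranscriptionJobName=key.replace("/", "-"),  # job names must be unique
        Media={"MediaFileUri": f"s3://{bucket}/{key}"},
        MediaFormat="wav",           # assumes the recorder writes plain WAV
        LanguageCode="en-US",
        OutputBucketName="elliot-transcriptions",  # hypothetical output bucket
    )
```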

Accomplishments that we're proud of

Implementing an application using at least five AWS products we had never used before.

What we learned

Step Functions looks like an amazing way of pulling together multiple serverless microservices. In our workplace we've found that isolated serverless microservices are hard to keep track of, so we've tended to steer clear of them; Step Functions might make us look at them again.
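To make that concrete, the filtering pipeline can be expressed as a single Amazon States Language definition that chains the filter Lambdas; the state names and ARNs below are placeholders for illustration:

```python
import json

# Minimal Amazon States Language definition chaining the filters in order.
# ARNs and state names are placeholders, not our actual state machine.
FILTER_PIPELINE = {
    "Comment": "Run each filter in turn, then translate and synthesise the approved text.",
    "StartAt": "KeywordFilter",
    "States": {
        "KeywordFilter": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:keyword-filter",
            "Next": "SentimentFilter",
        },
        "SentimentFilter": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:sentiment-filter",
            "Next": "TranslateAndSpeak",
        },
        "TranslateAndSpeak": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:eu-west-1:123456789012:function:translate-and-speak",
            "End": True,
        },
    },
}

print(json.dumps(FILTER_PIPELINE, indent=2))
```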

What's next for Elliot

  • Implement machine learning on the second filter, and find a way to train the models, initially on feedback from the child during and after chat sessions.
  • Bring in Mechanical Turk review, possibly once a conversation hits one or two strikes, to generate additional data for training the models.
  • Build out the UI for both sponsor and child.
  • Metrics, dashboards, reports.
  • Implement an escalation framework for chat sessions that are blocked (see the sketch after this list).
  • Better feedback for the sponsor to indicate when strikes occur in a session, in order to steer them away from potentially harmful conversations.
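One way the strike counting behind the last two items could be wired up is a per-session counter in DynamoDB with an SNS notification once it crosses a threshold; the table, topic and threshold here are purely hypothetical:

```python
import boto3

dynamodb = boto3.resource("dynamodb")
sns = boto3.client("sns")

SESSIONS_TABLE = dynamodb.Table("elliot-sessions")  # hypothetical table
ESCALATION_TOPIC_ARN = "arn:aws:sns:eu-west-1:123456789012:elliot-escalations"  # hypothetical
STRIKE_THRESHOLD = 2  # illustrative

def record_strike(session_id: str) -> int:
    """Increment the session's strike count and escalate once the threshold is hit."""
    result = SESSIONS_TABLE.update_item(
        Key={"sessionId": session_id},
        UpdateExpression="ADD strikes :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    strikes = int(result["Attributes"]["strikes"])
    if strikes >= STRIKE_THRESHOLD:
        sns.publish(
            TopicArn=ESCALATION_TOPIC_ARN,
            Subject="Chat session escalated",
            Message=f"Session {session_id} has reached {strikes} strikes.",
        )
    return strikes
```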
