Inspiration
Sometimes the best solution is just having someone to talk to. We wanted to make a chatbot-esque skill for Alexa that listens, much as many therapists do. It's good not to judge, and not to jump to instant conclusions, when talking to someone about how they're feeling.
What it does
Our bot doesn't give any insightful opinions, advice or information - it doesn't need to. By simply processing the positivity/negativity and basic keywords of what the user says, it replies with an abstract response that encourages the speaker to keep going.
How I built it
We started with a skill created in the Alexa Developer Console, which we then linked to a Lambda function on Amazon Web Services (AWS). We then added a Lambda layer with some of the libraries we were going to use, and connected the back end to Amazon Comprehend (another AWS service), which allowed us to process the user's speech.
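The back end described above can be sketched roughly as follows. This is a minimal illustration, not our exact code: the handler name, the `speech` slot name, and the reply wording are all assumptions, while `detect_sentiment` is Comprehend's real API call (available through `boto3`, which the AWS Lambda Python runtime provides by default).

```python
def choose_reply(sentiment: str) -> str:
    """Map Comprehend's sentiment label to an abstract, encouraging reply.

    The reply texts here are illustrative placeholders, not our exact wording.
    """
    replies = {
        "POSITIVE": "That sounds good. Tell me more.",
        "NEGATIVE": "That sounds hard. I'm listening.",
        "NEUTRAL": "I see. Go on.",
        "MIXED": "It sounds like there's a lot there. Keep going.",
    }
    return replies.get(sentiment, "I'm listening.")


def lambda_handler(event, context):
    # boto3 is bundled with the AWS Lambda Python runtime.
    import boto3

    # Hypothetical slot name "speech" capturing what the user said.
    utterance = event["request"]["intent"]["slots"]["speech"]["value"]

    # DetectSentiment returns a label such as "POSITIVE" or "NEGATIVE".
    comprehend = boto3.client("comprehend")
    result = comprehend.detect_sentiment(Text=utterance, LanguageCode="en")
    reply = choose_reply(result["Sentiment"])

    # Minimal Alexa skill response envelope; keep the session open
    # so the user can keep talking.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": reply},
            "shouldEndSession": False,
        },
    }
```

The idea is that the skill never needs to understand the content deeply: a coarse sentiment label is enough to pick a short, open-ended prompt that keeps the conversation going.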
Challenges I ran into
Connecting Comprehend to our Lambda function was really hard. We'd never used these services before, so tasks that at first seemed easy turned out to be very time-consuming.
Accomplishments that I'm proud of
Actually finishing! This was every member's first hackathon, so we're really happy to have created something by the end of it. It's not perfect, but we genuinely feel it could be useful.
What I learned
Making an Alexa skill is challenging. AWS is a very powerful platform that could enable many different, interesting projects.
What's next for Active-Listening
Making it more human-like, and perhaps introducing more understanding of user input.
Built With
- adc
- amazon-comprehend
- amazon-web-services
- lambda
- python