As college students, we've run into the common problem of losing sleep to heavy workloads and stress. We wanted to integrate existing technologies to make visualizing sleep easier, so we wouldn't have to spend time plotting our sleep data by hand. We also wanted an interactive way to improve our sleep habits one step at a time, which is why we created SleepSights.

What it does

SleepSights is an Alexa multimodal skill that lets users view their sleep data through helpful graphs in addition to hearing a spoken summary of the previous night's sleep. The user can ask, "Alexa, ask sights how long did I sleep," and the skill responds with the hours and minutes slept the previous night, along with a graphic that incorporates the sleep goal stored in the user's Fitbit account.
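Fitbit reports sleep duration in minutes, so the spoken answer boils down to a minutes-to-hours conversion plus a comparison against the goal. A minimal sketch of that step (the function name and wording are our own illustration, not taken from the actual skill):

```python
def format_sleep_speech(minutes_asleep: int, goal_minutes: int) -> str:
    """Turn a Fitbit minutes-asleep total into a spoken Alexa answer."""
    hours, minutes = divmod(minutes_asleep, 60)
    speech = f"You slept {hours} hours and {minutes} minutes last night."
    if minutes_asleep >= goal_minutes:
        speech += " You met your sleep goal."
    else:
        # Report how far short of the stored goal the user fell.
        speech += f" You were {goal_minutes - minutes_asleep} minutes short of your goal."
    return speech

# 432 minutes asleep against an 8-hour (480-minute) goal:
print(format_sleep_speech(432, 480))
# You slept 7 hours and 12 minutes last night. You were 48 minutes short of your goal.
```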

How we built it

The Alexa skill is hosted on AWS Lambda, with the Alexa Skills Kit attached as the trigger and Amazon S3 and Amazon CloudWatch as outputs. The Lambda function is written in Python and uses several libraries: boto3 (the AWS SDK for Python), requests-oauthlib for communicating with the Fitbit API, the json module for parsing Fitbit data and building the data sent back to the skill, and Plotly for programmatically generating graphs from that data. We also developed an APL template that displays custom graphs alongside custom text; the template is linked into the Lambda function and sent in the response JSON. In addition, the skill supports account linking, so users can connect their Fitbit account to get real-time data.
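Putting those pieces together, the Lambda handler ultimately returns a response JSON containing both the spoken output and an APL RenderDocument directive. The sketch below shows that shape with an inline illustrative APL document (a single Image component); the real skill uses its own template and a graph image generated by Plotly and hosted on S3, so the document body and the URL here are placeholders:

```python
import json

def build_alexa_response(speech_text: str, graph_url: str) -> dict:
    """Sketch of the response JSON a Lambda handler returns: plain-text
    speech plus an APL RenderDocument directive that feeds the graph URL
    into the template via a datasource."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "directives": [
                {
                    "type": "Alexa.Presentation.APL.RenderDocument",
                    "token": "sleepGraphToken",
                    "document": {
                        "type": "APL",
                        "version": "1.4",
                        "mainTemplate": {
                            "parameters": ["payload"],
                            "items": [
                                {
                                    # Illustrative stand-in for the real template.
                                    "type": "Image",
                                    "source": "${payload.sleepData.graphUrl}",
                                    "width": "100%",
                                    "height": "100%",
                                }
                            ],
                        },
                    },
                    "datasources": {"sleepData": {"graphUrl": graph_url}},
                }
            ],
            "shouldEndSession": True,
        },
    }

# Hypothetical S3 URL for the Plotly-rendered graph.
resp = build_alexa_response(
    "You slept 7 hours and 12 minutes last night.",
    "https://example-bucket.s3.amazonaws.com/sleep-graph.png",
)
print(json.dumps(resp, indent=2))
```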

Challenges we ran into

We've all used AWS at several hackathons, but we had never combined this many AWS resources to produce both auditory and visual output for the user. We're also relatively inexperienced in graphic design, so styling the charts to fit the multimodal screen was difficult. Finally, it took some time to integrate APL documents with the actual Alexa skill, but learning that process was well worth it.

Accomplishments that we're proud of

Using our prior knowledge of and experience with JSON objects, we were able to learn and apply a developing technology (APL). Additionally, we retrieved Fitbit data from the API and converted it into a format that users can read in real time.
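As an illustration of that conversion step, Fitbit's sleep-by-date endpoint (GET /1.2/user/-/sleep/date/{date}.json) returns JSON shaped roughly like the trimmed-down sample below; the values are invented, and the helper is our own sketch rather than the skill's actual code:

```python
import json

# Invented sample of the kind of payload the Fitbit sleep endpoint returns.
sample = json.loads("""
{
  "sleep": [
    {"dateOfSleep": "2019-10-26", "minutesAsleep": 432, "timeInBed": 470}
  ],
  "summary": {"totalMinutesAsleep": 432, "totalTimeInBed": 470}
}
""")

def summarize_sleep(payload: dict) -> dict:
    """Reduce the Fitbit response to the few fields the skill speaks and plots."""
    total = payload["summary"]["totalMinutesAsleep"]
    hours, minutes = divmod(total, 60)
    return {
        "hours": hours,
        "minutes": minutes,
        "time_in_bed": payload["summary"]["totalTimeInBed"],
    }

print(summarize_sleep(sample))
```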

What we learned

We learned how to use the Alexa Presentation Language (APL), as well as the OAuth2 authorization framework with the Fitbit API.
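The OAuth2 flow starts by sending the user to Fitbit's authorization endpoint with the skill's client ID and requested scopes. A stdlib-only sketch of building that URL (the client ID and redirect URI below are placeholders, not real credentials; the skill itself uses requests-oauthlib for the full flow):

```python
from urllib.parse import urlencode

# Fitbit's OAuth2 authorization endpoint.
AUTHORIZE_URL = "https://www.fitbit.com/oauth2/authorize"

def build_authorization_url(client_id: str, redirect_uri: str) -> str:
    """Build the URL account linking sends the user to, requesting only
    the 'sleep' scope the skill needs."""
    params = {
        "response_type": "code",   # authorization-code grant
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": "sleep",
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}"

# Placeholder values for illustration only.
url = build_authorization_url("ABC123", "https://example.com/callback")
print(url)
```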

What's next for SleepSights

In the future, we hope to expand the sleep analysis that SleepSights offers. For one, we want to add more graphs and visuals so users can better understand and improve their sleep habits. We also want to develop features that offer recommendations to the user, helping them further improve their sleep.
