No mobile device currently tracks a person's proximity to a screen or other reading medium over time.

What it does

Calculates the distance between the screen and the user's face and records each individual data point in a "session". Sessions can then be analysed for trends: e.g. when the proximity shrinks over time, the user is warned that they may want to see an eye doctor.
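The trend check described above can be sketched as follows: fit a least-squares line through each session's average distance and warn when the slope shows the face steadily creeping closer to the screen. This is a minimal illustration, not the project's actual code; the function names and the warning threshold are assumptions.

```python
def average_distance(session):
    """Mean of one session's distance readings (in cm)."""
    return sum(session) / len(session)

def proximity_trend(sessions):
    """Least-squares slope of session-average distance over session index.

    A clearly negative slope means the face is, on average, getting
    closer to the screen from session to session.
    """
    means = [average_distance(s) for s in sessions]
    n = len(means)
    xs = range(n)
    x_bar = sum(xs) / n
    y_bar = sum(means) / n
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, means))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

def should_warn(sessions, threshold_cm_per_session=-1.0):
    """Warn once the proximity shrinks faster than the (assumed) threshold."""
    return proximity_trend(sessions) <= threshold_cm_per_session

# Three sessions of cm readings, drifting closer over time.
sessions = [[40, 42, 41], [38, 37, 39], [33, 34, 32]]
print(should_warn(sessions))  # prints True
```

Per-session averaging keeps the trend robust to single noisy sensor readings within a session.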

How I built it

Used a Raspberry Pi with an ultrasonic ping sensor hooked up for proximity measurement, and a small push-button switch to start and stop sessions (for now). All data goes straight to AWS DynamoDB (a design choice made half by accident).
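The measurement path can be sketched like this, assuming an HC-SR04-style ping sensor: the sensor reports the round-trip time of an ultrasonic pulse, so half the echo duration times the speed of sound gives the screen-to-face distance, and each reading becomes one DynamoDB item keyed by session and timestamp. The function names and item shape are illustrative assumptions, not the project's actual code.

```python
SPEED_OF_SOUND_CM_S = 34300  # in air, at roughly room temperature

def echo_to_distance_cm(pulse_seconds):
    """Convert an echo pulse width to a one-way distance in cm."""
    return pulse_seconds * SPEED_OF_SOUND_CM_S / 2

def make_reading_item(session_id, timestamp, pulse_seconds):
    """Shape one data point the way it could be stored in DynamoDB
    (typed attribute values, keyed by session id and timestamp)."""
    return {
        "session_id": {"S": session_id},
        "timestamp": {"N": str(timestamp)},
        "distance_cm": {"N": f"{echo_to_distance_cm(pulse_seconds):.1f}"},
    }

# On the Pi, pulse_seconds would come from timing GPIO edges on the
# sensor's echo pin, and each item would be written with boto3, e.g.
# client.put_item(TableName="insighted-readings", Item=item)
# (table name assumed for illustration).
print(echo_to_distance_cm(0.002))  # a 2 ms echo -> 34.3 cm
```

Storing raw per-reading items rather than pre-aggregated session summaries keeps the later trend analysis flexible.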

Challenges I ran into

Amazon AWS was confusing despite my somewhat extensive experience administering my own virtual servers without problems. I also only rolled in a few hours into hacking, because of an exam that was annoyingly moved to dinnertime (and the conflict was on a Saturday!).

Accomplishments that I'm proud of

The hardware worked, for the most part.

What I learned

Some DynamoDB

What's next for Insighted

A better user interface, especially for data visualisation. But really, I hope mobile devices will start shipping with better proximity sensors, allowing uses beyond merely detecting that something is touching the screen.
