On Friday night, we brainstormed until 1 am, at which point Jordan jokingly said, "What if you could hear graphs?" At first, the concept seemed absurd, but after some further deliberation, we were able to look past the objective silliness of the idea. We decided that this project would not only be super fun to make but would also have a real use case for people who are vision-impaired or who learn best through auditory input.
What it does
AAAAAAAAustralia provides both a graphical and an auditory way of interpreting data. Using Node.JS, we collect data and display it not only as a graph but also as sound! The ability to hear graphs is not only a fun prospect but can also help people who cannot see or who learn best through auditory input.
How I built it
The core technologies used are React, Tone.JS, and Nivo. React is responsible for the front end of our web page, Tone.JS produces audio based on the graph, and Nivo renders the graphs that display alongside the audio.
Challenges I ran into
Over the past three days, we ran into significant hurdles creating our front end, our back end, and everything in between! Our first obstacle was figuring out how to represent the data as sound. After a long whiteboard session, we decided to map each data point's frequency to its percent deviation from the mean. We also had difficulty creating the web page itself, since the team members responsible for web dev knew little HTML or CSS.
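The percent-from-the-mean mapping can be sketched roughly like this. Note that the base frequency, the scale factor, and the function name here are illustrative placeholders, not the exact values or names from our implementation:

```javascript
// Map each data point to a tone frequency based on its
// percent deviation from the mean of the series.
// BASE_FREQ and SCALE are illustrative constants, not the
// project's exact tuning.
const BASE_FREQ = 440; // pitch (Hz) for a point sitting exactly at the mean
const SCALE = 2;       // Hz of pitch shift per percent of deviation

function toFrequencies(values) {
  const mean = values.reduce((sum, v) => sum + v, 0) / values.length;
  return values.map((v) => {
    const percentFromMean = ((v - mean) / mean) * 100;
    return BASE_FREQ + SCALE * percentFromMean;
  });
}

// The middle value equals the mean, so it plays the base pitch.
console.log(toFrequencies([50, 100, 150])); // → [ 340, 440, 540 ]
```

Each resulting frequency could then be handed to a Tone.JS synth to trigger one note per data point, so rising data literally sounds like a rising pitch.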
Accomplishments that I'm proud of
Over the past three days, we were able to create three core components: the website, the graphing system, and the graph-based audio.
What I learned
What's next for AAAAAAAAustralia
The next steps for this project include adding more charts, allowing the user to input their voice for audio output, and graphing based on audio.