After more than a year of hardship caused by COVID-19, in 2021 Boston's Acting Mayor Kim Janey unveiled the “Joy Agenda,” an initiative that includes public art to help heal the city. It treats joy as a tool that lets us process challenging experiences in ways that bring hope and strength to our communities.
Before COVID, a public art project called Street Pianos Boston brought an outpouring of joy. It placed 60 street pianos, decorated by local artists and community groups, all around the city, available for anyone to play and enjoy. Children and adults alike could make their first attempts at playing in public or rekindle a skill from years past.
While many of the pianos remain, with COVID rampant it's not advisable to play public pianos right now. So how do we add more music to our lives? We did so by creating a platform where you can go and create music with friends, AI, and anyone else who's artistically inclined or curious.
What it does
We created a web platform where you enter your (user)name and a room ID to create music in. If others join the same room ID, you can create music together. There is always an option to add AI as your musical accompaniment.
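At its core, the room mechanic can be pictured as a map from room IDs to the users connected to them. The sketch below is a simplified in-memory model for illustration only; the names (`rooms`, `joinRoom`) are assumptions, and the real app wires this up through sockets.

```javascript
// Simplified in-memory model of rooms (illustrative names, not the
// actual implementation): each room ID maps to the set of users in it.
const rooms = new Map();

function joinRoom(roomId, username) {
  if (!rooms.has(roomId)) rooms.set(roomId, new Set());
  rooms.get(roomId).add(username);
  // Return everyone currently in the room, e.g. to broadcast a join.
  return [...rooms.get(roomId)];
}
```

Anyone who enters the same room ID lands in the same set, which is what makes collaborative play possible.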
You create music by clicking anywhere on the screen. Based on your mouse's vertical position, a musical note appears at the right margin of the screen and moves toward the left. Once it hits the left margin, the note is played and then disappears from the canvas. The note's pitch is determined by where it sits on the staff. Additionally, you can control the duration of the note played (e.g. half note, quarter note, eighth note) by holding down the mouse for a longer time. You will see the notes you create and play, and you will also see notes created by other players on your client.
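The two mappings described above can be sketched as pure functions. Everything here is an assumption for illustration: the note table, band layout, and hold-time thresholds (`STAFF_NOTES`, `yToNote`, `holdToDuration`) are hypothetical names and values, not the project's actual code.

```javascript
// Hypothetical note table: one octave of C major with frequencies in Hz.
const STAFF_NOTES = [
  { name: "C4", freq: 261.63 },
  { name: "D4", freq: 293.66 },
  { name: "E4", freq: 329.63 },
  { name: "F4", freq: 349.23 },
  { name: "G4", freq: 392.0 },
  { name: "A4", freq: 440.0 },
  { name: "B4", freq: 493.88 },
  { name: "C5", freq: 523.25 },
];

// Divide the canvas height into equal bands, one per note; higher on
// screen (smaller y) means higher pitch.
function yToNote(y, canvasHeight) {
  const band = canvasHeight / STAFF_NOTES.length;
  let index = STAFF_NOTES.length - 1 - Math.floor(y / band);
  index = Math.max(0, Math.min(STAFF_NOTES.length - 1, index));
  return STAFF_NOTES[index];
}

// Quantize how long the mouse was held into a note duration
// (thresholds in ms are made up for this sketch).
function holdToDuration(holdMs) {
  if (holdMs >= 800) return "half";
  if (holdMs >= 400) return "quarter";
  return "eighth";
}
```

A click near the top of an 800-pixel canvas would map to C5, one near the bottom to C4, and a half-second press would yield a quarter note.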
How we built it
Challenges we ran into
We started out as a team of 5 and ended up as a team of 3. Two members were present on the first night of the project, when we discussed how to facilitate real-time communication. On the second day, however, they did not communicate that they were no longer hacking with the team, which delayed our progress.
By late Saturday afternoon, we decided we had given them enough time to respond if they were going to contribute. We then redistributed the work: one person would create the music app and start implementing socket logic, one person would run with it, create the rooms where people can collaborate, and finesse the UI, and the final team member would work on implementing AI accompaniment.
Accomplishments that we're proud of
We pulled together as a team to create a hack that brings joy!
What we learned
We learned a whole lot about drawing on HTML Canvas, ensuring that what's drawn can be scaled and rescaled when window size changes, translating frequencies into pleasant sounds (gainNodes and oscillators!), using sockets to connect clients on one platform, and audioGANs.
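One of those lessons, scale-independent drawing, can be illustrated with a small sketch: store note positions as fractions of the canvas and convert to pixels only at draw time, so a window resize changes the conversion rather than the stored state. The function names (`toPixels`, `toFractions`) are illustrative, not the actual implementation.

```javascript
// Convert a note stored in normalized [0, 1] coordinates to pixel
// coordinates for the current canvas size.
function toPixels(note, canvas) {
  return {
    x: note.xFrac * canvas.width,
    y: note.yFrac * canvas.height,
  };
}

// Convert a raw click position into normalized coordinates so the
// stored state is independent of the window size.
function toFractions(x, y, canvas) {
  return { xFrac: x / canvas.width, yFrac: y / canvas.height };
}
```

With this scheme, a note drawn at mid-screen stays at mid-screen after a resize, because only the pixel conversion changes.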
We stretched ourselves in this hack: for the team lead who came up with the idea, this was their first hack creating music and their second attempt at implementing real-time communication. Implementing AI, a musical one at that, is tricky and a gamble for a weekend project.
What's next for Code2Joy
First, we found that we also have to consider scaling horizontally, not just vertically. Notes travel at a fixed rate in pixels, so on a narrower screen a note has fewer pixels to traverse and plays earlier than the same note on a wider screen. As it stands, playing in sync requires everyone to have the same screen width.
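One possible fix, sketched below under assumed names and values (`TRAVEL_MS`, `noteX`), is to give every note the same travel *time* rather than the same pixel speed: track a note's position as a fraction of the screen width, so every client agrees on when it reaches the left edge even though the pixel speed differs.

```javascript
// Assumed constant: every note takes 4 seconds to cross the screen,
// regardless of how wide that screen is in pixels.
const TRAVEL_MS = 4000;

// Note position as a fraction of the screen width, given the time
// elapsed since it spawned at the right edge (fraction 1).
function noteXFraction(elapsedMs) {
  return Math.max(0, 1 - elapsedMs / TRAVEL_MS);
}

// Per-client pixel position: wide and narrow screens now agree on
// *when* the note reaches x = 0; only the pixel speed differs.
function noteX(elapsedMs, screenWidth) {
  return noteXFraction(elapsedMs) * screenWidth;
}
```

Halfway through its travel time, the note sits at mid-screen on every client, so the note fires at the same moment everywhere.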
We'd then create a user-friendly interface to allow people to jam with friends, AI, and anyone else who's artistically inclined or curious. We could then promote the project by encouraging organizations to use it for de-stressing or even team building.
How to contact us
[Hacker] Anita Yip
[Hacker] Ebtesam
[Hacker] muntaser