In our initial brainstorming session, Ansh shared his experience using Snapchat heat maps and how they let you see what's happening in different areas across the globe. Given the current situation in the world, Ansh proposed that we build a similar project to Snapchat heat maps, where we display the stories of those affected by COVID-19. The goal of this project was to memorialize those affected by COVID-19 by sharing pieces of their lives with the world.

The ongoing COVID-19 pandemic has resulted in a total of 368,711 deaths worldwide, with more people dying from the virus every day. So much is going on in the world, and in times like this, people come together to support each other, to remember those lost, and to cherish their memories. With our project, we wanted to put faces and stories to those numbers.

What it does

We have created two forms of media to display a map of all those who have lost their lives to COVID-19: a mobile application and a website. Both the application and the website let users scroll across the United States and read short bios of people who have passed away. We have already populated the map with initial data, but users who want to include pieces from loved ones lost to this virus can submit entries through either one.

How I built it


The website was built using custom CSS/HTML/JS for the front-end experience. We embedded the Google Form as an iframe within a separate page and styled it to mesh with the theme of the site. The maps page was built using the Google Maps JavaScript API, AJAX, JavaScript, and the Google Sheets API.

The maps page first loads the Google Maps SDK and, once it is ready, calls our custom function. This function queries the Google Sheets API via AJAX and gets the data in JSON format, using an API key that is set up to only accept requests from our domain. Upon successful completion of the request, a callback is triggered that parses the data and converts it into a global location object with all of the relevant properties from the spreadsheet's columns. Once this object is built, we pass it to our marker-creation function, which sets the bounding box to contain all of the pins (currently just the USA). After this is done, we create an InfoWindow object that displays the contents when someone clicks on a pin: an onClick handler registered with the Google Maps library reads the relevant properties from the location object and formats them into an info box showing all of the information about that person.
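The Sheets-to-locations step above can be sketched roughly as follows. This is a minimal illustration, not our actual code: the column names and the shape of the parsed response are assumptions, and the real page fetches the JSON via AJAX rather than using an inline object.

```javascript
// Convert a Sheets API "values" array (first row = headers) into an array
// of plain location objects keyed by column name.
function parseSheetValues(values) {
  const [headers, ...rows] = values;
  return rows.map((row) => {
    const entry = {};
    headers.forEach((key, i) => { entry[key] = row[i]; });
    return entry;
  });
}

// Truncated example of what a spreadsheets.values.get response might look like:
const response = {
  values: [
    ["initials", "age", "city", "lat", "lng", "message"],
    ["J.D.", "67", "New York", "40.71", "-74.00", "A beloved teacher."],
  ],
};

// Each parsed entry then has everything needed to place a pin and fill
// an InfoWindow when it is clicked.
const locations = parseSheetValues(response.values);
```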

Mobile application

@anshgwash - The Maps SDK made it fairly easy to add markers to the map. The challenging part was getting the information, such as name, location, message, and age, from the Google Sheet to the Android app. I used a script to get the JSON file from the Google Sheet, fetched using Volley for Android. Once the JSON file was parsed, I could use the data from the Google Sheet to add markers to the map, each with an info window.

Data aggregation

The data we collected initially is based on a New York Times article that profiled nearly 1,000 people across the United States who lost their lives to COVID-19 [1].

Normalizing Data

I, Macarthur Inbody/133794m3r, handled the back-end items. The data was all over the place in terms of formatting: some entries didn't have states listed, some just had cities. All of it had to be normalized into a consistent format so that it could be imported into our Google Sheet. The first step was writing a large series of sed scripts to go through and parse and clean up the data. Then I wrote a Python script that took that data, added the extra fields as needed, and reorganized them to be in line with how the Google Form was set up. Once this was done, I added timestamps so that the fields would be unique, and I reduced each entry's name to initials so there was less PII. I then used awk, sort, sed, and some more regex to sort the fields by the timestamps generated for each row. Once this was done, the data was imported into Google Sheets.
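The per-record part of that normalization can be sketched like this. The actual pipeline used sed, awk, and Python; this is a JavaScript illustration of the same idea, and the record fields (`name`, `city`, `state`) are assumptions.

```javascript
// Normalize one messy record: reduce the name to initials (less PII),
// fill missing fields with empty strings, and attach a unique timestamp.
function normalizeRecord(record, index) {
  const initials = record.name
    .split(/\s+/)
    .map((part) => part[0].toUpperCase() + ".")
    .join("");
  return {
    initials,
    city: record.city || "",
    state: record.state || "",
    // Offsetting by the index keeps timestamps strictly increasing, which
    // gives the later sort a stable, unique key for every row.
    timestamp: Date.now() + index,
  };
}

const raw = [
  { name: "Jane Doe", city: "Chicago", state: "IL" },
  { name: "John Q Public", city: "Austin" }, // state missing in the source data
];
const normalized = raw.map(normalizeRecord);
```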

Filling in the missing links

Once the data was in the Google Sheet, I had to create a new column that contained the entire address. After that column was created, I wrote a Google Apps Script that runs on each form submission, concatenates the three address fields into a single column, and returns that data to the update function. Once it had the full addresses, it went through the spreadsheet, looked up the latitude and longitude of each address, and wrote them to the respective columns. This was done by querying the Google Maps Geocoding API with my API key. The result came back as a JSON string, which was parsed and the relevant data extracted.
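The concatenation step can be sketched as below. The field names are assumptions based on the description; the real script runs inside Apps Script against spreadsheet rows rather than plain objects.

```javascript
// Join the address fields into one string, dropping blanks so that a
// country-only submission never produces "null, null, USA"-style output.
function concatenateAddress(row) {
  return [row.city, row.state, row.country]
    .filter((part) => part && part.trim() !== "")
    .join(", ");
}

const full = concatenateAddress({ city: "Boston", state: "MA", country: "USA" });
const countryOnly = concatenateAddress({ city: "", state: "", country: "USA" });
```

The joined string is then what gets sent to the geocoder, so keeping it free of empty fragments directly improves lookup accuracy.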

Automating the system

The Google Form was linked to a Google Spreadsheet so that I could install a trigger on form submit to go through and update/re-normalize the data in case something had happened since the previous trigger, such as someone deleting a column's contents. All of the rows' relevant data is read from the sheet in one batch to reduce I/O overhead. The trigger runs the "concatenate_addresses" script, which reads through each row and checks whether the address column is blank. If it is, the script concatenates the address fields, taking special care in case the user submitted only their country, so there are no "null" entries after the cells are joined into a string. It then queries the Maps Geocoding API and returns the geo information for the target address. After receiving this information, it builds an object that is written back to the specified cells so that we are doing batch operations. If a row's cells already have a latitude and longitude, the script skips over them, making no API call and no write, and simply moves to the next row. Once this is all done, the script exits, and the data in the sheet is updated and ready to be consumed by any endpoint.
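The skip-if-already-geocoded pass can be sketched as follows. This is an illustration of the batching logic only: `geocode` stands in for the real Maps Geocoding call, and the row shape is an assumption.

```javascript
// One batch pass over the rows: geocode only rows missing coordinates,
// leave already-geocoded rows untouched, and report how many API calls
// were actually made.
function updateRows(rows, geocode) {
  let apiCalls = 0;
  const updated = rows.map((row) => {
    // Rows that already have a latitude and longitude are skipped entirely:
    // no API call, no write.
    if (row.lat !== "" && row.lng !== "") return row;
    apiCalls += 1;
    const { lat, lng } = geocode(row.address);
    return { ...row, lat, lng };
  });
  return { updated, apiCalls };
}

// Stub geocoder used for illustration in place of the real API.
const fakeGeocode = () => ({ lat: "41.88", lng: "-87.63" });
const rows = [
  { address: "Chicago, IL, USA", lat: "", lng: "" },
  { address: "Boston, MA, USA", lat: "42.36", lng: "-71.06" },
];
const result = updateRows(rows, fakeGeocode);
```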


The form also makes sure that the user doesn't enter an age below 18, so that we can avoid having to deal with COPPA and similar issues. It also validates the user's input, so that when a submission lands in our spreadsheet we know the data is at least somewhat reliable before we query using it.

Challenges I ran into

The Google add-on that most people use would not trigger on form submission, so we couldn't use it; that required me to write the whole thing from scratch and learn all of the APIs along the way. We also had to normalize the data, as none of it was in a standard format that could be easily parsed. Once the data was normalized, we had to write triggers on the sheet to keep the data in a known-good state at all times, so that anyone consuming it could rely on it being complete.

Accomplishments that I'm proud of


I went into this weekend not strongly versed in web scripting; I only had a little experience with HTML from high school. This weekend I decided to learn about and use some interesting JavaScript libraries, building smaller programs along the way. Although I was not able to use some of the material I made, given time constraints and technical issues, I am very proud of using external JavaScript libraries to execute interesting mini projects.


Before this hackathon, my primary programming language was C++, which I used for my coding assignments in college. I had little to no experience with HTML from high school. I now have experience writing HTML in Notepad++, which works with all kinds of programming languages. I created the website and later designed it, adding a navigation bar and making text scroll across the screen. I learned to change the background color and add a background image. This was an amazing experience for me.


Before this weekend, the only programming experience I had was through an intro to Java class. It taught me the concepts of coding, but I hadn't really gotten to apply them much. I developed the Android app for our project, having never used Android Studio before this.


Getting the whole thing to work, to be honest: parsing over 600 records and converting them to a normalized format; creating a Google Form that validates users' information so we know it's in a good state; writing the Apps Script and learning the APIs on the fly as we went along; having the whole thing orchestrate itself; installing triggers to make it all as automated as possible. It was certainly a wild ride.

What I learned


I learned quite a lot from online resources, MLH mentors and participants, and my own group. I learned how to use external JavaScript libraries, discovered interesting new aspects of GitHub (hosting domains using CNAME and DNS!), and really learned the most about how well I do under pressure (not very well). This hackathon has taught me a lot, whether new technical skills from tutorials and mentors or soft skills from interacting with my team.


I learned about creating a website in HTML. I went through tutorials on making our website look nice using HTML in Notepad++. Then I started experimenting with the design: changing the background image, changing the color of the text, making text move around the screen, adjusting text colors for readability against a colored background, and making a navigation bar to hold the internal links on our homepage. Since this was my first time researching techniques for designing and building a website in HTML, I feel like I learned a lot during this hackathon.


I learned a great deal about APIs and how they function to make software possible. I had to go through numerous tutorials and a lot of documentation to get the app to work, and I'm glad that it finally did. Trial and error, coupled with my background in Java, really helped me. This was also my first time working with GitHub repositories, where I feel like I have a lot more to learn.


The Google Maps API, Google Apps Script, using Google Sheets as a database, and the Google Maps JavaScript API. It was a lot of API learning and making sure that everything talked to everything else, since I was the back-end guy. I was responsible for making sure the front-end guys could rely on my data, so I had to double- and triple-check my code to make sure it always kept everything in a good state.

What's next for tales from covid?

The next big thing would probably be caching the data: making the web page cache the results to a localStorage string, along with a second value recording when the data was fetched. If less than 30 seconds have passed since then, the page wouldn't call the AJAX code and would instead read the cached copy and build the map from the location data that's already been parsed. When it does update, it would write the data to the cache and reset the timer. The app would do a similar thing, caching the results to avoid needless API hits and wasted data.
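A minimal sketch of that proposed caching layer might look like this. The 30-second TTL comes from the description above; everything else (key names, the injectable storage object) is an assumption. In the browser, the real `localStorage` would be passed in as the storage object.

```javascript
const TTL_MS = 30 * 1000; // proposed freshness window

// Return cached locations if they are fresh enough; otherwise fetch,
// refresh the cache, and reset the timer. `storage` only needs getItem
// and setItem, so localStorage or any stand-in works.
function getLocations(storage, fetchFresh, now) {
  const cachedAt = Number(storage.getItem("locations_fetched_at") || 0);
  const cached = storage.getItem("locations");
  if (cached && now - cachedAt < TTL_MS) {
    return JSON.parse(cached);
  }
  const fresh = fetchFresh();
  storage.setItem("locations", JSON.stringify(fresh));
  storage.setItem("locations_fetched_at", String(now));
  return fresh;
}

// Example usage with a Map-backed stand-in for localStorage:
const store = new Map();
const storage = {
  getItem: (k) => (store.has(k) ? store.get(k) : null),
  setItem: (k, v) => store.set(k, v),
};
let fetches = 0;
const fetchFresh = () => { fetches += 1; return [{ initials: "J.D." }]; };

const first = getLocations(storage, fetchFresh, 1000);   // cache miss: fetches
const second = getLocations(storage, fetchFresh, 2000);  // within TTL: cached
const third = getLocations(storage, fetchFresh, 40000);  // stale: refetches
```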


