Submission for Productivity Hack Track


Having volunteered with the Antietam-Conococheague Watershed Alliance for about 12 years, we have seen that community service achieves the best results when those results can be clearly visualized for everyone to see. But citizen science faces challenges in managing its decentralized volunteer force and in juggling the many different forms of data those volunteers produce. Over the last three years, we have watched countless hours go into managing this data in hand-made spreadsheets and Word documents imported into a rigid Google Maps structure. MapHub simplifies the process: volunteers enter their data into a simplified, standardized form that is piped directly into the database and linked straight to spatial analysis services - making data collection easier, results visualization instant, and your hassles few.

What it does

MapHub provides the forms volunteers use to submit and manage stream temperature logger data, then maps that data live on the dashboard. A custom API handles interaction with the database - adding data, editing and removing rows, and converting tables for export. The data is then converted to a geocoordinated file, hosted for automatic import into ArcGIS, the premier GIS analytics service.
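The CRUD portion of the API could be sketched roughly as below. This is a hedged illustration, not MapHub's actual code: the route paths, field names, and in-memory store are assumptions standing in for the real Flask app and PostgreSQL table.

```python
# Illustrative sketch of MapHub-style CRUD endpoints in Flask.
# Routes and fields are hypothetical; a dict stands in for PostgreSQL.
from flask import Flask, jsonify, request

app = Flask(__name__)
readings = {}   # in-memory stand-in for the database table
next_id = 1

@app.route("/api/readings", methods=["POST"])
def add_reading():
    """Accept a volunteer's form submission and store it as a new row."""
    global next_id
    data = request.get_json()
    row = {"id": next_id, "logger": data["logger"],
           "temp_c": data["temp_c"], "lat": data["lat"], "lon": data["lon"]}
    readings[next_id] = row
    next_id += 1
    return jsonify(row), 201

@app.route("/api/readings/<int:row_id>", methods=["DELETE"])
def delete_reading(row_id):
    """Remove a row, e.g. a mistaken submission."""
    readings.pop(row_id, None)
    return "", 204
```

Editing rows and exporting table conversions would follow the same pattern with `PUT`/`GET` routes.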

How we built it

The website and API are built with Flask, using Leaflet.js as an interactive map for automated coordinate collection. Both are hosted on Computer Science House's OKD cluster via Dockerfile, as is the PostgreSQL database. We built a table-to-GeoJSON function from scratch (see site/data.geojson) that ArcGIS queries each time a user opens the dashboard.
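A table-to-GeoJSON conversion of the kind described might look like the following. This is a minimal sketch under assumed column names (`lat`, `lon`, plus arbitrary attribute columns), not the project's actual function.

```python
import json

def rows_to_geojson(rows):
    """Convert database rows (dicts with 'lat'/'lon' keys) into a
    GeoJSON FeatureCollection suitable for an ArcGIS feature layer."""
    features = []
    for row in rows:
        # Everything except the coordinates becomes feature properties.
        props = {k: v for k, v in row.items() if k not in ("lat", "lon")}
        features.append({
            "type": "Feature",
            "geometry": {
                "type": "Point",
                # GeoJSON orders coordinates [longitude, latitude].
                "coordinates": [row["lon"], row["lat"]],
            },
            "properties": props,
        })
    return {"type": "FeatureCollection", "features": features}

rows = [{"logger": "A1", "temp_c": 14.2, "lat": 39.61, "lon": -77.72}]
print(json.dumps(rows_to_geojson(rows), indent=2))
```

Serving the serialized result at a fixed URL is what lets ArcGIS re-query it on every dashboard load.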

Challenges we ran into

The ArcGIS API was easily the most painful part of this hackathon. Our initial plan was to use the API to update the feature layer and re-execute analysis steps, converting the points into a strong analytical model. We were hampered by service permission settings and by logging in through the API (difficult because our account authenticates via a tunnel to RIT's sign-in). Ultimately, we scrapped the multilayer analysis we had intended in favor of a static map that queries our web server for a translated database, working around the API's issues. Additional layers were added, but they remain static until a user executes the reconstruction process.

Accomplishments that we're proud of

Creating our interactive maps for coordinate collection was an unexpected achievement of the project. It resolved a major concern that the UX would be too complex for MapHub to work as a standalone service. Before this feature, users had to rely on third-party applications to find their coordinates and then manually enter them into MapHub. With the interactive maps, users can find where they are and adjust the target location with a single click.

What we learned

In creating MapHub, we learned how to use Leaflet.js and had to study SQLAlchemy syntax carefully. We also learned how large APIs handle special authorization requirements, which solidified our understanding of the inner workings of spatial analysis - how data points become binned visualizations, and how intensive the steps in between are.
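The SQLAlchemy side of a project like this can be sketched as a declarative model. Column names and the SQLite stand-in are assumptions for illustration; MapHub's real schema and its PostgreSQL connection string would differ.

```python
# Hedged sketch of a temperature-logger table in SQLAlchemy's
# declarative style; columns are hypothetical, and an in-memory
# SQLite engine stands in for the real PostgreSQL database.
from sqlalchemy import Column, Float, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Reading(Base):
    __tablename__ = "readings"
    id = Column(Integer, primary_key=True)
    logger = Column(String, nullable=False)  # logger identifier
    temp_c = Column(Float)                   # recorded temperature, Celsius
    lat = Column(Float)                      # latitude from the map click
    lon = Column(Float)                      # longitude from the map click

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(Reading(logger="A1", temp_c=14.2, lat=39.61, lon=-77.72))
    session.commit()
```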

What's next for MapHub

With more time, we would have liked to create separate user classes for data security. The current version allows anybody to add, edit, and delete data from any logger; we would have preferred to establish logger ownership and administrative roles, better preparing the application for production against actors with less-than-benevolent intentions.
Additionally, the live feature layer by itself is not enough. MapHub could be further integrated with ArcGIS services to schedule updates to analysis layers, moving past the analytical limits of a single point-based feature layer.
