Air pollution is a silent, invisible killer, with over 4.2 million people dying each year from ambient (outdoor) pollution alone link. While you can see overflowing junkyards, watch forests being cut down, and taste polluted water, many of the harsher pollutants in the air bypass the one defense mechanism we have for them: smell. Further, like so many other avenues of sustainability, the effort made to address air pollution has been insufficient for the needed change. Large cities like Atlanta do have sensors that rate the air quality on an index, but with only 5-6 of them for the entire metro area, how much value does a "Good" air rating provide when you spend the rest of your day near the airport and industrial buildings?

The technology to map an entire city with networked PM2.5 sensors (the most scrutinized particle size for air pollution), plus other atmospheric metrics, is not only out there, but is both available on Amazon and cheap enough for even a college student to build a few units (personal experience). This increased detail of our air quality provides two huge points of value:

1. Personal Use/Internalization: As people go about their day-to-day lives, they can see exactly where in their environment the air is cleaner or dirtier, and make informed decisions from that. Especially for those in higher-risk groups, being able to make daily choices about how much air pollution you are exposed to surely presents long-term benefits. Additionally, I think many people find it hard to internalize the direct impacts of global sustainability problems, so giving an actual number for the air quality where someone is standing provides a lot of personal insight. At my university, I have been part of a deep-learning research project building models to forecast roadway pavement condition. While investigating architectures for that research, I found papers such as this link that use similar architectures and inspired me to apply what I learned here.

2. Accountability: I think this second point is the more important one for making long-term changes. Highly detailed spatiotemporal data on air pollution provides the resources to identify the sources of our pollution in a non-invasive, external manner. The most direct example would be quantifying the direct impact of typical major air pollution producers, like power plants and factories. A broader implication is understanding how different features of cities affect air quality, such as the value of parks and other greenspaces, as well as raising the value of areas that have less air pollution.

What it does

This project takes in data from an IoT device that collects a PM2.5 reading along with additional atmospheric metrics. That telemetry is then combined with geospatial data from the Azure Maps Weather service. From the telemetry, it creates two heatmaps using Azure Maps: one displaying the current conditions, and another displaying the conditions forecasted one hour ahead by feeding the combined dataset into the machine learning model.
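The combination step can be sketched as a small pure function that merges one device reading with a weather response into a single record for the model. All field names below are illustrative assumptions, not the project's actual telemetry schema or the exact Azure Maps response shape:

```javascript
// Sketch: merge one sensor-node telemetry reading with a current-conditions
// response from the Azure Maps Weather service into a single feature record
// for the one-hour-ahead forecasting model. Field names are assumptions.
function buildFeatureRecord(telemetry, weather) {
  return {
    deviceId: telemetry.deviceId,
    timestamp: telemetry.timestamp,
    lat: telemetry.lat,
    lon: telemetry.lon,
    // Readings from the sensor node itself
    pm25: telemetry.pm25,
    temperatureC: telemetry.temperatureC,
    humidityPct: telemetry.humidityPct,
    // Readings pulled from the weather service for the same location
    windSpeedKmh: weather.wind.speed.value,
    windDirectionDeg: weather.wind.direction.degrees,
    pressureMb: weather.pressure.value,
  };
}

// Example usage with made-up values
const record = buildFeatureRecord(
  { deviceId: "node-01", timestamp: "2021-05-01T12:00:00Z",
    lat: 33.749, lon: -84.388, pm25: 14.2, temperatureC: 24.5, humidityPct: 61 },
  { wind: { speed: { value: 9.3 }, direction: { degrees: 210 } },
    pressure: { value: 1013.2 } }
);
```

Keeping this step as a pure function makes it easy to run the same shaping logic both when storing history for training and when scoring the live forecast.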

How we built it

The IoT device was built around the ESP8266 platform. Although LoRaWAN would technically be better suited to this application, it was too cost-prohibitive for building multiple project iterations.

The main IoT platform for this project was Azure IoT Hub, but many other parts of Azure were used to bring this project to life. Azure Functions enabled a lot of the interconnectivity between the various services I was using, so I became pretty familiar with them. For the visualization in particular, Azure Maps was used extensively.
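For the visualization side, the core task is turning a batch of telemetry records into GeoJSON that an Azure Maps data source can feed into a heat map layer. A minimal sketch, assuming illustrative record fields (only `HeatMapLayer` itself is a real Azure Maps class; the rest is hypothetical glue):

```javascript
// Sketch: convert telemetry records into a GeoJSON FeatureCollection,
// carrying the PM2.5 reading as a property so the heat map layer can
// weight each point by it. Record field names are assumptions.
function toHeatmapGeoJSON(records) {
  return {
    type: "FeatureCollection",
    features: records.map((r) => ({
      type: "Feature",
      geometry: { type: "Point", coordinates: [r.lon, r.lat] }, // GeoJSON order is [lon, lat]
      properties: { pm25: r.pm25 },
    })),
  };
}

// Example usage with two made-up readings around Atlanta
const collection = toHeatmapGeoJSON([
  { lat: 33.749, lon: -84.388, pm25: 14.2 },
  { lat: 33.640, lon: -84.427, pm25: 38.7 },
]);

// On the web map, rendering would look roughly like this (a sketch of the
// Azure Maps Web SDK usage, not the project's exact code):
//   datasource.add(collection);
//   map.layers.add(new atlas.layer.HeatMapLayer(datasource, null, {
//     weight: ["get", "pm25"],
//     radius: 30
//   }));
```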

Challenges I ran into

Since this project covered the full deployment pipeline, I ran into a lot of challenges. For example, getting the IoT SDK working on the ESP8266 took more time than I would have liked due to dependency version problems, and building the data pipeline end to end forced me to really learn some of the deeper aspects of the Azure platform. One of the worst problems was trying to deploy Azure Functions written in Python from VS Code; it went badly enough that I abandoned them entirely and wrote the Functions in JavaScript in the online portal instead.

Accomplishments that I'm proud of

I'm really excited to present this entire prototype since I feel I touched on almost every aspect of an IoT solution. In particular, getting the visualization working gave me a huge boost to keep working on the project, and getting a production-like setup for IoT devices has really inspired me to build smaller projects around my apartment (I'm thinking of an automated garden next).

What we learned

Getting comfortable with IoT cloud solutions and pipelines was the most valuable part, in my opinion. I'm a huge maker/DIYer, so getting familiar with these services has given me a big confidence boost for future projects. Additionally, using serverless Functions was something I had been meaning to explore for some time, so it was really valuable exposure to how they connect other services together.

What's next for AirVizProject

My future goals for this project would be to add connectivity that allows other people to plug in their own devices, along with a tutorial on how to build and connect them. Additionally, improving the quality of some of the backend implementation is a big next step toward making it a more self-sufficient project.
