In September 2015, the White House announced a $160 million smart cities initiative, in which technology would be used to combat key urban challenges such as traffic congestion, energy consumption, and more. In response, General Electric plans to make this possible in the coming years through intelligent LED light sensors built on its Industrial IoT platform, Predix.

While these intelligent LED light sensors address a good chunk of the challenges the White House has pressed for (traffic congestion, parking planning, energy consumption, and city-scale big-data mining), one key point still stands: streetlights and traffic signals are so abundant on city streets that they are capable of far more than simply serving as communal data miners, and we are not taking advantage of that.

Numerous critical concerns remain, such as the lack of access points to emergency services, transportation resources, and more. In theory, these can all be addressed once streetlights and traffic signals are interconnected through the Internet of Things.

So, to tackle these challenges while making city technologies even more interconnected and useful to society, I'd like to introduce my hackathon project, Hearth.

What it does

Hearth is an intelligent city platform that connects modern streetlights, traffic signals, and sensors through the cloud to create a real-time, city-scale social network. It provides city planners with artificially intelligent real-time traffic and land management systems, and gives citizens access to first-class emergency services, transportation resources, and essential city and community information from almost anywhere, with no need for an Internet signal.

By simply tapping your mobile phone on any of the NFC chips to be embedded in these streetlights and traffic signals alongside General Electric's intelligent LED sensors, you'll be able to access a plethora of services and information in your city, or any other city around the world, without any Internet signal at all.

Given that these NFC chips cost less than a dollar per streetlight to purchase and install, Hearth remains extremely feasible, especially with the numerous smart city initiatives underway around the world.

Emergency Services

Many past solutions have attempted to simplify emergency protocols within cities for the sake of public safety, but they disregard one major factor: accessibility. With Hearth, however, users without any Internet or phone carrier signal can simply log in to a streetlight access point with their smartphones and get connected to the closest public safety answering point (PSAP) as soon as possible. Calls are made by tapping into Predix's intelligent LED big-data cloud network, available at any streetlight, to create a Bluetooth call linked from the user's phone to the closest PSAP.
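A minimal sketch of the "closest PSAP" step, assuming the PSAP list (names and coordinates) has already been fetched from the Geo911 API; the field names here are illustrative, not Hearth's actual schema:

```javascript
// Pick the closest public safety answering point to a streetlight by
// great-circle (haversine) distance between lat/lon pairs.
const EARTH_RADIUS_KM = 6371;

function haversineKm(a, b) {
  const rad = (d) => (d * Math.PI) / 180;
  const dLat = rad(b.lat - a.lat);
  const dLon = rad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(rad(a.lat)) * Math.cos(rad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(h));
}

function closestPsap(streetlight, psaps) {
  // Reduce over candidates, keeping whichever is nearest to the streetlight.
  return psaps.reduce((best, p) =>
    haversineKm(streetlight, p) < haversineKm(streetlight, best) ? p : best
  );
}
```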

Upon entering the emergency call screen, an audio clip begins recording in case the user is within the vicinity of an imminent threat; police officials can access it through a QR code generated on Hearth's network. If the user requires an ambulance, then during the call Hearth will access all street cameras provided by Predix's public safety platform and scan for ambulances, notifying both the user and their closest family members and friends of the ambulance's location and the user's situation.

To label and classify ambulances within the street photos, I made use of Andrej Karpathy's NeuralTalk model (he's an awesome machine-learning PhD candidate with lots of interesting reads!).

Traffic Congestion Optimization

Hearth also provides an option to hasten a traffic signal's timing, placed right next to the Emergency Report button and the local community content and transportation resources Hearth provides. It manages this by controlling a streetlight's brightness based on ambience and street conditions, and by controlling a traffic signal's timing through a traffic-flow model proven effective in both London and Copenhagen, Denmark.

Since traffic congestion behaves non-linearly, removing just 20% of the vehicles stuck in traffic can eliminate congestion entirely, according to Jonas Eliasson's TEDx talk. MIT professor Berthold Horn has likewise noted that since traffic congestion can be modelled as a fluid in which vehicles oscillate rather than flow smoothly in a straight path, simply prioritizing vehicle flow over pedestrian flow on streets can significantly reduce congestion.

As a result, Hearth takes a 'vehicle-first' approach to traffic signal timing, using a larger vehicle-to-pedestrian pass-through ratio at intersections depending on the level of traffic throughout the day and the severity of congestion. On important days when an abnormal number of pedestrians pass by, Hearth also takes crowdsourced input into account for hastening a traffic signal, which it learns over time with a deep learning model. To learn the trends of traffic and pedestrian flow, the data is run through a financial trend algorithm by John Ehlers, simplified with the Ramer-Douglas-Peucker algorithm, and then used to train a stacked variational auto-encoder every 10 minutes.
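The Ramer-Douglas-Peucker simplification step can be sketched as follows, assuming each sample is a `{x, y}` point (e.g. time vs. flow); this is the textbook recursive form, not Hearth's exact code:

```javascript
// Distance from point p to the line through a and b (triangle-area formula).
function perpendicularDistance(p, a, b) {
  const dx = b.x - a.x;
  const dy = b.y - a.y;
  const len = Math.hypot(dx, dy);
  if (len === 0) return Math.hypot(p.x - a.x, p.y - a.y);
  return Math.abs(dy * p.x - dx * p.y + b.x * a.y - b.y * a.x) / len;
}

// Ramer-Douglas-Peucker: keep only points deviating from the chord by more
// than epsilon, recursing around the most deviant point.
function rdp(points, epsilon) {
  if (points.length < 3) return points;
  const first = points[0];
  const last = points[points.length - 1];
  let maxDist = 0;
  let index = 0;
  for (let i = 1; i < points.length - 1; i++) {
    const d = perpendicularDistance(points[i], first, last);
    if (d > maxDist) { maxDist = d; index = i; }
  }
  if (maxDist <= epsilon) return [first, last]; // flat enough: drop interior
  // Recurse on both halves, dropping the duplicated split point once.
  return rdp(points.slice(0, index + 1), epsilon)
    .slice(0, -1)
    .concat(rdp(points.slice(index), epsilon));
}
```

Running this on a mostly flat series with one spike keeps only the endpoints and the spike, which is exactly what makes the downstream trend model cheap enough to retrain every 10 minutes.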

Anomalies are also detected via the 68-95-99.7 statistics rule alongside the deep learning model, reporting possible traffic/speeding incidents with screenshots sent to administrators on Hearth's platform.
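As a minimal sketch of the 68-95-99.7 rule on a window of readings (assuming roughly normal data; the threshold and inputs are illustrative):

```javascript
// Flag readings more than `sigmas` standard deviations from the mean.
// Under a normal distribution, ~99.7% of values fall within 3 sigma, so
// anything outside that band is treated as a possible incident.
function detectAnomalies(values, sigmas = 3) {
  const mean = values.reduce((s, v) => s + v, 0) / values.length;
  const variance =
    values.reduce((s, v) => s + (v - mean) ** 2, 0) / values.length;
  const std = Math.sqrt(variance);
  // A reading is anomalous when its |z-score| exceeds the sigma threshold.
  return values.filter((v) => std > 0 && Math.abs(v - mean) / std > sigmas);
}
```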

Streetlight Energy Consumption & Public Safety Optimization

Hearth's streetlight optimization system further enhances public safety and reduces energy consumption throughout the night, on top of General Electric's intelligent LED light sensors, by employing a 'no-lights-on' consensus. Hearth calculates the ideal light throughput for every single streetlight by determining the amount of traffic and pedestrians nearby at any time of day.

Streetlights are only on at night, with time factored in as a quadratic factor from sunset to sunrise, determined via a subset of Pitney Bowes' location intelligence APIs. To improve public safety, maximum light throughput is shone onto any person or vehicle spotted, with less output given as more vehicles or people are within the vicinity, so that incoming traffic can tell that people are around a corner. The more traffic there is, the less light throughput is needed, as vehicles already produce a great deal of light through their headlights at night. This calculated luminosity is then distributed to the streetlights facing the car's or person's direction, significantly reducing energy consumption and improving public safety.
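A minimal sketch of that quadratic night-time factor and traffic damping, with illustrative scaling constants rather than Hearth's tuned values:

```javascript
// Quadratic factor: 0 at sunset and sunrise, peaking at 1 mid-night.
// Times are plain numbers (e.g. epoch milliseconds or hours).
function timeFactor(now, sunset, sunrise) {
  if (now <= sunset || now >= sunrise) return 0; // daytime: lights off
  const t = (now - sunset) / (sunrise - sunset); // normalize night to [0, 1]
  return 4 * t * (1 - t); // parabola peaking at the middle of the night
}

// More nearby vehicles means more headlight output, so the streetlight
// can dim proportionally while staying visible to incoming traffic.
function lightThroughput(now, sunset, sunrise, nearbyVehicles) {
  const trafficDamping = 1 / (1 + nearbyVehicles);
  return timeFactor(now, sunset, sunrise) * trafficDamping;
}
```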


The whole reason I've always sought much more easily accessible emergency services and city resources is that many of my family members and friends have been in emergencies, or have been lost and placed in traumatic situations that could easily have been resolved had the city simply improved access to its resources.

I even recall one time when I had a fever peaking at nearly 40 degrees Celsius, and calling an ambulance when you're 200 m out in the middle of the open, with nothing but empty road and streetlights, was near impossible.

That's why I believe Hearth is a key solution to many of the problems we face today in both developing and developed countries, for local and foreign citizens alike.

How libraries were used

  • Predix's Traffic & Pedestrian Planning APIs were used to aggregate all data about vehicles and pedestrians on the streets. The MongoDB database and custom time-series schema the deep learning model works on were populated through 12 live websocket connections from these APIs. Seeding/migration of the data for initial feature preparation, and the registration of sensors, was done through these APIs' REST interfaces.
  • Predix's Time Series API served as another aggregation point for the Traffic & Pedestrian Planning real-time data, for the sake of feature preparation for the deep learning model. Its least-squares regression and trending modes proved extremely useful for simplifying the dataset, making all the trend algorithms plus Hearth's deep learning model runnable at modest 10-minute intervals.
  • Predix's Public Safety API was used to collect real-time camera feeds from cameras all over Hearth's city network, which were then fed into Andrej Karpathy's NeuralTalk model for real-time image/video classification. This API let Hearth's emergency-response system locate incoming ambulances around designated street points, and lets city planners and police detect traffic/speeding incidents.
  • The Pitney Bowes GeoEnhance API provided geolocation and timezone data so that Hearth would be implementable in any city, beyond the simulated data coming from San Diego.
  • The Pitney Bowes Geo911 API was used to locate the closest PSAP for Hearth's emergency protocol system, including its whereabouts and phone number for Hearth's Bluetooth emergency-response communication system.
  • The Google Maps & Directions APIs were used to provide directions to transportation and retail stores, plus a full map detailing where sensors are located within Hearth's vicinity.
  • Yelp's API served as a source of transportation and retail/commercial/local information for people accessing Hearth's network from streetlights around the city.
  • The Sunrise-Sunset API was used to determine sunset/sunrise times across cities and countries, which feed the quadratic time factor incorporated into Hearth's traffic and pedestrian optimization systems.
  • Crossfilter, DC.js, NVD3.js, and D3.js provided the multi-dimensional big-data visualizations over historical and real-time data from all of Predix Current's real-time data streams. Hacking on DC.js was quite a pain, as these visualizations were typically intended for static data, not real-time data; I had to implement an asynchronous version of the Crossfilter API on top of JavaScript's Web Worker API.
  • Meteor.js and MongoDB power the entire real-time-first web experience for Hearth's network.
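The asynchronous-Crossfilter idea above can be sketched roughly as follows: keep the filtering as a pure function, then ship it into a Web Worker so chart redraws never block on it. The record fields (`speed`, `ts`) and criteria are illustrative, not Hearth's actual schema:

```javascript
// Pure filtering logic: safe to run on the main thread or inside a worker.
function filterRecords(records, { minSpeed, since }) {
  return records.filter((r) => r.speed >= minSpeed && r.ts >= since);
}

// Browser-only wiring: build a Web Worker from a Blob so the heavy filtering
// of streaming sensor records happens off the UI thread.
function makeFilterWorker() {
  const src = `
    ${filterRecords.toString()}
    self.onmessage = (e) => {
      self.postMessage(filterRecords(e.data.records, e.data.criteria));
    };
  `;
  return new Worker(
    URL.createObjectURL(new Blob([src], { type: 'application/javascript' }))
  );
}
```

The caller would then `postMessage` each batch of records and redraw the DC.js charts from the worker's `onmessage` results.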

Challenges I ran into

One of the biggest challenges was having to rewrite the entire Predix API client from Java to JavaScript, since I used MeteorJS as the framework for the whole project. I chose MeteorJS because I wanted to aggregate data, run my own deep learning and anomaly-analysis models, and serve all of Predix's historical and present sensor and analytics data in real time.

I had to test all sorts of algorithms and database architectures/optimization methods that would suit periodic time-series data on MongoDB. At the same time, making it all reactive and real-time for city planners and clients, given Predix's synchronous APIs rather than a pub/sub API, was nonetheless difficult.
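One common shape for periodic time-series data in MongoDB is the bucket pattern: one document per sensor per hour, with readings pushed into an array. A minimal sketch, with illustrative field names (not Hearth's actual schema) and a plain `Map` standing in for the collection:

```javascript
// One bucket per sensor per UTC hour keeps documents small and indexable.
function bucketKey(sensorId, date) {
  const hour = new Date(date);
  hour.setUTCMinutes(0, 0, 0); // truncate to the start of the UTC hour
  return { sensorId, bucketStart: hour.toISOString() };
}

// `buckets` stands in for a MongoDB collection; a real driver would instead
// call updateOne(bucketKey, { $push: { readings: ... } }, { upsert: true }).
function upsertReading(buckets, sensorId, date, value) {
  const key = JSON.stringify(bucketKey(sensorId, date));
  if (!buckets.has(key)) buckets.set(key, []);
  buckets.get(key).push({ t: date.toISOString(), v: value });
  return buckets;
}
```

Bucketing like this keeps write volume per document bounded and makes range queries over an hour a single-document read, which matters when 12 websocket feeds are streaming in at once.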

In addition, there were moments when I considered switching frameworks three days before the submission deadline, despite all the work already done on the deep learning models and trend algorithms, which in hindsight was crazy. I'd like to thank Devpost and Predix so much for the exemption I've described in the Remarks at the end of this post.

Overall, making the video with only 2 hours left before the submission deadline has to count as a bonus too :').

Accomplishments that I'm proud of

I can't believe I managed to pull off this project in the 8 days right after finishing a contract. The biggest accomplishment is that this was my first time using a cloud PaaS and API provider, and it went incredibly well; I learned so much about how security is handled on the cloud side as a result. What's more, this is one of the biggest projects I've done all year, and it was built with futuristic technology aimed at huge societal and infrastructural problems.

I also got to dig a bit deeper into deep learning by utilizing Andrej Karpathy's NeuralTalk, alongside more practical APIs like Pitney Bowes' GeoEnhance and Predix's whole suite. Awesome work, guys :).

Oh, and I also got to leverage work from a previous contract on multi-dimensional big-data visualizations built with Crossfilter. Really proud of that, considering the volume of data at hand.

What I learned

I came to realize that so much of this technology has actually existed for a decade now, yet many cities hold back on implementing it due to obvious financial and time constraints. Many cities also face controversy over the security implications of automating a city, something I encountered myself while talking to my friends throughout this project.

What's more, people I've talked to from different countries always held different opinions, so I decided to bring those views together, research them online, and create a few unique optimization models that enable better access to emergency services and city resources, reduced energy usage, and even reduced traffic congestion.

What's next for Hearth

Obviously, this project is definitely not done yet :). I've done a lot over the past 10 days, but Hearth consists of the three components described in the headings above, and next I'd like to work on Hearth's core.

All testing so far was done with an NFC P2P simulator I built using an STM32 board and the RS-232 protocol, which I'd like to upgrade to an actual NFC P2P chip for a small sum. Sadly, I would have ordered one in time for the competition, but shipping takes a while given that I'm in Japan.

Apart from that, however, I ran out of time before the end of the hackathon to make use of Predix's Parking Planning API and Pitney Bowes' Demographics API. I wanted to further improve the traffic-lane model by providing better traffic maps than Google Maps' traffic feed, suggesting ideal lanes despite peak-hour congestion based on the user's transport destination and health condition, the latter learned through a mobile user-learning SDK called Neura. By queuing users heading toward similar destinations based on the time of day and their personal condition, the entire city's roads would in essence act as freeways prioritized by user condition, mitigating the risk of traffic accidents and horrible traffic signal waiting times.


The server hosting Hearth's MongoDB instance runs on DigitalOcean with permission given directly by the Devpost and Predix competition staff. The site and URL nonetheless proxy and serve the entire Hearth website through Predix's cloud servers.
