Inspiration

More than half of the world’s population now lives in urban areas. By 2050, that figure will have risen to 6.5 billion people, two-thirds of all humanity. Increased urbanisation means that more people will be relying on their local town or city councils for support in resolving problems.

What it does

Tackling eParticipation isn’t a particularly new problem; well-established systems such as 311 already exist. The problem is that these systems have a high barrier to entry: users must download and install a mobile app before they can report anything. That requirement isolates users who are unable to perform this step, and may discourage others who decide it isn't worth the effort.

This is where MyCity will play a vital role in facilitating community participation in the reporting of problems found throughout the community.

Messenger Bot

MyCity is accessible entirely through Facebook Messenger in the form of a bot, and captures all the required data seamlessly using technologies such as machine learning and natural language processing. This means no extra app needs to be downloaded and kept on the individual's phone for the occasions when something needs to be reported.

We have also gamified MyCity to incentivise users to report more issues, using a points-and-levels scoring system.
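As a concrete illustration, the scoring could look something like the sketch below; the point values and level curve are our own assumptions, not the exact numbers used in MyCity:

```javascript
// Hypothetical point values -- illustrative, not MyCity's real numbers.
const POINTS_PER_REPORT = 10;
const POINTS_PER_RESOLVED = 25; // bonus when a reported issue gets fixed

// Total points from a user's activity history.
function pointsFor(events) {
  return events.reduce((total, e) => {
    if (e.type === 'report') return total + POINTS_PER_REPORT;
    if (e.type === 'resolved') return total + POINTS_PER_RESOLVED;
    return total;
  }, 0);
}

// Simple quadratic curve: reaching level n + 1 requires 100 * n^2 points.
function levelFor(points) {
  return Math.floor(Math.sqrt(points / 100)) + 1;
}
```

A curve like this keeps early levels quick to reach while making later ones progressively harder, which is the usual shape for engagement loops.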

Contributions

Reporting Dashboard

The data collected will help city councils identify and act on issues. However, showing all the data at once can be overwhelming. To tackle this, we focus on the main need, “What are the biggest issues now, and where?”, by showing a prioritised list of issues and a map visualisation. Users are also able to see individual reports and notify citizens when something has been done.

Apart from reactive work, we would also like to enable city councils to be proactive in preventing issues and measuring their efforts. Our idea is to provide generated insights, for example whether issues are increasing or decreasing, which issues are currently trending, and how issue handling is progressing.

How we built it

Architecture Diagram

The high-level architecture used to build MyCity can be seen above. The core pieces are:

Elasticsearch

Elasticsearch was chosen for its speed and simplicity when querying large datasets. Because of the large amount of free-text data collected by MyCity, we needed a way to effectively perform simple NLP (Natural Language Processing) to extract keywords from large custom text fields.
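As a sketch of what that can look like in practice, Elasticsearch's significant_text aggregation surfaces terms that are unusually frequent in matching reports; the field name ('description') is an assumption for illustration:

```javascript
// Builds an Elasticsearch query body that matches reports mentioning a
// term and surfaces related keywords via a significant_text aggregation.
// The index layout (a 'description' free-text field) is an assumption.
function keywordQuery(searchTerm) {
  return {
    query: { match: { description: searchTerm } },
    size: 0, // we only want the aggregation, not the matching documents
    aggs: {
      keywords: {
        // Terms unusually frequent in the matching reports compared to
        // the index as a whole.
        significant_text: { field: 'description', size: 10 },
      },
    },
  };
}
```

The returned body would be POSTed to the index's `_search` endpoint.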

Facebook Messenger (Bot)

Facebook Messenger serves as the easy entry point for the community. Our bot handles the four-step process of:

  • Uploading an image
    • Processing this image with object detection and returning a set of helpful tags
  • Prompting the user for more information including their location
  • Confirming if they'd like to receive a follow-up notification when the issue is resolved
  • Displaying the user's contribution stats (points!)
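The steps above can be sketched as a small state machine that the bot's webhook advances on each message; the state names are our illustration, not taken from the actual bot code:

```javascript
// Conversation states mirroring the reporting flow above.
const FLOW = [
  'AWAITING_IMAGE',    // user uploads a photo of the issue
  'AWAITING_DETAILS',  // user adds more information, including location
  'AWAITING_FOLLOW_UP',// user chooses whether to be notified on resolution
  'SHOWING_STATS',     // bot displays contribution points
];

// Advance the conversation one step; after stats, start a fresh report.
function nextState(current) {
  const i = FLOW.indexOf(current);
  return i === -1 || i === FLOW.length - 1 ? FLOW[0] : FLOW[i + 1];
}
```

Keeping the flow as an explicit list makes it easy to add or reorder steps without rewriting the webhook's branching logic.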

React Native Navigation

We used React Native to develop the iOS and Android application for the city agent. Agents can visualise issue locations, select the one they are interested in, and the application will open native navigation to that location.
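Opening native navigation from React Native typically comes down to building a platform-specific maps URL and handing it to Linking.openURL; a minimal sketch, where the exact URL handling is our assumption rather than the app's actual code:

```javascript
// Builds a platform-specific maps URL for an issue's coordinates. In the
// app this would be passed to React Native's Linking.openURL; the schemes
// below are the standard Apple Maps and Android geo-intent forms.
function navigationUrl(platform, lat, lng, label = 'Reported issue') {
  const q = encodeURIComponent(label);
  if (platform === 'ios') {
    // Apple Maps: daddr sets the destination for directions.
    return `maps://?daddr=${lat},${lng}&q=${q}`;
  }
  // Android: a geo: URI opens the user's default maps app.
  return `geo:${lat},${lng}?q=${lat},${lng}(${q})`;
}
```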


Open Data

We created an endpoint to publish the anonymised data we are collecting. The data aligns with the standards set by 311, and the intent is to eventually feed the data we collect into their database.
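A minimal sketch of the anonymisation step, with hypothetical field names: anything identifying the reporter is stripped before a report is published, while the civic data is kept.

```javascript
// Strips reporter-identifying fields before publication. The field names
// (reporterId, reporterName, messengerPsid) are assumptions for
// illustration, not MyCity's actual schema.
function anonymise(report) {
  const { reporterId, reporterName, messengerPsid, ...publicFields } = report;
  return publicFields;
}
```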

Bucket Hosting for Dashboard

The frontend dashboard is hosted out of an AWS S3 bucket due to the static nature of the app. Our frontend is written in ReactJS.

Machine Learning

Our machine learning backend is powered by Google Cloud's Vision AI, which analyses the uploaded image and returns helpful tags for the user to select, reducing the effort required on the user's side.

Challenges we ran into

We ran into some complexity when dealing with object detection. Initially we wanted to use the ResNet-50 model from PyTorch and planned to use it for inference.

Due to time constraints, we had issues getting the model up and running and exposed over an API, so we opted instead to use the Google Cloud Vision AI API.

The Messenger bot provides an image_url for the photo attachment. We first tried sending this URL directly to Google Cloud Vision, but it couldn't extract labels: Facebook serves images through a CDN, so the image URLs change constantly. The solution is to download the image and send its data as base64.
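A sketch of that fix, assuming Node 18+'s global fetch; the request shape follows Vision's images:annotate REST API, with the image bytes embedded as base64 content instead of a URL:

```javascript
// Request body for the Vision images:annotate endpoint, with the image
// embedded as base64 content rather than a URL.
function visionRequest(base64Image) {
  return {
    requests: [{
      image: { content: base64Image }, // the bytes themselves, not a link
      features: [{ type: 'LABEL_DETECTION', maxResults: 10 }],
    }],
  };
}

// Download the Messenger CDN image and encode it (Node 18+ global fetch).
async function visionRequestFromUrl(imageUrl) {
  const res = await fetch(imageUrl);
  const bytes = Buffer.from(await res.arrayBuffer());
  return visionRequest(bytes.toString('base64'));
}
```

Because the bytes travel inside the request, it no longer matters that the CDN URL has expired by the time Vision would have fetched it.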

Accomplishments that we're proud of

We’re really proud that we were able to incorporate so many interesting pieces of technology without losing focus of our goal.

The use of cloud APIs to perform object detection allows us to return meaningful tags that improve the user's experience when describing an issue.

Object Detection Example

What we learned

Lowering the barrier to entry by leveraging existing, widely adopted tools like Messenger can enable massive participation from people of all walks of life and increase the accessibility of services.

Gamifying mundane tasks can also incentivise people to use a tool more often.

What's next for MyCity

Deep integration with the existing 311 systems is an important next step for MyCity. The information we collect from the community using MyCity can be used to further enhance the already existing dataset.

Design Mocks

We've spent time mocking the future design state of MyCity and, based on feedback from user tests, have come up with the following.

Design Mock Up

Navigation App

We're keen to improve the navigation app and build out a better mechanism for scheduling the response and resolution of individual tasks. This might include crowdsourcing some reported tasks (those safe for members of the public to perform) to the general public.
