In 2013, 172 pedestrians were struck and killed in traffic in New York City. Increasingly, pedestrians are looking down at their phones, lost in their music, crossing the street carelessly, and not actively aware of or engaged with their city as they traverse its streets.
At the same time, every intersection has existing infrastructure that could be put to use to inform, engage and interact with these pedestrians: the good old pedestrian signal.
Sentient Streets re-tools existing pedestrian signals across NYC to help engage pedestrians and cyclists, raise awareness of the dangers of traffic, and provide useful safety tips. With Sentient Streets, each pedestrian signal becomes imbued with real human emotions and personality, based on real-time data about the intersection and sentiment analysis, and interacts with people as they walk by. Are there too many jaywalkers? How crowded is the corner? Is it rainy? Bad traffic? The signal is aware of all of this data and interacts accordingly.
How it Works - The Concept
Sentient Streets is a new network of connected pedestrian signals imbued with personality and emotion based on the real-time data of each intersection, which in turn determines how each signal interacts with pedestrians and cyclists passing by. Through our research and user testing, we've learned that giving a pedestrian signal human emotion and a voice (beyond 'walk' / 'don't walk') makes passers-by, particularly those who pass a signal regularly, such as on their morning commute, much more likely to pay attention to its prompts and want to make the signal happy.
In Sentient Streets, each pedestrian signal has a personality and mood generated and updated from several real-time data sources. First, at every intersection cycle, the system takes an image of the intersection and uses basic image processing to estimate parameters like the number of people and cars using the intersection and the number of pedestrians jaywalking. We initially prototyped this data source with Amazon Mechanical Turk: Turkers watched a live feed of the intersection, captured from inside an adjacent building, and quantified what was currently happening there. Second, the system crunches and parses data including real-time weather (Dark Sky API), news (Patch API), and traffic incidents (Bing Maps API).
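As a rough sketch of how the crowdsourced data source could be turned into per-cycle metrics, the snippet below takes several Turkers' counts for the same cycle and reduces them to a single set of numbers via the median, which damps outlier responses. The field names and the use of the median are illustrative assumptions, not the exact scheme used in the prototype.

```javascript
// Take the median of a list of numbers; with several Turkers
// watching the same cycle, this discards stray miscounts.
function median(values) {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

// Combine several Turkers' observations for one intersection
// cycle into a single metrics object, field by field.
function aggregateObservations(observations) {
  const fields = ['pedestrians', 'cars', 'jaywalkers'];
  const metrics = {};
  for (const field of fields) {
    metrics[field] = median(observations.map((o) => o[field]));
  }
  return metrics;
}

// Example: three Turkers watched the same cycle of the live feed.
const metrics = aggregateObservations([
  { pedestrians: 12, cars: 5, jaywalkers: 2 },
  { pedestrians: 14, cars: 6, jaywalkers: 3 },
  { pedestrians: 40, cars: 5, jaywalkers: 2 }, // outlier response
]);
// metrics → { pedestrians: 14, cars: 5, jaywalkers: 2 }
```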
Based on all this real-time data, we then use sentiment analysis to compute a 'current' personality and mood for the signal. There are five general states -- "happy," "attentively upbeat," "a bit down," "sleepy," and "distressed" -- with a deliberate skew towards the happier side of the equation. For example, while the signals are usually upbeat and attentive, too many jaywalkers will make them distressed, and especially bad weather tends to deflate their mood.
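The mood selection described above can be sketched as a small decision function. The specific thresholds and inputs below are hypothetical stand-ins for the tuning done during user testing; the only behaviors taken from the text are the five states, the jaywalker and weather effects, and the default skew toward "happy".

```javascript
// Hypothetical mood selection for one intersection cycle.
// `jaywalkers` and `pedestrians` are counts for the cycle,
// `badWeather` comes from the weather feed, `hour` is local time.
function computeMood({ jaywalkers, pedestrians, badWeather, hour }) {
  if (jaywalkers >= 5) return 'distressed';           // unsafe crossings upset the signal
  if (badWeather) return 'a bit down';                // bad weather deflates its mood
  if (hour >= 23 || hour < 6) return 'sleepy';        // late-night lull
  if (pedestrians >= 10) return 'attentively upbeat'; // busy corner, stay alert
  return 'happy';                                     // deliberate skew toward happy
}

computeMood({ jaywalkers: 1, pedestrians: 4, badWeather: false, hour: 14 });
// → 'happy'
computeMood({ jaywalkers: 7, pedestrians: 20, badWeather: false, hour: 9 });
// → 'distressed'
```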
How it Works - The Interaction
The physical setup includes two screens, which are to-scale replicas of the existing yellow signal boxes on street corners. These screens display messages based on the real-time conditions at the intersection.
As pedestrians and cyclists approach a corner, they can see the pedestrian signal's current emotion on one screen (represented by an emotive face) and personalized feedback for that intersection on the other screen. The signal does more than tell you when it's okay to cross the street — it will say things like "Come on! Keep those eyes on the road and off your cell phone!" (when it's distressed), "I'm so happy to see you being safe!" (when it's happy), and "Be safe! Look left, right and left again" (when it's down).
As pedestrians move through the city each day, they can see how each corner is currently feeling, and become aware of the fact that they can help improve the signal's mood by being safer and not jaywalking. As well, pedestrians can interact with the signal by text messaging it. A textual prompt on the signal post will encourage the pedestrian to "Chat with me!", further emphasizing the human sentiment of the street signal. The signal will respond based on how it currently feels, offering a fun fact about its neighborhood and reminding the New Yorker of a traffic safety tip. For example:
"How are you? I’m OK, although I wish people would stop jaywalking. I'm sure the day will get better though. Did you know that a block away was the first meatpacking plant to open in NYC? Keep your eyes on the traffic and stay safe!"
How it Works - The Technology
For the challenge, we built our own to-scale, working pedestrian signal with custom hardware and software (and installed it at a Broadway intersection for several hours).
For the physical build of the pedestrian signal, we constructed custom LED matrix panels, each powered and updated by an Internet-connected Arduino Yun microcontroller. One panel depicts the street signal's current emotional state and the other panel shows the corresponding text that is communicated to the passer-by.
The build of the signal panels consists of pine, masonite and black plexiglass to make it as similar as possible to a real pedestrian signal.
The Web app is a node.js app that collects real-time data through basic image processing and several hyper-local APIs. We prototyped the intersection's data feed using Mechanical Turk: every few minutes, the app pings the Mechanical Turk community to watch a live feed and send back quantitative data on what they see happening at the intersection (importantly, the camera is set up so faces cannot be detected). The app then combines this data with real-time weather (Dark Sky API), news (Patch API), and traffic incidents (Bing Maps API), and runs sentiment analysis to compute a current emotion: "happy," "attentively upbeat," "a bit down," "sleepy," or "distressed".
When a new emotion is computed, it is automatically updated on both the physical pedestrian signal (via the Web-enabled Arduino Yuns) and on the Sentient Street website (via Web sockets).
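One way to keep the Yun and the website in sync is to build a single JSON payload per update and send the same string over both channels. The field names and face glyphs below are illustrative assumptions; the source only says that one panel shows the face and the other the text.

```javascript
// Illustrative face glyph for each of the five mood states.
const FACES = {
  happy: ':)',
  'attentively upbeat': ':|',
  'a bit down': ':(',
  sleepy: '-_-',
  distressed: 'D:',
};

// Build the update sent when a new emotion is computed. The
// physical signal renders `face` and `text`; the website also
// uses `metrics` for its per-signal visualization.
function buildUpdate(signalId, mood, message, metrics) {
  return JSON.stringify({
    signalId,
    mood,
    face: FACES[mood],
    text: message,
    metrics,
    updatedAt: new Date().toISOString(),
  });
}

// The same serialized update can then be written to the Arduino
// Yun and broadcast to connected browsers over a Web socket.
const update = buildUpdate('broadway-demo', 'happy',
  "I'm so happy to see you being safe!", { jaywalkers: 0 });
```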
The text message interaction is done through the Twilio API and basic natural language processing.
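For a sense of how the SMS side fits together: Twilio delivers an inbound text to a webhook, and the reply is a small TwiML (XML) document. The keyword matching below is a hypothetical stand-in for the prototype's "basic natural language processing", and the reply strings are invented for illustration.

```javascript
// Pick a reply based on what the pedestrian texted and the
// signal's current mood. Simple keyword matching stands in
// for the basic natural language processing.
function chooseReply(incoming, mood) {
  const text = incoming.toLowerCase();
  if (/how are you|how's it going/.test(text)) {
    return mood === 'happy'
      ? "I'm great! Everyone is crossing safely today."
      : "I've been better. I wish people would stop jaywalking.";
  }
  if (/fact|neighborhood/.test(text)) {
    return 'Fun fact: this corner has seen over a century of foot traffic. Stay safe!';
  }
  return 'Nice to meet you! Look left, right and left again before crossing.';
}

// Wrap a reply in TwiML, the XML format Twilio expects back
// from the SMS webhook.
function toTwiml(message) {
  const escaped = message
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
  return `<?xml version="1.0" encoding="UTF-8"?><Response><Message>${escaped}</Message></Response>`;
}

// A webhook handler would simply return this string:
toTwiml(chooseReply('How are you?', 'happy'));
```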
Currently in beta: The Sentient Streets website is a visualization of all the active Sentient Streets pedestrian signals in NYC, giving anyone interested in the project an overview of locations and sentiments across neighborhoods. Details on the state of each signal's individual metrics will be provided, essentially creating a new measure of the safety and wellbeing of NYC streets: emotional wellbeing.
Keeping Pedestrians and Cyclists Safe and Engaged
Through our research, we've discovered that by making a traditional object sentient and giving it human voice and personality, we can more effectively prompt people to perform actions that will make the object happier. This is even more so the case when the user sees the connected object every day and starts noticing that its personality and mood changes based on how safe people are being on the street. We have seen through user testing that people WANT to make their intersection's sentient pedestrian signal happy by being safer on the streets, and that by changing the traditional pedestrian signals to have a broader human voice and personality, people are much more likely to look up, heed its suggestions, and engage with it.
On a broader level, while most of the conversations around connected devices have centered on utilitarian purposes (for example, the quintessential refrigerator that knows when you’re out of milk and orders more), we believe that connected objects can be used in cities to brighten life and help people be safer. Sentient Streets is an initial design experiment that re-imagines pedestrian signals in this way.
While our initial prototype is a single pedestrian signal, we've designed the system so that pedestrian signals throughout New York could be re-tooled and made sentient in a similar fashion. What we eventually envision is a network of connected pedestrian signals that can entertain, engage and inform pedestrians and cyclists as they move through the city.