To observe patterns in the emotions expressed in NY Times images as a function of time or article category.

How we built it

All data processing was done in Python 3. Image URLs were extracted from article metadata returned by the NY Times Archive API, and facial emotion analysis was performed with the Microsoft Emotion API.
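The pipeline above can be sketched roughly as follows. The endpoint paths and response fields follow the public documentation for the NY Times Archive API and the (since-retired) Microsoft Emotion API, but the helper names, API keys, and the `westus` region are placeholders and assumptions, not the project's actual code:

```python
"""Sketch of the image-emotion pipeline: fetch a month of NYT article
metadata, collect image URLs, and score faces with the Emotion API.
Keys and helper names are placeholders."""
import json
import urllib.request

NYT_KEY = "YOUR_NYT_KEY"          # placeholder
EMOTION_KEY = "YOUR_EMOTION_KEY"  # placeholder


def archive_url(year: int, month: int, api_key: str) -> str:
    """Build the Archive API URL for one month of article metadata."""
    return (f"https://api.nytimes.com/svc/archive/v1/{year}/{month}.json"
            f"?api-key={api_key}")


def image_urls(archive_json: dict) -> list:
    """Pull image URLs out of each article's multimedia entries."""
    urls = []
    for doc in archive_json.get("response", {}).get("docs", []):
        for media in doc.get("multimedia", []):
            if media.get("type") == "image" and media.get("url"):
                urls.append("https://www.nytimes.com/" + media["url"])
    return urls


def dominant_emotion(scores: dict) -> str:
    """Return the highest-scoring emotion for one detected face."""
    return max(scores, key=scores.get)


def analyze(image_url: str) -> list:
    """Send one image URL to the Emotion API (network call).

    Returns one entry per detected face, each with a 'scores' dict of
    emotions such as happiness, sadness, anger, and surprise.
    """
    req = urllib.request.Request(
        "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize",
        data=json.dumps({"url": image_url}).encode("utf-8"),
        headers={"Ocp-Apim-Subscription-Key": EMOTION_KEY,
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Looping `archive_url` over months and bucketing each face's `dominant_emotion` by article date or section would give the time/category patterns the project set out to observe.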

Challenges we ran into

Defining the scope of the project, in particular deciding which factors to evaluate.

Accomplishments that we're proud of

Integrating multiple APIs, and defining and completing an entire research project from scratch.

What's next for Emotion in NY Times Images

Relating the emotion data to the text content of each article.
