I got the idea for StorySphere after the Malaysia Airlines passenger jet crashed in eastern Ukraine. Much of the wreckage fell in open fields. Although pro-Russian separatists prevented investigators from reaching it for several days, journalists were able to get there and see some of the pieces, which is unusual.
I wanted to be there, to experience the scene and see what the wreckage was like. Reporters for The New York Times visited the scene and wrote an interesting story. They photographed the wreckage and sent the photos to experts in bombs and explosives. From the photos, the experts could see tell-tale signs that the plane had been hit by a missile, contrary to the claims of the separatists, who insisted they had not shot it down.
The wreckage in the field told an important chapter in the tale of Flight MH17. But the only people who could really experience that tale were the ones who could visit the field.
That gave me an idea: I wanted to create a tool that would allow anyone to visit a scene like that field in Ukraine. I came up with the idea of using a full 360-by-180-degree super panoramic sphere of photos to let someone stand in the field and explore. They could look around the field and learn about different things, just as The New York Times reporters did.
Super panoramic photos can be captured easily using various smartphone apps or camera rigs. The examples provided in the demo were captured with six GoPro cameras on a 3D-printed tripod rig, and the images were stitched together with Hugin, an open-source panorama stitcher.
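Stitching a sphere like this can be scripted with Hugin's command-line tools. The sketch below builds the command sequence for one plausible pipeline; the file names, output names, and exact flag choices are illustrative assumptions, not the demo rig's actual workflow:

```python
# Sketch of a Hugin command-line stitching pipeline for six camera frames.
# The tools (pto_gen, cpfind, autooptimiser, nona, enblend) ship with Hugin;
# all file names here are hypothetical.

def hugin_pipeline(images, project="sphere.pto", output="sphere"):
    """Return the sequence of commands that would stitch `images`
    into a single equirectangular (360x180) panorama."""
    return [
        ["pto_gen", "-o", project] + list(images),         # create project file
        ["cpfind", "--multirow", "-o", project, project],  # find control points
        ["autooptimiser", "-a", "-m", "-l", "-s",
         "-o", project, project],                          # optimise positions/lens
        ["nona", "-m", "TIFF_m", "-o", output, project],   # remap each image
        ["enblend", "-o", output + ".tif"]
        + [f"{output}{i:04d}.tif" for i in range(len(images))],  # blend seams
    ]

if __name__ == "__main__":
    cams = [f"gopro{i}.jpg" for i in range(6)]
    for cmd in hugin_pipeline(cams):
        print(" ".join(cmd))
```

Each command would be run in order (for example via `subprocess.run`); `nona` writes one remapped TIFF per source image, which `enblend` then merges into the final panorama.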
A super panoramic image is much better than a traditional photo, or even a panoramic one, because it shows the scene exactly as the photographer experienced it. You can look not only in front of and behind you, but also up at the trees or sky, and down at the ground beneath you. As you look at various items, you learn about them through embedded annotations, which can be popup HTML (iframes), slideshows, or video. StorySphere provides an easy point-and-click interface for creating and embedding annotations using only the browser, and a StorySphere can be included in an online report just like any other image.
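One way to think about such annotations is as hotspots pinned to directions on the sphere (a yaw/pitch pair), each carrying an embeddable payload. The sketch below uses a hypothetical data format and embed markup to illustrate the idea; StorySphere's actual schema and viewer markup may differ:

```python
import html
import json

# Hypothetical annotation format: each hotspot is pinned to a direction on the
# sphere (yaw: -180..180 degrees, pitch: -90..90) and carries a payload that
# the viewer renders as a popup iframe, slideshow, or video.

def make_annotation(yaw, pitch, kind, src, title=""):
    """Build one hotspot record. `kind` is "iframe", "slideshow", or "video"."""
    assert kind in ("iframe", "slideshow", "video")
    assert -180 <= yaw <= 180 and -90 <= pitch <= 90
    return {"yaw": yaw, "pitch": pitch, "kind": kind, "src": src, "title": title}

def embed_snippet(panorama_url, annotations):
    """Return an HTML snippet that embeds the sphere like any other image,
    with the annotations serialised into a data attribute for the viewer JS."""
    payload = html.escape(json.dumps(annotations), True)
    return (f'<div class="storysphere" data-image="{html.escape(panorama_url, True)}" '
            f'data-annotations="{payload}"></div>')

if __name__ == "__main__":
    spots = [
        make_annotation(42.0, -10.0, "video", "interview.mp4", "Eyewitness interview"),
        make_annotation(-75.0, 5.0, "iframe", "wreckage.html", "Wreckage analysis"),
    ]
    print(embed_snippet("field.jpg", spots))
```

A viewer script on the page would then find each `.storysphere` element, load the panorama into a sphere renderer, and place a clickable marker at each yaw/pitch.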
StorySphere can be used for any major news event where people want to explore a scene they would never be able to visit: earthquakes, hurricanes, riots. It lets you walk through the rubble of a building destroyed in a major earthquake and, through embedded annotations, watch interviews and read about points of interest.
It would let you stand on the street in Ferguson, Missouri, where Michael Brown was killed and explore the scene — the place where his body lay for four hours and where people have put up memorials to him. You would see the local police, with the weapons and armored vehicles of an army, on one side, and the protesters on the other.
It would let you stand on the field where a World Cup match was played. You could experience the field as a player does: the turf beneath your feet, the goal in front of you, and the crowd all around you.
StorySphere is a unique way to tell the story of a major news event, often its final chapter. Instead of being bound to a traditional narrative, readers can explore the scene non-linearly: they are not limited by chronological order, and they can go deeper on whatever interests them.
To explore StorySphere further and for technical details, visit: https://github.com/aaronkrolik/storysphere