Telling The Rest of the Story

Project Mentor:
  Evan Golub (Computer Science, HCIL)

Pilot FIA Spark Grant Student Team:
  Tom Hausman (Journalism)
  Hannah Klarner (Journalism)
  Abby Mergenmeier (Journalism)
  Jordan Mess (Computer Science)
 

Today we can read a story and see a photo or video clip, but we miss anything not in the frame. In some situations, this loss of context can undermine the credibility of, and trust in, the reporting.

We are a team at the University of Maryland exploring the question: "What if we could augment the reader's experience to let them see and explore what was going on around the photojournalist at, and perhaps even just before and after, the moment presented?"

Through this pilot work, with funding from the FIA Spark Grant program, we are developing techniques and tools that use 360° photos and video to augment news articles and help tell The Rest of the Story. We have explored different ways to create and utilize this 360° content and have written an article, Regatta Reality, that presents some observations and experiences in the context of covering a high school regatta.

Posting 360° videos that show the point of view of photojournalists covering protests around DC on Inauguration Day, and in the Capitol Building on the evening of the President's address to a joint session of Congress, can help media consumers get an overall feel for the process and setting.

Example news articles with embedded links to 360° augmented content for protests around the inauguration of Donald Trump, President Trump speaking at CPAC 2017, Vice President Pence and Kellyanne Conway speaking at the 44th Annual March for Life, the 2017 March for Science, and Gymkana performing at Maryland Day demonstrate different ways that a traditional online article could provide extra context for a written story to a skeptical media consumer.

Watching and looking around a narrated 360° walk-through of a place or situation where journalists find themselves but the average media consumer might not, such as the Press Corps area in the West Wing of the White House, can give a reader a more detailed mental image of the context of certain events. A 360° timelapse video over the course of an event, such as a snowball fight on McKeldin Mall, can also provide a different take on a scene for interested readers.

Another example of this approach can be seen in a story covering an FIA event about the future of VR, AR, and immersive storytelling that was hosted at the Phillips Collection. The 360° augmented content has been added in an unobtrusive manner, allowing the reader to bring it up as desired. Rather than deciding which images would make the reader want to see the context in which they were taken, every image in the story was given that ability without significantly altering the look of the page.

The techniques used in the above examples, which demonstrate how 360° content can be integrated into web-based articles, have been designed to minimize the overhead in terms of code and resources. A preliminary "How To" page is available that discusses some approaches to capturing, extracting, and using 360° content while simultaneously taking photos with a traditional camera. If you are interested in more details regarding these techniques, or in beta testing software that has been designed and prototyped to match 360° photos taken close to when a traditional still photo was taken, as well as to extract frames from a 360° video recorded while traditional still photos were being taken, please contact Evan Golub.
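The matching software mentioned above is available by contacting the project mentor; its internals are not described here. As a minimal illustrative sketch of the underlying idea, the pairing can be done by comparing capture timestamps (e.g. from EXIF DateTimeOriginal tags): each traditional still is paired with the nearest-in-time 360° capture within a tolerance, and a frame index into a 360° video can be derived from the offset between the still's timestamp and the video's start time. The function names, tolerance, and frame rate below are hypothetical choices for illustration, not the project's actual implementation.

```python
from datetime import datetime, timedelta

def match_captures(stills, spherical, tolerance_s=5.0):
    """Pair each traditional still with the closest-in-time 360° photo.

    stills, spherical: lists of (filename, datetime) tuples, e.g. read
    from EXIF metadata. Returns (still_name, sphere_name_or_None) pairs;
    None means no 360° capture fell within the tolerance window.
    """
    matches = []
    for name, t in stills:
        best = None
        best_dt = timedelta(seconds=tolerance_s)
        for s_name, s_t in spherical:
            dt = abs(s_t - t)
            if dt <= best_dt:
                best, best_dt = s_name, dt
        matches.append((name, best))
    return matches

def frame_index(video_start, still_time, fps=30.0):
    """Index of the 360° video frame closest to when the still was taken."""
    return round((still_time - video_start).total_seconds() * fps)
```

For example, a still shot at 10:00:03 would pair with a 360° photo shot at 10:00:00 (3 s apart) rather than one shot at 10:00:10, and a still taken 2 s into a 30 fps 360° recording would map to frame 60.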



This page last modified on Thursday, 17-Aug-2017 11:34:51 EDT.


