Visualizing Event Dynamics with Narrative Animation

Support:

NSF IIS-1352893 (PI: Jing Yang); NSF IIS-1352927 (PI: Ye Zhao)

Team members:

UNCC: Jing Yang, Yueqi Hu, Tom Polk, Yang Chen

Kent State University: Ye Zhao, Xiaoke Huang, Cheng Zhang, Chao Ma, Yingyu Wu

Project objectives:

Discovering and understanding event dynamics hidden in temporally evolving datasets is a complex yet critical task for knowledge discovery. Although many algorithms have been developed for mining event dynamics, the general public is still in dire need of communicative tools to explore the evolutionary trends and patterns. To address this need, this project has two objectives:

1. To explore how film art techniques can be borrowed to enhance animated visualizations for narrating, exploring, and sharing the event dynamics conveyed in temporally evolving data.

2. To develop applications in the field of evolving text document visualization to demonstrate the broader impacts of the new techniques developed.

Project activities:

1. We have experimented with integrating a set of film art techniques into animated visualizations, including tracking shots, framing, movement, motifs, flashbacks, fades, cross cuts, and subtitles.

2. We have collected a set of city-relevant text document collections and developed a working prototype of evolving text document visualization on them. In particular, we retrieved and organized news items for 30 major U.S. cities spanning more than 10 years from the well-known New York Times repository (see the retrieval sketch after this list), and developed a dynamic city-story visualization system on top of this corpus.

3. We have developed a visualization testbed for narrative animation of various kinds of streaming data. Using this testbed, we conducted extensive experiments on streaming data visualization.
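As a rough illustration of the retrieval step in activity 2, the sketch below pulls city-related news items from the public New York Times Article Search API. The endpoint and query parameters follow the published v2 API; the pagination loop, field selection, and storage format are assumptions for exposition, not the project's actual pipeline.

    # Hypothetical retrieval sketch: city news from the NYT Article Search API.
    import time
    import requests

    API_URL = "https://api.nytimes.com/svc/search/v2/articlesearch.json"

    def fetch_city_news(city, begin_date, end_date, api_key, max_pages=10):
        """Collect headline/date/snippet records for one city and date range."""
        items = []
        for page in range(max_pages):
            resp = requests.get(API_URL, params={
                "q": city,
                "begin_date": begin_date,  # YYYYMMDD
                "end_date": end_date,      # YYYYMMDD
                "page": page,
                "api-key": api_key,
            })
            resp.raise_for_status()
            docs = resp.json()["response"]["docs"]
            if not docs:
                break
            for d in docs:
                items.append({
                    "city": city,
                    "date": d.get("pub_date"),
                    "headline": d.get("headline", {}).get("main"),
                    "snippet": d.get("snippet"),
                })
            time.sleep(6)  # stay under the public API's rate limit
        return items

Repeating this per city yields a time-ordered corpus that can feed the streaming visualizations described below.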

Publications:

Yueqi Hu, Jing Yang, Ye Zhao, Tom Polk, and Shixia Liu: Spot-tracking lens, a zoomable user interface for animated visualization. Conditionally accepted by IEEE PacificVis 2016. (pdf) (video)

Yang Chen, Jing Yang, and Ye Zhao: Toward Effective Narrative Animation in Streaming Text Visualization. Extended abstract and poster, IEEE Symposium on Pacific Visualization (PacificVis), March 2014. (pdf)

Xiaoke Huang, Cheng Zhang, Ye Zhao, Yingyu Wu, Chao Ma, Jing Yang: Visualizing Evolving City Stories from Streaming News. In preparation, to be submitted to IEEE VAST 2016.

Yang Chen, Xiaoke Huang, Ye Zhao, Jing Yang: TStreamMonitor: Visually Monitoring Streaming Data with Animation Techniques. In preparation, to be submitted to IEEE InfoVis 2016.

Project outcome:

1. Spot-tracking lens, a zoomable user interface for animated visualization

The traditional zooming technique, which magnifies a region of the view so that users can examine details, does not work well for animated visualization, since moving objects easily leave the magnified region. In film art, a tracking shot is one in which the camera moves sideways, parallel to a moving object, to record its movement. Inspired by the tracking shot, we developed the spot-tracking lens. It integrates zooming with automatic panning that follows a focal object of interest, allowing users to examine the object's movement and the dynamics of its context during the animation. A set of novel auxiliary techniques complements the spot-tracking lens: a reference frame that helps users stay oriented and sense object speeds during automatic panning, a spotlight that reduces change blindness, and automatic labeling that reveals the semantics of interesting patterns in a timely manner. A fully working prototype of the spot-tracking lens has been developed. Our preliminary user studies revealed that the spot-tracking lens not only allows users to follow an object and examine its context, but also promotes ego-centric exploration: users follow an object over a long time period and discover many insights relevant to the object and its neighbors. Such long chains of insights are valuable in the reasoning process.
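To make the core camera logic concrete, here is a minimal sketch, assuming a 2D data space and an exponential-smoothing pan; the class, parameter names, and smoothing scheme are illustrative assumptions, not the paper's implementation.

    # Sketch: zoom plus automatic panning that keeps a focal object centered.
    class SpotTrackingLens:
        def __init__(self, zoom=4.0, smoothing=0.15):
            self.zoom = zoom            # magnification inside the lens
            self.smoothing = smoothing  # 0..1; higher tracks the object faster
            self.center = None          # lens center in data coordinates

        def update(self, focal_pos):
            """Pan the lens toward the focal object's position each frame."""
            if self.center is None:
                self.center = focal_pos
            else:
                cx, cy = self.center
                fx, fy = focal_pos
                # exponential smoothing avoids jitter from noisy object motion
                self.center = (cx + self.smoothing * (fx - cx),
                               cy + self.smoothing * (fy - cy))
            return self.center

        def to_screen(self, pos, viewport_w, viewport_h):
            """Map a data-space point into the magnified lens viewport."""
            cx, cy = self.center
            x, y = pos
            return (viewport_w / 2 + self.zoom * (x - cx),
                    viewport_h / 2 + self.zoom * (y - cy))

Calling update() once per animation frame before rendering keeps the focal object near the viewport center while its neighbors move naturally around it, which is what lets users read the dynamics of the context.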

Publication: Yueqi Hu, Jing Yang, Ye Zhao, Tom Polk, and Shixia Liu: Spot-tracking lens, a zoomable user interface for animated visualization. Conditionally accepted by IEEE PacificVis 2016. (pdf) (video)

A screenshot of the spot-tracking lens.

2. City stories: a dynamic visualization system that tells city stories extracted from massive news collections spanning multiple years

News items continuously arriving over time form a text stream, which we render as a gradually evolving visualization that helps users observe and understand the dynamic topics, events, and trends of cities. The dynamic visualization is driven by an incremental clustering scheme over an evolving graph of the streaming news items within a moving time window; a simplified sketch of this scheme appears below. The changing clusters reveal the thematic evolution of a city over time. They are visualized through text summarization so that users can easily explore the salient and shifting foci of the text stream as it narrates the city's chronological progression. The system can be extended to other text streams to visualize stories from emails, blogs, and so on.
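The sketch below shows one way such an incremental scheme can be organized: each arriving document joins its most similar cluster or starts a new one, and documents that fall out of the moving window are subtracted from their clusters. The cosine similarity, threshold, and bookkeeping are assumptions for exposition, not the system's actual algorithm.

    # Sketch: incremental clustering over a moving window of streaming text.
    from collections import Counter, deque
    import math

    def cosine(a, b):
        dot = sum(a[t] * b.get(t, 0) for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    class StreamingClusterer:
        def __init__(self, window_size=1000, threshold=0.3):
            self.window = deque()        # (doc_id, term_vector, cluster_idx)
            self.window_size = window_size
            self.threshold = threshold
            self.clusters = []           # one Counter centroid per cluster

        def add(self, doc_id, tokens):
            vec = Counter(tokens)
            # assign to the most similar cluster, or start a new one
            best, best_sim = None, self.threshold
            for i, c in enumerate(self.clusters):
                sim = cosine(vec, c)
                if sim > best_sim:
                    best, best_sim = i, sim
            if best is None:
                self.clusters.append(Counter(vec))
                best = len(self.clusters) - 1
            else:
                self.clusters[best].update(vec)
            self.window.append((doc_id, vec, best))
            # expire the oldest document when the window slides
            if len(self.window) > self.window_size:
                _, old_vec, old_idx = self.window.popleft()
                self.clusters[old_idx].subtract(old_vec)

Each cluster's centroid can then be summarized, for example by its top-weighted terms, to label the evolving city themes shown in the visualization.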

Publication: Xiaoke Huang, Cheng Zhang, Ye Zhao, Yingyu Wu, Chao Ma, Jing Yang: Visualizing Evolving City Stories from Streaming News. In preparation, to be submitted to IEEE VAST 2016.

City stories system overview

A screenshot of the dynamic visualization of city stories

3. Narrative StreamIT, a streaming text visualization system enhanced by a variety of film techniques

We developed Narrative StreamIT, a preliminary prototype for dynamic event visualization of streaming text collections, built on StreamIT, our previous work on streaming text visualization. In particular, we enhanced StreamIT with the following film art techniques: framing, motif, fading, flashback, cross cut, and subtitle. Some interesting initial results were reported in a poster at IEEE PacificVis 2014.
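As one example of how a film technique maps onto animation code, the sketch below supports a flashback by keeping a bounded buffer of past frames from which earlier, related moments can be replayed. The buffer capacity and triggering predicate are illustrative assumptions, not Narrative StreamIT's implementation.

    # Sketch: a bounded frame buffer supporting flashback replay.
    from collections import deque

    class FlashbackBuffer:
        def __init__(self, capacity=500):
            self.frames = deque(maxlen=capacity)  # (timestamp, frame_state)

        def record(self, timestamp, frame_state):
            """Store each rendered frame; old frames drop off automatically."""
            self.frames.append((timestamp, frame_state))

        def flashback(self, predicate):
            """Return past frames matching a predicate, oldest first, e.g.
            frames that mention the topic a current event relates to."""
            return [f for f in self.frames if predicate(f)]

When a new event resembles an earlier one, the system could replay the matching frames as a brief inset before returning to the live stream.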

Publication: Yang Chen, Jing Yang, and Ye Zhao: Toward Effective Narrative Animation in Streaming Text Visualization. Extended abstract and poster, IEEE Symposium on Pacific Visualization (PacificVis), March 2014. (pdf)

A screenshot of flashback in Narrative StreamIT.

4. TStreamMonitor: a testbed for streaming text visualization

We have developed a fully working evolving text visualization testbed for experimenting with and evaluating narrative animation techniques. In particular, we significantly extended our previous work on streaming text visual analytics. The new system, called TStreamMonitor, provides real-time event monitoring on large text streams. New techniques, such as a multilevel online clustering scheme, enable it to scale to event detection in fast-evolving text streams. On TStreamMonitor we have tested a set of narrative animation techniques, including (1) dynamic multi-screen techniques (split, merge, and synchronization) for cross-time comparison, sketched below; (2) custom editing techniques (cut, focus, label, etc.) to enhance storytelling in animated visualization; and (3) various animation attributes such as flashback, speed, zoom, and lighting.
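The sketch below illustrates the dynamic multi-screen control named in (1): each screen keeps its own playback position in the stream, a split clones a screen at a time offset for cross-time comparison, and synchronization locks playback speeds. Class and method names are assumptions for exposition, not TStreamMonitor's API.

    # Sketch: split / merge / synchronize screens over one data stream.
    class Screen:
        def __init__(self, start_time, speed=1.0):
            self.time = start_time  # playback position in the stream
            self.speed = speed      # playback rate multiplier

        def tick(self, dt):
            self.time += dt * self.speed

    class MultiScreen:
        def __init__(self, start_time=0.0):
            self.screens = [Screen(start_time)]

        def split(self, index, offset):
            """Clone screen `index`, shifted by `offset` in stream time."""
            src = self.screens[index]
            self.screens.append(Screen(src.time + offset, src.speed))

        def merge(self, index):
            """Close a comparison screen, always keeping at least one."""
            if len(self.screens) > 1:
                self.screens.pop(index)

        def synchronize(self):
            """Lock all screens to the first screen's playback speed."""
            for s in self.screens:
                s.speed = self.screens[0].speed

        def tick(self, dt):
            for s in self.screens:
                s.tick(dt)

For example, split(0, -3600.0) opens a second screen an hour behind the live view, and synchronize() keeps both advancing in lockstep for side-by-side comparison.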

Publication: Yang Chen, Xiaoke Huang, Ye Zhao, Jing Yang. TStreamMonitor: Visually Monitoring Streaming Data with Animation Techniques. In preparation, to be submitted to IEEE InfoVis 2016.

A screenshot of multi-screen animation

A screenshot of flashback animation

Updated on 11/27/2015