Visualization Research

Data visualization in the context of massively parallel simulations on supercomputers requires innovative techniques. We have designed a parallelized in-situ data extraction component as part of a networked visualization framework. We are currently working on integrating the developed techniques into climate research workflows.

Research Context

With the computing power of modern supercomputers, numerical simulations can produce enormous amounts of data. In typical visualization approaches, the mapping of the raw data to 3D geometries and the rendering are performed in a separate post-processing step after the simulation. This can lead to a capacity bottleneck on the storage side as well as a bandwidth bottleneck on the network side.

The Distributed Simulation and Virtual Reality Environment (DSVR) is a 3D in-situ visualization framework which implements the visualization pipeline as networked instances, bypassing these bottlenecks. Since DSVR only supports the input of raw data defined on rectilinear grids, our ongoing work focuses on extending the library to read the other grid types specific to current climate models.

Research Activities


For requirements analysis and demonstration purposes, a DSVR-Postprocessor has been developed. In contrast to the typical DSVR approach of performing the data mapping in situ with the simulation, the DSVR-Postprocessor reads existing NetCDF files for visualization mapping using DSVR. To fit the DSVR library's needs, the dataset has to meet the following constraints:

  • The underlying grid has to be a three-dimensional rectilinear longitude-latitude-level grid.
  • The data must be localized on the grid points, not within the cells.
  • Vector data has to be de-rotated.
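As an illustration only (this helper is not part of DSVR or the DSVR-Postprocessor), the first two constraints could be verified on a loaded NetCDF variable roughly like this, assuming the coordinate axes and the field have already been read into numpy arrays:

```python
import numpy as np

def check_rectilinear(lon, lat, lev, field):
    """Hypothetical pre-check for the constraints above:
    each coordinate axis must be 1-D and strictly monotonic
    (rectilinear grid), and the data must be localized on the
    grid points, i.e. one value per (level, lat, lon) point."""
    for axis in (lon, lat, lev):
        axis = np.asarray(axis)
        if axis.ndim != 1:
            return False  # 2-D coordinate arrays imply a curvilinear grid
        d = np.diff(axis)
        if not (np.all(d > 0) or np.all(d < 0)):
            return False  # axis is not strictly monotonic
    # point-localized data: shape matches the axes, not the cell count
    return np.asarray(field).shape == (len(lev), len(lat), len(lon))
```

A field stored per cell (one value fewer along each horizontal axis) would fail the final shape check, signalling that it must be interpolated to the grid points first.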

Currently the DSVR-Postprocessor supports the visualization of volume data using isosurfaces and of unsteady flow data using pathlines. The visualization can be configured via config files. The DSVR-Postprocessor and some example config files will be available here shortly.

DSVR implements all parts of the visualization pipeline as networked instances. These components are the 3D generator (data source, filter, mapper), the 3D streaming server, and the 3D viewer (rendering, presentation). The 3D generator consists of a software library called lib-DVRP, which is called directly from the numerical simulation application. The data extraction and the creation of 3D scenes representing features of the raw data are implemented efficiently by processing the data parts in parallel, based on the application's parallelization scheme.
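The in-situ call pattern can be sketched as follows. This is a minimal serial emulation, not the lib-DVRP API: the function names, the slab decomposition, and the threshold-based "mapper" are all illustrative stand-ins for the real library calls made from each MPI rank of the simulation.

```python
import numpy as np

def extract_scene(sub_field, threshold):
    """Stand-in for a mapper: return the grid points of this rank's
    subdomain that belong to the feature of interest (here simply
    the points whose value exceeds a threshold)."""
    return np.argwhere(sub_field > threshold)

def simulation_step(rank, timestep, slab_shape):
    """Hypothetical per-rank simulation output: each rank owns one
    contiguous slab of the global domain (random data for the demo)."""
    rng = np.random.default_rng(seed=rank * 1000 + timestep)
    return rng.random(slab_shape)

# Each timestep, every rank maps its own slab to scene geometry
# in situ, so only the much smaller scene data leaves the nodes.
nranks, slab_shape = 4, (8, 16, 16)
scene = []
for t in range(3):
    scene = [extract_scene(simulation_step(r, t, slab_shape), 0.99)
             for r in range(nranks)]
```

The point of the pattern is that the raw slabs never travel over the network; only the extracted geometry per rank is handed to the streaming server.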

The extracted 3D scene sequences can be explored interactively by means of the 3D viewer.

The distribution of the visualization process chain leads to two advantages:

  • The data volume to be stored is significantly reduced, since 3D geometries are stored instead of raw data. This avoids the storage and bandwidth bottlenecks.
  • Storing 3D geometries instead of rendered images still supports highly interactive scenarios such as 3D exploration of the scenes and immersive virtual reality.


Parallel Isosurface Extraction with Integrated Polygon Simplification

The innovative idea of closely integrating two well-known methods, the marching cubes (MC) algorithm for generating polygonal isosurfaces of scalar fields and vertex clustering for reducing polygon complexity, was implemented and evaluated. The approach proved successful with regard to both parallel scaling and data reduction factors. An adaptive reduction approach yields a further quality improvement.
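The vertex clustering half of the idea can be sketched in a few lines. This is a generic illustration of the simplification technique, not the DSVR implementation (which interleaves the clustering with the parallel MC extraction): vertices are snapped into a uniform grid of spacing `cell`, all vertices in a cell are merged into their centroid, and triangles whose corners collapse are dropped.

```python
import numpy as np

def cluster_vertices(verts, tris, cell):
    """Simplify a triangle mesh by vertex clustering.
    verts: (n, 3) float array, tris: (m, 3) int index array,
    cell: clustering grid spacing. Returns the merged vertices
    and the surviving, re-indexed triangles."""
    keys = np.floor(verts / cell).astype(np.int64)
    uniq, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.reshape(-1)
    # centroid of every cluster
    counts = np.bincount(inv, minlength=len(uniq)).astype(float)
    new_verts = np.zeros((len(uniq), 3))
    for d in range(3):
        new_verts[:, d] = np.bincount(inv, weights=verts[:, d]) / counts
    new_tris = inv[tris]
    # drop triangles that became degenerate after merging
    keep = ((new_tris[:, 0] != new_tris[:, 1]) &
            (new_tris[:, 1] != new_tris[:, 2]) &
            (new_tris[:, 0] != new_tris[:, 2]))
    return new_verts, new_tris[keep]
```

The close integration in DSVR means this reduction happens while the isosurface is being extracted on each process, so the full-resolution mesh never has to be assembled.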

Parallel Pathline Extraction with Property-Based, Interactive Post-Filtering

Parallel library functions for efficient flow visualization were developed. An innovative approach was realized by interleaving the resulting data stream, which originally mapped the lines as 3D graphics, with properties of the spatially and temporally associated raw data. It supports batch computing scenarios through automatic data extraction at the back end and explorative visualization scenarios through interactive post-filtering at the front end.
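The two halves of the approach can be sketched as follows. This is a deliberately simple serial illustration, not lib-DVRP's API: pathlines are integrated with explicit Euler through a time-dependent velocity field while a scalar property of the associated raw data (for example temperature) is recorded along each line, and a front-end filter then selects lines by that property without re-running the extraction.

```python
import numpy as np

def trace_pathlines(seeds, velocity, prop, dt, nsteps):
    """Back end: integrate pathlines (explicit Euler) and interleave
    each line with a property sampled from the raw data.
    velocity(t, p) -> dp/dt, prop(t, p) -> scalar."""
    lines = []
    for seed in seeds:
        p = np.array(seed, dtype=float)
        pts, vals = [p.copy()], []
        for k in range(nsteps):
            t = k * dt
            vals.append(prop(t, p))          # raw-data property on the line
            p = p + dt * velocity(t, p)      # Euler step
            pts.append(p.copy())
        lines.append((np.array(pts), np.array(vals)))
    return lines

def post_filter(lines, vmin):
    """Front end: interactively keep only the pathlines whose
    attached property ever reaches vmin."""
    return [(pts, vals) for pts, vals in lines if vals.max() >= vmin]
```

Because the property travels with the geometry in the stream, the viewer can re-filter the lines interactively without touching the raw simulation data again.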