PROJECT DESCRIPTION
AR, VR, Physicalization & Interaction
One of the vislab’s primary thrusts is the intersection of augmented reality, virtual reality, physicalization, and interactivity. As computing capacity continues to advance, access to large-scale data is also improving. This broadens the potential for visualizations that move beyond two-dimensional screens and into the embodied, physical world.
Significant research has demonstrated that tactile and three-dimensional visualizations improve memory, comprehension, and learning potential, especially in lay audiences. We are working to explore these promising avenues for visualization through tightly integrated digital + analog methodologies.
Virtual Reality
The image above shows a demonstration at the Texas Advanced Computing Center’s Visualization Laboratory of the use of virtual reality to view and explore a three-dimensional, multivariate visualization of the Gulf of Mexico, encoded using hand-crafted glyphs and colormaps in the Artifact-Based Rendering system. Here, PI Francesca Samsel and graduate student Claire Fitch assist Michael Dell as he explores the visualization in the VR headset.
Augmented Reality
Physical visualizations, or physicalizations, are useful for many tasks such as communication and display, but are often insufficient as research tools without the addition of a digital element.
We are therefore working to integrate augmented reality components with aspects of the physical world in order to broaden their potential for research-oriented use, such as data exploration, interrogation, and group science.
This project explores how data physicalizations (3D printed terrain models, anatomical scans, or even abstract data) can naturally engage both the visual and haptic senses in ways that are difficult or impossible with traditional planar touch screens and even immersive digital displays. However, the rigid 3D physicalizations produced with today’s most common 3D printers are fundamentally limited for data exploration and querying tasks that require dynamic input (e.g., touch sensing) and output (e.g., animation), functions that are easily handled by digital displays.
We introduce a novel style of hybrid virtual + physical visualization designed specifically to support interactive data exploration tasks. Working toward a “best of both worlds” solution, our approach fuses immersive AR, physical 3D data printouts, and touch sensing through the physicalization.
We demonstrate that this solution can support three of the most common spatial data querying interactions used in scientific visualization (streamline seeding, dynamic cutting planes, and world-in-miniature visualization). Finally, we present quantitative performance data and describe a first application to exploratory visualization of data from an actively studied supercomputer climate simulation, with feedback from domain scientists.
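To make the first of these interactions concrete: in streamline seeding, a user picks a point (in our system, by touching the physicalization), and a curve is traced through the data’s flow field from that point. The sketch below is purely illustrative, not our implementation; it assumes a toy analytic 2D vector field (a circular flow) in place of real simulation data, and integrates a streamline from a seed point with a standard fourth-order Runge–Kutta step.

```python
import math

def velocity(x, y):
    """Toy stand-in for a simulation's flow field:
    a circular flow around the origin."""
    return -y, x

def trace_streamline(x, y, step=0.01, n_steps=500):
    """Trace a streamline from a seed point using classic RK4 integration."""
    pts = [(x, y)]
    for _ in range(n_steps):
        x, y = pts[-1]
        # Four velocity samples per RK4 step
        k1x, k1y = velocity(x, y)
        k2x, k2y = velocity(x + 0.5 * step * k1x, y + 0.5 * step * k1y)
        k3x, k3y = velocity(x + 0.5 * step * k2x, y + 0.5 * step * k2y)
        k4x, k4y = velocity(x + step * k3x, y + step * k3y)
        pts.append((x + step / 6.0 * (k1x + 2 * k2x + 2 * k3x + k4x),
                    y + step / 6.0 * (k1y + 2 * k2y + 2 * k3y + k4y)))
    return pts

# A touch on the physicalization would supply the seed coordinate;
# here it is hard-coded for illustration.
line = trace_streamline(1.0, 0.0)
```

In the circular field the traced curve stays on the unit circle, which is a quick sanity check that the integrator is behaving; a real system would sample velocities from the simulation grid instead of an analytic function.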
Physicalization
Incorporating physical components into research practice provides numerous benefits, including expanded avenues for bringing artists and members of the lay public directly into the visualization process.
The image above shows a series of layered pieces of plexiglass, laser-engraved with layers of Gulf of Mexico bathymetry and scattered with beads, wires, and other material components showing the general flow of water and other particles.
Physical objects provide opportunities to integrate artistic practice with research praxis and visualization production.