A primary goal of the art-sci-vis lab is to consider not only the potential of static physicalizations and visualizations, but also how interaction might improve both communication and analysis for the public and for scientists alike. How can we improve on existing interaction-based technologies, such as VR and AR? How might physical media factor into this exploration? To what extent does immersion improve interactive experiences? How might theory from the arts improve upon these efforts? These questions are the basis for the projects below, which remain ongoing and draw on interdisciplinary expertise at both the University of Texas at Austin and the Texas Advanced Computing Center.
Projects
Physicalization
One of the vislab’s primary thrusts is the intersection of augmented reality, virtual reality, physicalization, and interactivity. As computing capacity continues to advance, access to big data is also improving. This broadens the potential for visualizations that move beyond two-dimensional screens and into the embodied, physical world.
Sub-projects include virtual reality, augmented reality, touch-based querying and interaction, and artistic physicalization.
Planetariums
In partnership with The Bell Planetarium and Museum in Minneapolis, Minnesota, and The Los Alamos Nature Center Planetarium in Los Alamos, New Mexico, our collaborative is working to translate multi-modal interactive data visualizations and our Artifact-Based Rendering software for use in public planetariums and science museums around the country.
Bodily-Interaction-Based Interfaces
As technologies improve, developers are increasingly turning toward interactive interfaces in gaming, the arts, museum studies, and data visualization. Interactivity has been shown to improve connection, understanding, and, above all, engagement with complex topics and materials. As simulation data grows more complex and dense, interactivity beyond the traditional screen-and-mouse setup has the potential to drastically improve both data exploration through visualization and science communication to lay publics and stakeholder audiences.
Our vislab is interested in advancing both, and toward this end is working to more closely couple bodily movement, physical artifacts, and screen-based content to provide semi-immersive experiences for researchers using big data simulations.
Sub-projects include large-scale, sensor-based bodily interaction and V-Mail, a framework of cross-platform applications, interactive techniques, and communication protocols for improved multi-person correspondence about spatial 3D datasets.