Media Arts and Technology
JoAnn Kuchera-Morin - Director, AlloSphere Research Facility; Professor of Media Arts and Technology.
The activity of visualizing and exploring complex multi-dimensional data provides insight that is essential to the practice of science, engineering, design, and art, where the amount and complexity of the data overwhelm traditional computing environments. Researchers in UCSB's Media Arts and Technology program, led by Professor JoAnn Kuchera-Morin, are developing an infrastructure that will provide powerful methods for detailed analysis, synthesis, and manipulation of such data by integrating multimodal representations of large-scale data with human-scale visualization and interaction techniques in a novel immersive environment at UCSB known as the AlloSphere. The facility serves as computing research infrastructure in two primary ways: (1) as a platform for driving the computing research needed to create future versions of the AlloSphere with enhanced video, audio, and interaction capabilities, addressing challenging problems in storage, networking, rendering, software virtualization, real-time simulation, human-computer interaction, and other areas of computing; and (2) as an immersive visualization environment for computing research in areas such as scientific and information visualization, visual analytics, collaborative human-computer interaction, complex design, large-scale performance debugging, cloud computing, quantum computing, and data mining.
Image capture from the Multi-Center Hydrogen Bond Project
The AlloSphere is a three-story, near-anechoic cubic space in the new California Nanosystems Institute (CNSI) at UC Santa Barbara, housing a large perforated aluminum sphere, ten meters in diameter, that serves as a display surface. A bridge runs through the center of the sphere from the second floor of the building, holding up to 25 participants. The AlloSphere will include several high-resolution stereo video projectors to illuminate the complete spherical display surface, hundreds of speakers distributed outside the surface to provide high-quality spatial sound, a suite of sensors and interaction devices to enable rich user interaction with the data and simulations, and the computing infrastructure to enable the high-volume computations necessary to provide a rich visual, aural, and interactive experience for the user. The AlloSphere will be one of the largest visualization/exploration instruments in the world, and it will serve as an ongoing research testbed for several important areas of computing. In addition, the AlloSphere will serve as an environment for experimental media creation, experience, and performance; as a tool for scientific discovery in areas such as nanosystems, neuroscience, earth science, quantum computing, and biochemistry; and as an instrument for education and outreach. The mix of computational, scientific, and artistic uses for the AlloSphere makes it a unique interdisciplinary instrument and environment.
Photo: Paul Wellman
The basic physical infrastructure of the AlloSphere (the space, spherical display surface, acoustical tiling, painting to minimize visual artifacts, air vents, and the bridge walkway) was completed in February 2007, through state funding for construction of the new CNSI building.
The AlloSphere is differentiated from conventional virtual reality environments, such as a CAVE or a hemispherical immersive theater, by its seamless surround-view capabilities and its focus on multiple sensory modalities and rich interaction: high-quality spatialized audio and novel interaction techniques, complementing the high-quality stereo video projection. Audio is often a neglected or underappreciated modality in immersive systems, and in visualization systems in general. High-quality audio, and especially 3D spatialized audio, can give rich and useful cues along several dimensions, such as pitch, timbre, duration, and location. While a user is focusing on visual search or details in a particular area of the display, audio can indicate the direction, identity, and severity of an individual event, or it can communicate time-varying cumulative information from many sources. Tools for constructing effective audio visualization ("sonification") are critical.
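To make the idea of sonification concrete, here is a minimal sketch of one way an event's direction and severity could be mapped to sound. This is an illustration only, not the AlloSphere's actual audio software: the mapping (severity to pitch, azimuth to constant-power stereo pan) and the function name `sonify_event` are assumptions chosen for clarity.

```python
import math

SAMPLE_RATE = 44100  # samples per second

def sonify_event(azimuth_deg, severity, duration=0.25):
    """Render one data event as a short stereo tone (hypothetical mapping).

    azimuth_deg: direction of the event (-90 = hard left, +90 = hard right)
    severity:    0.0-1.0, mapped to pitch so urgent events sound higher
    Returns a list of (left, right) sample pairs in the range [-1, 1].
    """
    freq = 220.0 * 2.0 ** (2.0 * severity)      # 220 Hz (calm) .. 880 Hz (urgent)
    pan = (azimuth_deg + 90.0) / 180.0          # 0.0 = full left, 1.0 = full right
    # Constant-power panning keeps perceived loudness stable across positions.
    left_gain = math.cos(pan * math.pi / 2.0)
    right_gain = math.sin(pan * math.pi / 2.0)
    n = int(duration * SAMPLE_RATE)
    samples = []
    for i in range(n):
        s = math.sin(2.0 * math.pi * freq * i / SAMPLE_RATE)
        samples.append((s * left_gain, s * right_gain))
    return samples

# A hard-left event: essentially all energy ends up in the left channel.
tone = sonify_event(azimuth_deg=-90.0, severity=0.0)
```

A real spatializer would place sources anywhere on the sphere using many loudspeakers (e.g., via vector-base amplitude panning or wave field synthesis), but the same principle applies: data dimensions become perceptual audio dimensions.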
Photo: Paul Wellman
The AlloSphere will have a significant research and educational impact beyond UCSB. We envision a facility that is accessible to UCSB engineers, scientists, and students; to academic collaborators who come to UCSB for a range of visits, from afternoon usage to sabbaticals; to graduate students who come for domain training; and to industry users and partners who wish to showcase their technologies. In addition to its potential for use in undergraduate and graduate education at UCSB, we will leverage the novel experience of the AlloSphere in a range of K-12 and other educational and outreach programs, as it has already proven to be a great draw for young people. To extend its impact far beyond its physical location at UCSB, we will collaborate with public organizations, beginning with planetariums, to eventually transfer design knowledge, software infrastructure, and domain-specific applications that can be used in other large (but scaled-down) display and computing environments. A longer-term research interest is to create display and interaction environments that scale appropriately from the AlloSphere to display walls, desktops, and mobile (cell phone and wearable augmented reality) environments.
Photo: Paul Wellman
In addition to its primary use for computing and scientific purposes, there is also tremendous interest in and potential for the use of the AlloSphere as an artistic and aesthetic environment. Artistically, the AlloSphere is an instrument for creative design, the creation and performance of avant-garde new works, and the development of entirely new modes and genres of expression and forms of immersion-based entertainment, fusing future art, architecture, music, media, games, and cinema.
Simultaneous use by multiple people is a key feature of the AlloSphere as a collaborative instrument, and different usage strategies are likely to be effective at different times. When the head position of a single user is tracked in the AlloSphere (as the user moves along the 20m x 2m bridge), a full stereo experience is possible, limited only by the tracking accuracy and the audio/visual rendering resolution in time and space. Multiple users in close proximity (e.g., a group of five collaborators) may also have a high-quality stereo/spatial rendering experience, if the application does not depend on precise stereo or spatial perception. In this scenario, each user could be interacting with the application (perhaps adding personalized annotations) while one serves as the primary navigator or controller. Alternatively, each could have complete control of a certain portion of the spherical display space, exploring the data as they desire, with their own video rendering personalized to their location. For a large group (e.g., a 25-person class or a training session), the video rendering will generally be monoscopic with one user controlling the overall system, but it will still be a compelling immersive experience for all. The rest of the group could be observers, or may also have the ability to annotate the experience for immediate effect or for later reference. Exploring these various models of interaction and collaboration will be an important research topic once the initial surround-view AlloSphere version is complete.
Our ultimate goal for the AlloSphere is to provide "sense-limited" resolution in both the audio and visual domains. This means that the video spatial resolution should match human acuity limits and, to the degree possible, human visual dynamic range. The spatial resolution of the audio output should allow us to place virtual sound sources at arbitrary points in space with convincing synthesis of the spatial audio cues used in psychoacoustical localization. Complementary to this, the system must allow us to simulate the acoustics of measured or simulated spaces with a high degree of accuracy. Undergirding these requirements is the computational infrastructure to process, store, and deliver large volumes of data at the necessary rates.
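A rough, back-of-the-envelope calculation suggests what "sense-limited" video resolution implies. Assuming a viewer at the center of the sphere and normal visual acuity of roughly one arcminute per pixel (a common rule-of-thumb figure; the exact threshold varies by individual and by contrast conditions), the total pixel count over the full spherical surface is:

```python
import math

# Estimate the pixels needed so that one pixel subtends about one arcminute
# (a common figure for normal visual acuity) from the center of the sphere.
ARCMIN_RAD = math.radians(1.0 / 60.0)   # one arcminute in radians, ~2.909e-4

pixel_solid_angle = ARCMIN_RAD ** 2     # steradians covered by one pixel
full_sphere = 4.0 * math.pi             # steradians in the complete sphere

pixels_needed = full_sphere / pixel_solid_angle
print(f"~{pixels_needed / 1e6:.0f} megapixels over the full sphere")
```

The estimate lands on the order of 150 megapixels for the complete surround view, which makes clear why rendering, networking, and projection at sense-limited resolution remain substantial computing research problems.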
For more information about the AlloSphere, please visit: www.allosphere.ucsb.edu.