Media Arts and Design Minor
Starting Summer 2023
For more information, visit the UCSB Summer Sessions website.
Emotion-aware Creativity Tools: Research on Emotions and Painting Through Creativity Tools with an Interactional Approach
Abstract
In affective computing, there exist various interactive systems that allow users to express their emotions effortlessly by adapting in real time to reflect the perceived emotional state of the viewer. For instance, the "empathic painting" created by Maria Shugrina is a painterly rendering system that automatically creates digital paintings from parameterized painting features and can estimate the viewer's purposefully displayed emotional state through facial expression recognition. However, almost no attempt has been made to provide materials within a painting application that allow users to easily express their emotions in their own works. As artists use formal elements such as form and color to express desired emotions, we need tools that help users attach emotional conditions to these elements while creating paintings.
Through the introduction of emotion-based brushes, the following research question was explored: how do we incorporate emotions into drawing and painting tools, and how do we enable users to control expression through these tools? Because an immersive virtual reality (VR) environment is more effective at eliciting emotions than less immersive displays, the application was designed for immersive VR. This dissertation examines how incorporating emotions into a VR painting application that provides a range of formal elements affects the awareness and expression of human emotions, and presents two contributions: the design of prototype drawing/painting applications that incorporate emotion with an interactional approach, and an analysis of user evaluation results.
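As a rough illustration of the idea of an emotion-based brush, the C++ sketch below maps a valence-arousal emotion model to formal elements such as hue, saturation, and stroke width. All names, ranges, and mappings here are hypothetical assumptions for illustration, not the design described in the thesis.

```cpp
// Hypothetical sketch of an "emotion-based brush": a valence-arousal emotion
// model mapped to formal painting elements. All names, ranges, and mappings
// are illustrative assumptions, not the implementation from the thesis.
#include <algorithm>
#include <cstdio>

struct BrushParams {
    float hue;         // color hue in degrees [0, 360)
    float saturation;  // color saturation [0, 1]
    float strokeWidth; // stroke width in pixels
    float jitter;      // randomness of stroke placement [0, 1]
};

// Map an emotion on the circumplex (valence, arousal) plane, each in [-1, 1],
// to brush parameters: warm vs. cool color from valence, stroke energy from
// arousal.
BrushParams brushFromEmotion(float valence, float arousal) {
    valence = std::clamp(valence, -1.0f, 1.0f);
    arousal = std::clamp(arousal, -1.0f, 1.0f);
    BrushParams p;
    p.hue = 130.0f - 100.0f * valence;              // +1 -> 30 (warm), -1 -> 230 (cool)
    p.saturation = 0.4f + 0.3f * (arousal + 1.0f);  // calm -> muted, excited -> vivid
    p.strokeWidth = 2.0f + 6.0f * (arousal + 1.0f); // calm -> thin, excited -> bold
    p.jitter = std::max(0.0f, arousal);             // only high arousal shakes the stroke
    return p;
}

int main() {
    BrushParams serene = brushFromEmotion(0.7f, -0.5f); // content, calm
    BrushParams angry  = brushFromEmotion(-0.8f, 0.9f); // negative, agitated
    std::printf("serene: hue=%.0f sat=%.2f width=%.1f jitter=%.2f\n",
                serene.hue, serene.saturation, serene.strokeWidth, serene.jitter);
    std::printf("angry:  hue=%.0f sat=%.2f width=%.1f jitter=%.2f\n",
                angry.hue, angry.saturation, angry.strokeWidth, angry.jitter);
}
```

In such a mapping, valence steers the warm/cool axis of color while arousal scales the stroke's energy; an actual tool would likely expose these mappings for users to tune rather than fixing them.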
Emerging Ecosystems: Transmodal Sensory Experiences in Immersive Environments
Abstract
This master's project thesis explores the dynamic intersection of human sensory perception, computational media, environmental design, and emerging technologies. Within this comprehensive inquiry, my contributions span several interconnected team projects. Notably, 'Hydnum' investigates the synthesis of olfactory stimuli and emotional states within the immersive realm of extended reality, 'Echoes' explores the enigmatic liminal space between perception and meaning, and 'Synaptic Time Tunnel' pays tribute to the evolutionary trajectory of computer graphics by delving into synaptic interconnections.
As my exploration unfolded, it became evident that each project built upon the lessons of the previous one, progressing from 'Hydnum,' a low-level interactive network translated into tangible forms, to 'Echoes,' an intermediate project that translates networks into both tangible artifacts and virtual environments with medium interactivity, and ultimately to 'Synaptic Time Tunnel,' where the focus shifts to a fully immersive virtual network experience with high-level interactivity.
These pivotal projects stand as the embodiment of my extensive explorations of worldmaking in Extended Reality (XR), the expression of transmodal data in various formats, and the intricate nuances of interaction design. Specifically, in 'Hydnum' my primary contribution is the procedural design of physical and digital structures that emerge from biodata, with a particular emphasis on the role of olfactory stimuli in shaping spatial presence. In 'Echoes', my central contribution is the translation of biodata into an olfactory glass artifact. Lastly, in 'Synaptic Time Tunnel', my contribution takes the form of an immersive interconnected system, an expressive medium that bridges the gap between data and interaction.
In essence, the unwavering goal is to create innovative, immersive experiences that connect users with the natural world and explore its intricate traces, guided by the profound principles of morphology and cognition.
Suleyman will also give a lecture for the UCSB Arts and Lectures series in Campbell Hall on October 5th, 2023 at 7:30pm.
artsandlectures.ucsb.edu/events-tickets/events/23-24/mustafa-suleyman
Mustafa Suleyman CBE is a British artificial intelligence researcher and entrepreneur who is the co-founder and former head of applied AI at DeepMind, an artificial intelligence company acquired by Google and now owned by Alphabet. He is currently the CEO of Inflection AI and the author of the book "The Coming Wave: Technology, Power, and the 21st Century's Greatest Dilemma."
Suleyman was recently interviewed on MSNBC, where he discussed the future of AI and its impact on humanity.
Dynamic Theater: Location-Based Immersive Dance Theater, Investigating User Guidance and Experience
Abstract
Dynamic Theater explores the use of augmented reality (AR) in immersive theater as a platform for digital dance performances. The project presents a locomotion-based experience that allows for full spatial exploration. A large indoor AR theater space was designed to let users freely explore the augmented environment. The curated wide-area experience employs various guidance mechanisms to direct users to the main content zones. Results from our 20-person user study show how users experience the performance piece while using a guidance system. The importance of stage layout, guidance system, and dancer placement in immersive theater experiences is highlighted, as these cater to user preferences while enhancing the overall reception of digital content in wide-area AR. Observations from working with dancers and choreographers, as well as their experience and feedback, are also discussed. Co-authors are Joshua Lu and Tobias Höllerer.
Reality Distortion Room: A Study of User Locomotion Responses to Spatial Augmented Reality Effects
Abstract
Reality Distortion Room (RDR) is a proof-of-concept augmented reality system that uses projection mapping and unencumbered interaction with the Microsoft RoomAlive system to study users' locomotive responses to visual effects that seemingly transform the physical room they are in. This study presents five effects that augment the appearance of a physical room to subtly encourage user motion. Our experiment demonstrates users' reactions to the different distortion and augmentation effects in a standard living room, with the distortion effects projected as wall grids, furniture holograms, and small particles in the air. The augmented living room can give the impression of becoming elongated, warped, shifted, elevated, and enlarged. The study results support the implementation of AR experiences in limited physical spaces by providing an initial understanding of how users can be subtly encouraged to move throughout a room. Co-authors are Andrew D. Wilson, Jennifer Jacobs, and Tobias Höllerer.
Synaptic Time Tunnel, SIGGRAPH 2023.
Sponsored by Autodesk, the Synaptic Time Tunnel was a tribute to 50 years of innovation and achievement in the field of computer graphics and interactive techniques that have been presented at the SIGGRAPH conferences.
An international audience of more than 14,275 attendees from 78 countries enjoyed the conference and its Mobile and Virtual Access component.
Contributors:
Marcos Novak - MAT Chair and transLAB Director, UCSB
Graham Wakefield - York University, UCSB
Haru Ji - York University, UCSB
Nefeli Manoudaki - transLAB, MAT/UCSB
Iason Paterakis - transLAB, MAT/UCSB
Diarmid Flatley - transLAB, MAT/UCSB
Ryan Millet - transLAB, MAT/UCSB
Kon Hyong Kim - AlloSphere Research Group, MAT/UCSB
Gustavo Rincon - AlloSphere Research Group, MAT/UCSB
Weihao Qiu - Experimental Visualization Lab, MAT/UCSB
Pau Rosello Diaz - transLAB, MAT/UCSB
Alan Macy - BIOPAC Systems Inc.
JoAnn Kuchera-Morin - AlloSphere Research Group, MAT/UCSB
Devon Frost - MAT/UCSB
Alysia James - Department of Theater and Dance/UCSB
More information about the Synaptic Time Tunnel can be found in the following news articles:
Forbes.com: SIGGRAPH 2023 Highlights
Abstract
Complex systems in nature unfold over many spatial and temporal dimensions. The systems we readily perceive as the world around us are limited to what we can see, hear, and interact with. But what about complex systems that we cannot perceive, those that exist at the atomic or subatomic scale? Can we bring these systems to human scale and view their data just as we view real-world phenomena? As a composer working with sound across many spatial and temporal dimensions, I find that shape and form come to life through sound transformation. What seems visually imperceptible becomes real and visually perceptible in the composer's mind. As media artists, we can now take these transformational structures from the auditory to the visual and interactive domain through frequency transformation. Can we apply these transformations to complex, imperceptible scientific models to see, hear, and interact with these systems, bringing them to human scale?
About the SPARKS session:
Our understanding of the world is limited by the capacity of our senses to ingest information and also by our brain’s ability to interpret it. Through the use of technology, we know that the universe we live in is far more complex and rich with information than what can be perceived by humanity. From microscopic to cosmic, information that transcends our lived experiences is difficult to comprehend. Our ability to augment our senses with technology has resulted in an accumulation of vast amounts of data, often in a form that needs to be translated to be understood. This SPARKS session explores the conceptual and creative aspects of scientific visualization.
https://dac.siggraph.org/the-art-of-scientific-visualization-perceiving-the-imperceptible.
DAC SPARKS - The Art of Scientific Visualization: Perceiving the Imperceptible - April 28, 2023.
Released in March 2023, Xenos is a virtual instrument plug-in that implements and extends the Dynamic Stochastic Synthesis (DSS) algorithm invented by Iannis Xenakis and notably employed in the 1991 composition GENDY3. DSS produces a wave of variable periodicity through regular stochastic variation of its wave cycle, resulting in emergent pitch and timbral features. While high-level parametric control of the algorithm enables a variety of musical behaviors, composing with DSS is difficult because its parameters lack basis in perceptual qualities.
Xenos thus implements DSS with modifications and extensions that enhance its suitability for general composition. Written in C++ using the JUCE framework, Xenos offers DSS in a convenient, efficient, and widely compatible polyphonic synthesizer that facilitates composition and performance through host-software features, including MIDI input and parameter automation. Xenos also introduces a pitch-quantization feature that tunes each period of the wave to the nearest frequency in an arbitrary scale. Custom scales can be loaded via the Scala tuning standard, enabling both xenharmonic composition at the mesostructural level and investigation of the timbral effects of microtonal pitch sets on the microsound timescale.
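To make the algorithm concrete, here is a minimal C++ sketch of DSS in its classic formulation: each wave cycle is a polyline of breakpoints whose amplitudes and segment durations take bounded random walks, plus a pitch-quantization step in the spirit of the Xenos feature described above. This is an assumption-based illustration of the technique, not the Xenos source code.

```cpp
// Minimal sketch of Dynamic Stochastic Synthesis (DSS): each wave cycle is a
// polyline of breakpoints whose amplitudes and segment durations take bounded
// random walks with reflecting barriers. Illustrative only; not Xenos source.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

struct Breakpoint { double amp; double dur; }; // dur in samples

class DSS {
public:
    DSS(int numPoints, double ampStep, double durStep, unsigned seed = 1)
        : pts_(numPoints, {0.0, 64.0}),
          ampDist_(-ampStep, ampStep), durDist_(-durStep, durStep), rng_(seed) {}

    // One random-walk step per breakpoint; reflecting barriers keep the
    // amplitude in [-1, 1] and the segment duration in [4, 256] samples.
    void step() {
        for (auto& p : pts_) {
            p.amp = reflect(p.amp + ampDist_(rng_), -1.0, 1.0);
            p.dur = reflect(p.dur + durDist_(rng_), 4.0, 256.0);
        }
    }

    // Sketch of pitch quantization: rescale all segment durations so that
    // sampleRate / totalDuration snaps to the nearest frequency of a scale.
    void quantizeToScale(const std::vector<double>& scaleHz, double sampleRate) {
        double total = 0.0;
        for (const auto& p : pts_) total += p.dur;
        double f = sampleRate / total;
        double best = scaleHz.front();
        for (double s : scaleHz)
            if (std::fabs(s - f) < std::fabs(best - f)) best = s;
        for (auto& p : pts_) p.dur *= f / best; // stretch cycle to target pitch
    }

    // Render one cycle by linear interpolation between breakpoints.
    std::vector<double> renderCycle() const {
        std::vector<double> out;
        for (size_t i = 0; i < pts_.size(); ++i) {
            const Breakpoint& a = pts_[i];
            const Breakpoint& b = pts_[(i + 1) % pts_.size()];
            int n = static_cast<int>(a.dur);
            for (int s = 0; s < n; ++s)
                out.push_back(a.amp + (b.amp - a.amp) * s / n);
        }
        return out;
    }

private:
    static double reflect(double v, double lo, double hi) {
        if (v < lo) return lo + (lo - v);
        if (v > hi) return hi - (v - hi);
        return v;
    }
    std::vector<Breakpoint> pts_;
    std::uniform_real_distribution<double> ampDist_, durDist_;
    std::mt19937 rng_;
};

int main() {
    DSS dss(12, 0.05, 2.0);
    std::vector<double> scale = {220.0, 247.5, 275.0, 330.0, 440.0}; // arbitrary
    for (int cycle = 0; cycle < 4; ++cycle) {
        dss.step();
        dss.quantizeToScale(scale, 48000.0);
        auto wave = dss.renderCycle();
        std::printf("cycle %d: %zu samples\n", cycle, wave.size());
    }
}
```

The reflecting barriers are what keep the walk bounded and give DSS its characteristic drift between pitch and noise; widening the step sizes pushes the output toward noisier, more volatile timbres.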
A good review of Xenos can be found at Music Radar: www.musicradar.com/news/fantastic-free-synths-xenos.
Xenos GitHub page: github.com/raphaelradna/xenos.
An introductory YouTube video is also available.
Raphael completed his Master's degree in Media Arts and Technology in Fall 2022 and is currently pursuing a PhD in Music Composition at UCSB.
ACM SIGGRAPH is the premier conference and exhibition on computer graphics and interactive techniques. This year they celebrate their 50th conference and reflect on half a century of discovery and advancement while charting a course for the bold and limitless future ahead.
Burbano is a native of Pasto, Colombia and an associate professor in Universidad de los Andes’s School of Architecture and Design. As a contributor to the conference, Burbano has presented research within the Art Papers program (in 2017), and as a volunteer, has served on the SIGGRAPH 2018, 2020, and 2021 conference committees. Most recently, Burbano served as the first-ever chair of the Retrospective Program in 2021, which honored the history of computer graphics and interactive techniques. Andres received his PhD from Media Arts and Technology in 2013.
Read more on the ACM SIGGRAPH website and in this article on the ACM SIGGRAPH Blog.
The next ACM SIGGRAPH conference will be held in August 2023 in Los Angeles, California: s2023.siggraph.org.
EmissionControl2 is a granular sound synthesizer. The theory of granular synthesis is described in the book Microsound (Curtis Roads, 2001, MIT Press).
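For readers unfamiliar with the technique, the C++ sketch below renders a simple granular cloud: short, Hann-windowed grains read from a source signal and scattered stochastically in time. Parameter names and values are illustrative assumptions, not the EmissionControl2 implementation.

```cpp
// Minimal granular synthesis sketch in the spirit of Microsound: short,
// Hann-windowed grains of a source signal scattered stochastically in time.
// Illustrative only; this is not the EmissionControl2 implementation.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    const double kPi = 3.14159265358979323846;
    const double sr = 48000.0;

    // Source material: one second of a 220 Hz sine wave.
    std::vector<double> source(static_cast<size_t>(sr));
    for (size_t i = 0; i < source.size(); ++i)
        source[i] = std::sin(2.0 * kPi * 220.0 * i / sr);

    std::vector<double> out(static_cast<size_t>(2.0 * sr), 0.0); // 2 s output
    std::mt19937 rng(42);
    std::uniform_int_distribution<size_t> onset(0, out.size() - 1);
    std::uniform_int_distribution<size_t> srcPos(0, source.size() - 1);

    const size_t grainLen = static_cast<size_t>(0.05 * sr); // 50 ms grains
    const int numGrains = 400;                              // cloud density

    for (int g = 0; g < numGrains; ++g) {
        size_t t0 = onset(rng);  // where the grain lands in the output
        size_t s0 = srcPos(rng); // where the grain reads from the source
        for (size_t i = 0; i < grainLen; ++i) {
            size_t t = t0 + i;
            if (t >= out.size()) break;
            // Hann window avoids clicks at the grain boundaries.
            double env = 0.5 * (1.0 - std::cos(2.0 * kPi * i / grainLen));
            out[t] += 0.1 * env * source[(s0 + i) % source.size()];
        }
    }
    std::printf("rendered %zu samples of a granular cloud\n", out.size());
}
```

Grain duration, density, and the randomness of onset and source position are the core parameters of granular synthesis; EmissionControl2 exposes a much richer set of such controls.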
Released in October 2020, the new app was developed by a team consisting of Professor Curtis Roads as project manager and software developers Jack Kilgore and Rodney Duplessis. Kilgore is a computer science major at UCSB. Duplessis is a PhD student in music composition at UCSB and is also pursuing a Master's degree in the Media Arts and Technology graduate program.
EmissionControl2 is free and open-source software available at: github.com/jackkilgore/EmissionControl2/releases/latest
The project was supported by a Faculty Research Grant from the UCSB Academic Senate.
Media Arts and Technology (MAT) at UCSB is a transdisciplinary graduate program that fuses emergent media, computer science, engineering, electronic music and digital art research, practice, production, and theory. Created by faculty in both the College of Engineering and the College of Letters and Science, MAT offers an unparalleled opportunity for working at the frontiers of art, science, and technology, where new art forms are born and new expressive media are invented.
In MAT, we seek to define and to create the future of media art and media technology. Our research explores the limits of what is possible in technologically sophisticated art and media, both from an artistic and an engineering viewpoint. Combining art, science, engineering, and theory, MAT graduate studies provide students with a combination of critical and technical tools that prepare them for leadership roles in artistic, engineering, production/direction, educational, and research contexts.
The program offers Master of Science and PhD degrees in Media Arts and Technology. MAT students may focus on an area of emphasis (multimedia engineering, electronic music and sound design, or visual and spatial arts), but all students should strive to transcend traditional disciplinary boundaries and work with other students and faculty in collaborative, multidisciplinary research projects and courses.