MAT End of Year Show

June 1st at SBCAST
June 9th at UCSB


Media Arts and Design Minor

Starting Summer 2023

For more information, visit the UCSB Summer Sessions website.

Media Arts and Technology

Graduate Program

University of California, Santa Barbara

Events

GDJ: A Tempo Synchronized Granulator

Abstract

Music technology has opened up possibilities for highly sophisticated sound design and composition. Composers can build temporally dense and sonically eclectic phrases that are not feasible for humans to play in real time. How do we blend the worlds of offline, complex composition and live performance? I look to the paradigm of the DJ, who becomes an expert at taking composed, static (typically grid-based) music and recontextualizing it in a live setting via chopping, filtering, remixing, etc. This idea can be taken a step further by deconstructing composed music via granulation. My contribution is the GDJ (Granular DJ): a granular synthesis tool with guard rails that keep the sonic output on tempo, allowing for the liquid sound design associated with granulation while constraining it to the grid used in popular songwriting and dance music. The GDJ leverages the granular toolkit to completely transform source material, while also using systems for staying in time and in key, as seen in DJ workflows, to keep the tool constrained enough for solo or multi-performer contexts. GDJ is written in Max/MSP and is currently being used in a live performance system by the author and a friend.
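Below is a minimal C++ sketch of the core idea of tempo-synchronized granulation: quantizing each grain's onset to a rhythmic grid derived from the tempo. It is only an illustration with hypothetical function names and parameters; GDJ itself is a Max/MSP patch, and none of this code comes from it.

    // Hypothetical sketch: snapping grain onsets to a tempo grid.
    #include <cmath>
    #include <cstdio>

    // Length of one grid step in samples (e.g. 16th notes at the given BPM).
    double gridStepSamples(double bpm, double sampleRate, int stepsPerBeat = 4)
    {
        double samplesPerBeat = sampleRate * 60.0 / bpm;
        return samplesPerBeat / stepsPerBeat;
    }

    // Snap a requested grain onset (in samples) to the nearest grid point.
    long quantizeOnset(double requestedOnset, double stepSamples)
    {
        return static_cast<long>(std::round(requestedOnset / stepSamples) * stepSamples);
    }

    int main()
    {
        double step = gridStepSamples(120.0, 48000.0); // 16th-note grid at 120 BPM, 48 kHz
        std::printf("grain starts at sample %ld\n", quantizeOnset(13000.0, step));
        return 0;
    }

Grains triggered this way land only on grid points, which is what keeps an otherwise free-running granular texture locked to the beat.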

Equivalence: An analysis of artists’ roles collaborating with Image Generative AI from the perspective of Conceptual Art through an interactive installation design practice.

Throughout the past year, the public has witnessed a multitude of high-performance text-to-image Generative AI models pushing the boundaries of image synthesis. These advancements have reshaped the art domain and sparked a great debate about the role of artists and the nature of creativity in artwork created with Image Generative AI. This master's project aims to analyze artists' roles and their relationship with machines when creating artwork with Image Generative AI. Drawing inspiration from Rhodes' 4P model of creativity, an analytical framework of 5P+E (Purpose, People, Process, Product, Press, and Evaluation) has been developed to compare the art-making processes of Conceptual Art and Image Generative AI. To exemplify this framework, a practical case study titled "Equivalence" has been conducted. Equivalence is a multi-screen interactive installation that converts users' speech input into continuously evolving paintings constructed with natural language processing algorithms and the Stable Diffusion model. Rather than using users' text prompts alone, the installation analyzes the emotion, grammatical structure, and word choice of the speech, converting this information into architectural structures to investigate the relationship between language and image. Through comprehensive analysis and the execution of the case study, this master's project aims to broaden the understanding of artists' roles and foster a deeper appreciation for the creative aspects inherent in artwork created with Image Generative AI.

Developing Audio Plugins

Abstract

Since the beginning of 2022, I have developed four audio plugins using the C++ framework JUCE. In doing so, I accumulated various strategies for enhancing the dependability and efficiency of my programs; for example, I learned to avoid memory allocation on the audio thread, to forgo third-party libraries, and to refrain from invoking system calls. However, I also discovered specific circumstances in which each of these principles is no longer desirable. Rather than relying on rules of thumb and general guidelines, I sought to develop a set of first principles for developing production-level audio software.

I argue that many good audio programming practices can be derived from the following fact: audio plugins are multi-threaded programs subject to a firm real-time constraint. With this framing in mind, I present The Template Plugin: a starting point for new plugin projects that integrates the best practices and creative solutions I have implemented in my own work. I justify the design of The Template Plugin by discussing effective strategies for thread synchronization, optimization, program state management, user interfaces, and build systems within the context of multi-threaded and real-time applications.
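As one concrete illustration of the kind of practice described here, the following is a minimal sketch (not code from The Template Plugin) of sharing a parameter between the UI thread and the real-time audio thread without locking or allocating in the audio callback; the class and member names are hypothetical.

    #include <atomic>
    #include <cstddef>

    class GainProcessor
    {
    public:
        // Called from the UI / message thread.
        void setGain(float newGain) { gain.store(newGain, std::memory_order_relaxed); }

        // Called from the real-time audio thread: no locks, no allocation,
        // no system calls -- only an atomic load and arithmetic.
        void process(float* buffer, std::size_t numSamples)
        {
            const float g = gain.load(std::memory_order_relaxed);
            for (std::size_t i = 0; i < numSamples; ++i)
                buffer[i] *= g;
        }

    private:
        std::atomic<float> gain { 1.0f };
    };

Because the audio thread performs only an atomic load, the UI can update the parameter at any time without ever making the audio callback wait.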

News

Abstract

Complex systems in nature unfold over many spatial and temporal dimensions. The systems that are easy for us to perceive as the world around us are limited by what we can see, hear, and interact with. But what about complex systems that we cannot perceive, those that exist at the atomic or subatomic scale? Can we bring these systems to human scale and view their data just as we view real-world phenomena? For a composer working with sound across many spatial and temporal dimensions, shape and form come to life through sound transformation. What seems visually imperceptible becomes real and visually perceptible in the composer's mind. As media artists, we can now carry these transformational structures from the auditory into the visual and interactive domain through frequency transformation. Can we apply these transformations to complex, imperceptible scientific models in order to see, hear, and interact with these systems, bringing them to human scale?

About the SPARKS session:

Our understanding of the world is limited by the capacity of our senses to ingest information and also by our brain’s ability to interpret it. Through the use of technology, we know that the universe we live in is far more complex and rich with information than what can be perceived by humanity. From microscopic to cosmic, information that transcends our lived experiences is difficult to comprehend. Our ability to augment our senses with technology has resulted in an accumulation of vast amounts of data, often in a form that needs to be translated to be understood. This SPARKS session explores the conceptual and creative aspects of scientific visualization.

https://dac.siggraph.org/the-art-of-scientific-visualization-perceiving-the-imperceptible.

DAC SPARKS - The Art of Scientific Visualization: Perceiving the Imperceptible - April 28, 2023.


Released in March 2023, Xenos is a virtual instrument plug-in that implements and extends the Dynamic Stochastic Synthesis (DSS) algorithm invented by Iannis Xenakis and notably employed in the 1991 composition GENDY3. DSS produces a wave of variable periodicity through regular stochastic variation of its wave cycle, resulting in emergent pitch and timbral features. While high-level parametric control of the algorithm enables a variety of musical behaviors, composing with DSS is difficult because its parameters lack basis in perceptual qualities.

Xenos thus implements DSS with modifications and extensions that enhance its suitability for general composition. Written in C++ using the JUCE framework, Xenos offers DSS in a convenient, efficient, and widely compatible polyphonic synthesizer that facilitates composition and performance through host-software features, including MIDI input and parameter automation. Xenos also introduces a pitch-quantization feature that tunes each period of the wave to the nearest frequency in an arbitrary scale. Custom scales can be loaded via the Scala tuning standard, enabling both xenharmonic composition at the mesostructural level and investigation of the timbral effects of microtonal pitch sets on the microsound timescale.
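The pitch-quantization feature can be illustrated with a short, hypothetical C++ sketch (not code from Xenos): given a set of scale frequencies, the frequency implied by each wave period is snapped to the nearest scale member before the period is synthesized.

    #include <cmath>
    #include <vector>

    // Build scale frequencies over several octaves from ratios above a root.
    std::vector<double> buildScale(double rootHz, const std::vector<double>& ratios, int octaves)
    {
        std::vector<double> freqs;
        for (int o = 0; o < octaves; ++o)
            for (double r : ratios)
                freqs.push_back(rootHz * r * std::pow(2.0, o));
        return freqs;
    }

    // Return the scale frequency nearest to the input, measured in log-frequency.
    double quantize(double freqHz, const std::vector<double>& scale)
    {
        double best = scale.front();
        for (double f : scale)
            if (std::fabs(std::log2(f / freqHz)) < std::fabs(std::log2(best / freqHz)))
                best = f;
        return best;
    }

In a DSS context, a period of N samples implies a frequency of sampleRate / N; quantizing that value and rescaling the period to the corresponding length keeps the emergent pitch within the chosen scale.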


A good review of Xenos can be found at Music Radar: www.musicradar.com/news/fantastic-free-synths-xenos.

Xenos GitHub page: github.com/raphaelradna/xenos.

There is also an introductory YouTube video.

Raphael completed his Master's degree in Media Arts and Technology in the Fall of 2022 and is currently pursuing a PhD in Music Composition at UCSB.

The MAT alumni who were selected to participate are:

Yoon Chung Han
Solen Kiratli
Hannen E. Wolfe
Yin Yu
Weidi Zhang
Rodger (Jieliang) Luo

The International Symposium on Electronic Art is one of the world’s most prominent international arts and technology events, bringing together scholarly, artistic, and scientific domains in an interdisciplinary discussion and showcase of creative productions applying new technologies in art, interactivity, and electronic and digital media.

isea2023.isea-international.org

ACM SIGGRAPH is the premier conference and exhibition on computer graphics and interactive techniques. This year they celebrate their 50th conference and reflect on half a century of discovery and advancement while charting a course for the bold and limitless future ahead.

Burbano is a native of Pasto, Colombia and an associate professor in Universidad de los Andes’s School of Architecture and Design. As a contributor to the conference, Burbano has presented research within the Art Papers program (in 2017), and as a volunteer, has served on the SIGGRAPH 2018, 2020, and 2021 conference committees. Most recently, Burbano served as the first-ever chair of the Retrospective Program in 2021, which honored the history of computer graphics and interactive techniques. Andres received his PhD from Media Arts and Technology in 2013.


Read more on the ACM SIGGRAPH website and in this article on the ACM SIGGRAPH Blog.

The next ACM SIGGRAPH conference is in August 2023 and will be held in Los Angeles, California: s2023.siggraph.org.

EmissionControl2 is a granular sound synthesizer. The theory of granular synthesis is described in the book Microsound (Curtis Roads, 2001, MIT Press).
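At its simplest, a grain is a very short, windowed slice of source audio. The sketch below, assuming a Hann window and hypothetical function names, shows how one such grain might be produced; EmissionControl2's own engine is far more elaborate.

    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Cut a short slice out of the source and taper it with a Hann window
    // so the grain starts and ends at zero, avoiding clicks.
    std::vector<float> makeGrain(const std::vector<float>& source,
                                 std::size_t start, std::size_t length)
    {
        const float pi = 3.14159265358979f;
        std::vector<float> grain(length, 0.0f);
        for (std::size_t i = 0; i + 1 < length && start + i < source.size(); ++i)
        {
            float w = 0.5f * (1.0f - std::cos(2.0f * pi * i / (length - 1)));
            grain[i] = source[start + i] * w;
        }
        return grain;
    }

A granular synthesizer schedules many overlapping grains per second, varying their position in the source, duration, pitch, and spatial placement to build larger sound masses.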

Released in October 2020, the new app was developed by a team consisting of Professor Curtis Roads acting as project manager, with software developers Jack Kilgore and Rodney Duplessis. Kilgore is a computer science major at UCSB. Duplessis is a PhD student in music composition at UCSB and is also pursuing a Master's degree in the Media Arts and Technology graduate program.

EmissionControl2 is free and open-source software available at: github.com/jackkilgore/EmissionControl2/releases/latest

The project was supported by a Faculty Research Grant from the UCSB Academic Senate.

Past News  

Showcase

Exhibition Catalogs

End of Year Show

About MAT

Media Arts and Technology (MAT) at UCSB is a transdisciplinary graduate program that fuses emergent media, computer science, engineering, electronic music and digital art research, practice, production, and theory. Created by faculty in both the College of Engineering and the College of Letters and Science, MAT offers an unparalleled opportunity for working at the frontiers of art, science, and technology, where new art forms are born and new expressive media are invented.

In MAT, we seek to define and to create the future of media art and media technology. Our research explores the limits of what is possible in technologically sophisticated art and media, both from an artistic and an engineering viewpoint. Combining art, science, engineering, and theory, MAT graduate studies provide students with a combination of critical and technical tools that prepare them for leadership roles in artistic, engineering, production/direction, educational, and research contexts.

The program offers Master of Science and Ph.D. degrees in Media Arts and Technology. MAT students may focus on an area of emphasis (multimedia engineering, electronic music and sound design, or visual and spatial arts), but all students should strive to transcend traditional disciplinary boundaries and work with other students and faculty in collaborative, multidisciplinary research projects and courses.

Alumni Testimonials