MAT 256  Visual Design Through Algorithms: Explorations of Visual Perception
Winter 2005

Instructors: Jerry Gibson, George Legrady
TA: August Black

Tuesday 1-2 pm - Estudio
Thursday 12-3 pm - Estudio

Course Description

MAT 256 will focus on the development of an installation in which the participant/visitor interacts with multiple real-time video signals and which, while maintaining the context of the artistic experience, allows the participant's choices and responses to be correlated with video quality to determine human perceptual models. Aesthetic and technical issues in the production of an interactive telematic visual environment will be addressed. Topics to be covered include visual perception, delivery of real-time televisual signals, design and implementation of 3D devices to control remote cameras and record viewer actions, insertion of controlled distortions, and the measurement of significant human responses.
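
As a rough illustration of the measurement loop this description implies, the sketch below distorts a frame by a known amount and logs an objective quality score next to a (placeholder) participant response. The Gaussian-noise distortion and PSNR score are assumptions chosen for brevity, not the course's actual apparatus.

    import numpy as np

    def distort(frame, sigma):
        """Insert a controlled distortion: additive Gaussian noise of known strength."""
        noisy = frame + np.random.normal(0.0, sigma, frame.shape)
        return np.clip(noisy, 0, 255).astype(np.uint8)

    def psnr(reference, test):
        """Objective quality score to correlate with participant responses."""
        mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
        return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

    frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # stand-in video frame
    log = []
    for sigma in (2, 8, 32):
        shown = distort(frame, sigma)
        rating = None  # placeholder: the participant's recorded response goes here
        log.append((sigma, psnr(frame, shown), rating))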


Course Schedule  

[wk 1]         T 1.4 Overview of course details, production planning

Th 1.6

Projects Reviewed

GL: Interactive Processes | Interactive Basics

Paul de Marinis | Jeffrey Shaw [1] | Terrain (Ulrike Gabriel) | Barcode Hotel (Hoberman)


[wk 2]       T 1.11

Image
Processed Image
Interaction/Feedback
Filter Processes
Architectural Space

GL: Televisual presence in art, architecture, and telecommunications

References

Inverse Technology [bang bang] [spy plane] (Jeremijenko)
Rokeby: [Watch] [Seen] [Taken] [Guardian Angel] [Watched & Measured]
Dan Graham [1] [2] [3] [4] | Bruce Nauman | Paul Sermon | Julia Scher | Steve Mann [Fung]
Nam June Paik [1] | Marie Sester
Swisshouse [1] (Huang, Waldvogel), an augmented reality space | Paracite (bits&spaces)

Ctrl-Space | Future Cinema


Th 1.13

JG: Introduction to human perception in voice, audio, still images, and video, with applications to compression and data hiding
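
As a pointer to the data-hiding topic above, here is a minimal sketch of least-significant-bit embedding, one of the simplest data-hiding schemes; it illustrates the general idea only and is not a method specified by the course.

    import numpy as np

    def embed_lsb(image, bits):
        """Hide a bit sequence in the least significant bits of the pixels."""
        flat = image.flatten()                 # works on a copy of the cover image
        n = len(bits)
        flat[:n] = (flat[:n] & 0xFE) | bits    # clear each LSB, write a payload bit
        return flat.reshape(image.shape)

    def extract_lsb(image, n):
        """Recover the first n hidden bits."""
        return image.flatten()[:n] & 1

    cover = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
    payload = np.array([1, 0, 1, 1, 0, 1, 0, 0], dtype=np.uint8)
    stego = embed_lsb(cover, payload)          # visually indistinguishable from cover
    assert np.array_equal(extract_lsb(stego, len(payload)), payload)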

JG: Filtering, sampling, digitization, and reconstruction of media
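
A hedged sketch of the sampling-and-reconstruction chain: a sine is sampled above its Nyquist rate and rebuilt by ideal sinc interpolation. The frequencies and rates are arbitrary example values.

    import numpy as np

    f0 = 3.0     # signal frequency (Hz)
    fs = 16.0    # sampling rate, comfortably above the Nyquist rate 2*f0 = 6 Hz
    t_s = np.arange(0, 2, 1 / fs)          # sampling instants
    x_s = np.sin(2 * np.pi * f0 * t_s)     # the digitized samples

    # Ideal reconstruction: a sinc pulse centered on each sample, summed.
    t = np.linspace(0, 2, 1000)
    x_hat = sum(x * np.sinc(fs * (t - ts)) for x, ts in zip(x_s, t_s))

    error = np.max(np.abs(x_hat - np.sin(2 * np.pi * f0 * t)))
    print(error)   # small, apart from truncation effects near the interval edges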


[wk 3]        T 1.18  

JG: Frequency domain representations and effects | Nyquist Sampling Theorem
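
To make the Nyquist condition concrete, the sketch below samples a 9 Hz tone at 12 Hz (below the required 18 Hz) and reads the aliased frequency off the FFT; all numbers are example values.

    import numpy as np

    fs = 12.0                                  # sampling rate (Hz)
    n = np.arange(64)
    x = np.sin(2 * np.pi * 9.0 * n / fs)       # 9 Hz tone; Nyquist frequency is only 6 Hz

    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    print(freqs[np.argmax(spectrum)])          # 3.0: the tone aliases to 12 - 9 = 3 Hz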


Th 1.20

GL: Background History: The Cave and Other Virtual Environments


[wk 4]         T 1.25 JG: Randomness and Random Walks (Markov Chains)
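
A minimal random-walk sketch in the spirit of this lecture: a two-state Markov chain whose transition probabilities are invented example values, simulated and compared with its stationary distribution.

    import numpy as np

    # Two-state chain, e.g. a scene region that is "still" (0) or "moving" (1).
    # Rows are the current state; columns are next-state probabilities (example values).
    P = np.array([[0.9, 0.1],
                  [0.3, 0.7]])

    rng = np.random.default_rng(0)
    state, walk = 0, []
    for _ in range(10000):
        state = rng.choice(2, p=P[state])   # next state drawn from the current row
        walk.append(state)

    # Fraction of time spent "moving"; the stationary value is 0.1/(0.1+0.3) = 0.25.
    print(sum(walk) / len(walk))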

Th 1.27

GL: System Assembly: motion sensing of multiple televised camera scenes
Haptic interfaces, function of narrative

[wk 5]            T 2.1
JG: Autoregressive and moving average models and their effects
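
A hedged sketch of the two model families and one of their effects: an AR(1) process keeps correlation over many lags, while an MA(1) process forgets after one lag. Coefficients are example values.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000
    e = rng.normal(size=n)                 # white-noise innovations

    # AR(1): x[t] = a*x[t-1] + e[t]  -- feedback gives long memory
    a, x_ar = 0.9, np.zeros(n)
    for t in range(1, n):
        x_ar[t] = a * x_ar[t - 1] + e[t]

    # MA(1): y[t] = e[t] + b*e[t-1]  -- memory of exactly one step
    b = 0.9
    y_ma = e + b * np.concatenate(([0.0], e[:-1]))

    def autocorr(z, lag):
        z = z - z.mean()
        return float(np.dot(z[:-lag], z[lag:]) / np.dot(z, z))

    print(autocorr(x_ar, 2), autocorr(y_ma, 2))   # about 0.81 versus about 0.0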

Th 2.3

Lab


[wk 6]            T 2.8

GL: Conferences Report
Barcelona | Cornerhouse, Manchester | Hull Time Based Arts

MetaNarratives | Li Zhensheng | del.icio.us | flickr |
Cornerhouse | pocketsfullofmemories.com | C.Sandison |
Hackitectura | noborder | fadaiat | E.Wohlgemuth [body scan] [2] |
S.Biggs [babel] | P.Bentley

Th 2.10
Lab


[wk 7]          T 2.15   JG: Perceptual distortion and models for human perception
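
One idea behind perceptual distortion models is masking: the same numerical error is less visible in busy regions than in flat ones. The toy metric below divides each block's squared error by its activity; the block size and constant are invented, and this is an illustration rather than any model taught in the course.

    import numpy as np

    def mse(ref, test):
        return np.mean((ref - test) ** 2)

    def masked_distortion(ref, test, k=0.01):
        """Per-block squared error, discounted where the block is busy."""
        total, blocks = 0.0, 0
        for i in range(0, ref.shape[0] - 7, 8):
            for j in range(0, ref.shape[1] - 7, 8):
                r = ref[i:i+8, j:j+8]
                e = mse(r, test[i:i+8, j:j+8])
                total += e / (1.0 + k * r.var())   # masking: activity hides error
                blocks += 1
        return total / blocks

    rng = np.random.default_rng(0)
    flat = np.full((64, 64), 128.0)
    busy = rng.uniform(0, 255, (64, 64))
    noise = rng.normal(0, 5, (64, 64))
    # Same MSE on both images, but the masked score says the flat image looks worse.
    print(mse(flat, flat + noise), masked_distortion(flat, flat + noise))
    print(mse(busy, busy + noise), masked_distortion(busy, busy + noise))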

Th 2.17
JG: Perceptual distortion and models for human perception (continued)
Haptic device feedback, synchronization with noise insertion


[wk 8]          T 2.22 Haptic device feedback, synchronization with noise insertion (continued)
Interaction between audio and video and its relation to perception

Th 2.24
Project Production


[wk 9]            T 3.1
Th 3.3
Project Production


[wk 10]           T 3.8
Th 3.10
Project Production


Project: FM

Description

FM uses interactive video to collapse two spaces into one. A participant enters one of two rooms and sees a video projection on the wall. The projection shows an image of the participant intermingled with the images of other participants in the opposite room. Through interaction, a third space is shaped by the overlapping movements of the participants "in between" the two rooms. (A minimal compositing sketch follows the production team list below.)

Production Team

FM, Wesley Smith (Project Description)
Information, Meaning and Interaction in Digital Media, Carlos Castellanos
FM, Eunsu Kang (Timeline, Collaboration, Project Development)
Quiet Demo, Brian Springer (Broadcast Media Analysis, MPEG-7)
Elements, Sofia Larsson
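
A minimal sketch of the compositing the description implies, assuming grayscale frames and a stored empty-room background per room; camera capture and projection output are omitted, and none of this is the project's actual Max/Jitter implementation.

    import numpy as np

    def motion_mask(frame, background, thresh=30):
        """A participant's movement is where the live frame departs from
        a stored empty-room background."""
        return np.abs(frame.astype(int) - background.astype(int)) > thresh

    def collapse(frame_a, frame_b, alpha=0.5):
        """Blend the two rooms into the single projected image."""
        mixed = alpha * frame_a.astype(float) + (1 - alpha) * frame_b.astype(float)
        return mixed.astype(np.uint8)

    # Stand-ins for live grayscale captures from the two rooms (camera I/O omitted).
    bg_a = np.zeros((480, 640), dtype=np.uint8)
    bg_b = np.zeros((480, 640), dtype=np.uint8)
    room_a = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
    room_b = np.random.randint(0, 256, (480, 640), dtype=np.uint8)

    projection = collapse(room_a, room_b)   # the intermingled image on the wall
    third_space = motion_mask(room_a, bg_a) & motion_mask(room_b, bg_b)  # overlapping movements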


Project: LM

Description

LM transforms live video into a 3D intensity representation. A hand-held controller allows the user to move through color, resolution, and intensity spaces, producing different levels of abstraction. The 3D video mirrors the user's gestures in real time, providing feedback for interaction. (A minimal height-field sketch follows the production team list below.)

Production Team

MT9 3D Accelerometer, Will Wolcott
MAX to OpenGL, Zach Davis
3D Parameter Interaction / Visualization, Heawon Kang
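
A minimal height-field sketch of the video-to-3D mapping described above, assuming a grayscale frame and a step parameter standing in for the resolution control; the actual project ran in Max with OpenGL output.

    import numpy as np

    def intensity_surface(frame, step=8):
        """One vertex per sampled pixel, with brightness mapped to height.
        Larger steps give a coarser, more abstract surface."""
        ys, xs = np.mgrid[0:frame.shape[0]:step, 0:frame.shape[1]:step]
        z = frame[ys, xs].astype(float) / 255.0     # intensity -> height in [0, 1]
        return np.stack([xs, ys, z], axis=-1)       # (rows, cols, xyz) vertex grid

    frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)  # stand-in for live video
    vertices = intensity_surface(frame, step=8)     # would be drawn as a mesh in OpenGL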


Technical Links

[Max/Jitter Patches] [3D Xsens MT9 Inertial Measurement Unit]
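
A hedged sketch of turning one MT9 accelerometer sample into a control parameter. read_mt9() is a hypothetical stand-in (the real unit is read over a serial link with Xsens's own protocol); the tilt formulas are the standard gravity-based roll/pitch estimates.

    import numpy as np

    def read_mt9():
        """Hypothetical stand-in for one accelerometer sample (ax, ay, az) in g."""
        return np.array([0.1, 0.0, 0.98])

    def tilt_angles(accel):
        """Static tilt from gravity: roll about x, pitch about y, in radians."""
        ax, ay, az = accel
        roll = np.arctan2(ay, az)
        pitch = np.arctan2(-ax, np.hypot(ay, az))
        return roll, pitch

    roll, pitch = tilt_angles(read_mt9())
    # Map pitch onto [0, 1] to drive one interaction parameter, e.g. abstraction level.
    abstraction = float(np.clip((pitch + np.pi / 2) / np.pi, 0.0, 1.0))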