Handout for Live Visuals

“Live” Visuals

VJ’ing stems from “Projection Design,” a term chosen by scenic artists to refer to designers who worked with film and video projectors in theatre. The title has since grown beyond the technical. It now includes LED fixtures, TV monitors, computer screens, interactive visual instruments, Pepper’s Ghosts and Eyeliner foils to create holographic images, and so on. Our field now generally involves putting an image on the stage that is ephemeral and changeable.

The input and output of your images will have a great influence on content, context, aesthetics, and meaning. Working this way is also a great opportunity to play with “practical” effects, letting the materiality of the “screen” and/or the inherent aesthetics of the input device help further define your art.

Lesson 1: Input To Output

In this lesson we will explore the three most basic parts of our VJ toolkit – inputs, filters and outputs.

The inputs, or sources, are the materials that contain the content of our work, whether they are pre-rendered files, live feeds, or real-time computer-generated visuals.

This imagery is then processed by “filters,” also commonly known as effects or FX, which modify the stream of images to change their aesthetics and the overall feeling associated with them.

Finally, the result is sent somewhere it can be viewed: output to a screen or projector, recorded as a movie file on a hard drive, or streamed to the Internet.
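
To make this chain concrete, here is a minimal sketch in Python using OpenCV (one option among many; any environment that can capture a camera and manipulate frames would work). It wires a live input through a single filter to an on-screen output; the camera index 0 and the filter thresholds are assumptions you would adjust for your setup.

```python
# Minimal input -> filter -> output chain, sketched with OpenCV.
# Assumes Python with opencv-python installed; camera index 0 is a guess.
import cv2

cap = cv2.VideoCapture(0)               # input: live webcam feed
if not cap.isOpened():
    raise RuntimeError("No camera found on index 0")

while True:
    ok, frame = cap.read()              # pull one frame from the source
    if not ok:
        break
    # filter/FX stage: a simple edge-detect "look" applied per frame
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 80, 160)
    # output stage: display the processed stream in a window
    cv2.imshow("output", edges)
    if cv2.waitKey(1) & 0xFF == ord("q"):   # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```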

One of the most powerful aspects of working with “live” video is the ability to experiment with the addition of live-camera feeds into your project. Consider how the live presence of your face/body influences the meaning of your imagery. You are now part of the subject matter, so consider your relationship to the pre-existing forms.

Videos and Slides

Lesson Overview

What are visuals? A historical context.

Inputs / Sources

  • Pre-rendered files
  • Interactive generators
  • Live feeds

FX and Composition

  • Adding filters between input and output
  • Adding layers
  • Adjusting layer opacity and blend modes

Outputs

  • Fullscreen output
  • Recording movies
    • Capture to h.264 for uploads to YouTube, Vimeo
    • Capture to PhotoJPEG / HAP for live remixing
  • Perspective correction (see the sketch after this list)
  • Using projectors / displays
  • System Preferences: Displays
  • Color calibration
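
Perspective correction is, at its core, a four-corner warp: each corner of the rendered frame is pinned to where it should land on the physical surface. Below is a minimal sketch of that mapping using OpenCV’s homography functions; the file name and the destination corner coordinates are placeholders you would adjust by eye while looking at the projection.

```python
# Four-corner perspective correction (keystone), sketched with OpenCV.
# The destination corners below are placeholders; in practice you would
# nudge them until the projected image sits square on the surface.
import cv2
import numpy as np

frame = cv2.imread("test_card.png")     # hypothetical source image
h, w = frame.shape[:2]

src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])              # frame corners
dst = np.float32([[40, 20], [w - 10, 0], [w, h], [0, h - 30]])  # surface corners

M = cv2.getPerspectiveTransform(src, dst)       # 3x3 homography
warped = cv2.warpPerspective(frame, M, (w, h))  # pre-distorted output frame

cv2.imwrite("corrected.png", warped)
```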

Lecture Notes

  • What are visuals? A historical context.
    • Theater
    • Light projections
    • Film
    • Video
  • What is video?
    • What is a video signal?
      • How is video different from film?
      • Analog (e.g. NTSC, PAL) vs. Digital
    • How is video transmitted and stored?
      • Types of cables
      • Physical media and digital files
  • What is a media codec?
    • How do image / video codecs work? (see the sketch after this list)
    • What image / video codecs are commonly used today?
  • What is videoinstrumentalism?
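
A quick way to build intuition for lossy codecs is to re-encode the same frame at several quality settings and compare the resulting sizes. The sketch below does this with JPEG via OpenCV (an illustrative stand-in; “frame.png” is a hypothetical input file):

```python
# Lossy compression in one frame: re-encode the same image as JPEG at
# several quality settings and compare how many bytes each one needs.
import cv2

frame = cv2.imread("frame.png")     # hypothetical input frame

for quality in (95, 75, 40, 10):
    ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, quality])
    if ok:
        print(f"quality {quality:3d}: {len(buf)} bytes")

# Video codecs like h.264 go further by also compressing *between* frames,
# which makes files small but costly to scrub live; intraframe codecs
# (PhotoJPEG, HAP) decode every frame independently, which suits remixing.
```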

Resources

Homework

  • Record 5 clips using a live web-cam with different sets of FX applied.
  • Connect to an external display (such as a projector, TV or monitor) using an HDMI, DVI, VGA or similar connection.

Lesson 2: Responsiveness

VJ’ing is a cybernetic art form; we are essentially creating a visual instrument. The goal of many visual performers is to have a close and immediate interface with computers, to make them expressive: to use these machines to emote, thereby making the computer’s presence invisible.

For live performance, particularly for live music, the element of improvisation and responsiveness matches the energy and ephemeral quality of the performance in a way that pre-rendered and cued/time coded imagery cannot.

In addition, the imagery and its delivery systems (playback software, MIDI controllers, analog mixers, and so on) can be refined and tweaked over time, similar to the way music may evolve during rehearsals on a tour, forging a symbiotic relationship between the musicians and the visualist.

The presence of real-time effects and audio-responsive imagery increases the synaesthetic relationship between image and sound, thereby creating a more “live” experience for the audience.

This week, we will explore interactive concepts that extend the moving image beyond the timeline into real-time interactive expression, using data mappings from physical interfaces such as keyboards, MIDI and OSC controllers, and DMX lighting boards.
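
As a taste of what a “data mapping” looks like in code, the sketch below sends one normalized control value over OSC using the python-osc package. The address “/layer1/opacity” and port 8000 are placeholders; the real values come from your VJ software’s OSC input settings.

```python
# Sending a control value over OSC with python-osc (pip install python-osc).
# Address and port are placeholders; check your software's OSC input settings.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 8000)   # host and port of the listener

# Map a fader-style value (already normalized to 0.0-1.0) onto a parameter.
client.send_message("/layer1/opacity", 0.75)
```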

Videos and Slides

Lesson Overview

  • What is responsiveness?
  • What is a cybernetic artform?
  • What are physical and sensory inputs?
  • What are MIDI, DMX and OSC? In what ways are they different?

  • Using MIDI and OSC instruments / controllers
  • Using the Control Surface plugin to create a virtual instrument

  • Add audio reactivity to FX and layer parameters (see the sketch after this list)
  • Add MIDI control to FX and layer parameters
  • Add OSC control to FX and layer parameters
  • Media bin setup
    • Keyboard / MIDI / OSC triggers
    • Enabling media preloading
  • Control surface plugin
    • Adding UI items
    • Custom layouts
    • Control from webpage
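
To illustrate the audio reactivity mentioned above, this sketch measures the loudness (RMS) of the microphone input and prints a normalized 0–1 value that could be mapped onto any FX or layer parameter. It assumes the sounddevice and numpy packages; the gain constant is an arbitrary scaling you would tune by ear.

```python
# Audio-reactive control value: RMS loudness of the mic, normalized to 0-1.
# Assumes the sounddevice and numpy packages; GAIN is an arbitrary scaling
# you would tune for your input level.
import numpy as np
import sounddevice as sd

GAIN = 8.0

def callback(indata, frames, time, status):
    rms = float(np.sqrt(np.mean(indata ** 2)))   # loudness of this block
    level = min(rms * GAIN, 1.0)                 # clamp to parameter range
    print(f"level: {level:0.2f}")                # send this to a parameter

with sd.InputStream(channels=1, samplerate=44100, callback=callback):
    sd.sleep(10_000)   # listen for ten seconds
```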

Lecture Notes

  • What is responsiveness?
  • What is a cybernetic artform?
  • What physical interfaces are used to “perform” with computers / machines?
  • What are MIDI, DMX and OSC? In what ways are they different?
    • What is MIDI?
    • What is DMX?
    • What is OSC?
    • A comparison of MIDI / DMX / OSC (see the MIDI sketch after this list)
    • Choosing a MIDI / DMX / OSC controller
  • What are sensory inputs?
    • Sound
    • Visual
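
To show the shape of MIDI data in practice, the sketch below uses the mido package to listen to the first available MIDI input and rescale control-change values from MIDI’s 0–127 range into a 0.0–1.0 parameter range. Choosing the first port and mapping CC 1 (often a mod wheel) are assumptions for illustration.

```python
# Reading MIDI control-change messages with mido
# (pip install mido python-rtmidi) and rescaling 0-127 to 0.0-1.0.
import mido

port_name = mido.get_input_names()[0]   # assume the first port is the controller

with mido.open_input(port_name) as port:
    for msg in port:
        if msg.type == "control_change" and msg.control == 1:  # e.g. mod wheel
            value = msg.value / 127.0   # MIDI 0-127 -> parameter 0.0-1.0
            print(f"CC1 -> {value:0.2f}")
```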

Homework

  • Record 5 short “gestures” as a single movie file, using any single source type (e.g. a live web-cam), with different sets of FX applied, using audio analysis, MIDI or OSC to control the parameters. Each gesture should be roughly 4–16 seconds long, with a short pause (“rest”) between sections.