“Live” Visuals / Lesson 2 / Responsiveness
Lesson Overview
VJ’ing is a cybernetic art form; we are essentially creating a visual instrument. Many visual performers aim for a close and immediate interface with the computer, one expressive enough to emote through, so that the machine’s presence becomes invisible.
For live performance, particularly live music, improvisation and responsiveness match the energy and ephemeral quality of the event in a way that pre-rendered, cued, or time-coded imagery cannot.
In addition, the imagery and its delivery systems (playback software, MIDI controllers, analog mixers, and so on) can be refined and tweaked over time, much as music evolves during rehearsals on a tour, forging a symbiotic relationship between the musicians and the visualist.
The presence of real-time effects and audio-responsive imagery deepens the synaesthetic relationship between image and sound, creating a more “live” experience for the audience.
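As a concrete illustration of audio reactivity, here is a minimal sketch that reads microphone input, computes a smoothed loudness envelope, and prints a 0.0–1.0 control value of the kind you might bind to an FX parameter. The sounddevice library and the gain and smoothing constants are assumptions made for this sketch, not part of the lesson materials:

```python
import numpy as np
import sounddevice as sd

level = 0.0          # smoothed control value, 0.0-1.0
SMOOTHING = 0.8      # higher = slower response (assumed constant)
GAIN = 4.0           # scales raw RMS into a useful range (assumed constant)

def callback(indata, frames, time, status):
    global level
    rms = float(np.sqrt(np.mean(indata ** 2)))   # loudness of this audio block
    target = min(rms * GAIN, 1.0)                # clamp into 0.0-1.0
    level = SMOOTHING * level + (1 - SMOOTHING) * target
    print(f"audio level -> {level:.3f}")

# Open the default microphone and report levels for ten seconds.
with sd.InputStream(channels=1, blocksize=1024, callback=callback):
    sd.sleep(10_000)
```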
This week, we will explore interactive concepts that extend the moving image beyond the timeline into real-time expression, using data mappings from physical interfaces such as keyboards, MIDI controllers, OSC devices, and DMX lighting boards.
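A data mapping from a physical interface can be as simple as listening for messages and normalizing their values for a layer parameter. The sketch below receives OSC using the python-osc library; the address "/1/fader1" (a TouchOSC-style address) and port 8000 are assumptions you would match to your own controller:

```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_fader(address, value):
    # TouchOSC faders already send 0.0-1.0 floats; clamp just in case.
    param = max(0.0, min(float(value), 1.0))
    print(f"{address} -> layer opacity {param:.3f}")

dispatcher = Dispatcher()
dispatcher.map("/1/fader1", on_fader)   # assumed controller address

# Listen for incoming OSC on port 8000 (assumed); Ctrl-C to stop.
server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
server.serve_forever()
```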
Videos and Slides
- Responsiveness: History and basic concepts.
(Responsiveness Lecture Video)
- Responsiveness: VDMX Demonstration
(Responsiveness Demonstration Video)
Lecture Notes
(Responsiveness Lecture Slides)
Reference Links
- Responsiveness Sample project
- BlackHole (BlackHole replaces Soundflower; also note that VDMX6 can receive audio streams directly from other applications)
- Facetracker / FaceOSC (Note that VDMX6 now has built-in face and hand tracking)
- TouchOSC
- MIDI Controllers discussion thread on forums
Related Tutorials and Case Studies
- Creating slider presets
- 4 Layer Korg nanoKontrol Template
- Enabling MIDI and OSC Echo Mode
- Setting up MIDI Bin sync with the APC20
- Using game controllers in VDMX
- The ECLECTIC METHOD REMIX, Part Two - Jamming
- PZYK SKAN – EEG Controlled Sound and Visuals
- FaceOSC
- Making custom Face Tracking FX in Quartz Composer
Homework
Assignments
- Record 5 short “gestures” as a single movie file, using any single source type (e.g. a live webcam) with different sets of FX applied, using audio analysis, MIDI, or OSC to control the parameters. Each gesture should be roughly 4-16 seconds long, with a short pause (“rest”) between sections. (One way to script an OSC gesture is sketched below.)
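If you want to drive a parameter programmatically rather than from a hardware controller, a gesture can be scripted as a stream of OSC messages. This minimal sketch (using python-osc) sweeps a value from 0.0 to 1.0; the target port 1234 and the address "/gesture/blur" are assumptions, so set them to match the OSC input settings of your playback software (e.g. VDMX) and bind the address to an FX parameter there:

```python
import time
from pythonosc.udp_client import SimpleUDPClient

# Assumed target: an app listening for OSC on localhost port 1234.
client = SimpleUDPClient("127.0.0.1", 1234)

# Sweep the value 0.0 -> 1.0 over roughly four seconds.
for step in range(101):
    client.send_message("/gesture/blur", step / 100.0)  # assumed address
    time.sleep(0.04)
```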
Review Questions
- What are sensory inputs?
  - Sound
  - Visual
  - Smell
  - Touch
  - Taste
- What is responsiveness?
- What are physical and sensory inputs?
- What physical interfaces are used to “perform” with computers / machines?
- What is a cybernetic artform?
- What are MIDI, DMX and OSC? In what ways are they different? (See the sketch following these questions.)
- Choosing a MIDI / DMX / OSC controller
- What are some examples of control data?
- Using MIDI and OSC instruments / controllers
- Using the Control Surface plugin to create a virtual instrument
- Adding audio reactivity to FX and layer parameters
- Adding MIDI control to FX and layer parameters
- Adding OSC control to FX and layer parameters
- Keyboard / MIDI / OSC media triggers
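To make the MIDI / DMX / OSC comparison concrete, the sketch below shows how the same “fader at 50%” is expressed in each protocol’s native value range. The ranges are standard; the channel numbers and OSC address are illustrative assumptions:

```python
fader = 0.5  # the performer's fader at 50%

midi_cc = round(fader * 127)   # MIDI CC: 7-bit integer, 0-127
dmx_lvl = round(fader * 255)   # DMX512: 8-bit integer per channel, 0-255
osc_val = fader                # OSC: typed data, e.g. a 32-bit float

print(f"MIDI control change -> {midi_cc}")   # sent on one of 16 channels
print(f"DMX channel level   -> {dmx_lvl}")   # one of 512 channels per universe
print(f"OSC /layer/opacity  -> {osc_val}")   # human-readable address path
```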