This project is a collaboration between artists Jeffrey Shaw and Hector Rodriguez, with coding by Sam Chan and technical advice by Mike Wong. It premiered at WYSIWYG, the Jeffrey Shaw retrospective exhibition at the Osage Gallery, Hong Kong.

The work uses unsupervised machine learning to revisit the animated film Continuous Sound and Image Moments, made in 1966 by Jeffrey Shaw, Willem Breuker, and Tjebbe van Tijen. The original film consisted of a sequence of hand drawings, each shown for only a few seconds. In Latent Embeddings, digitized versions of those drawings are processed by a Vector Quantized Variational Autoencoder (VQ-VAE), an algorithm that uses variational inference to construct a generative model. New images are then produced by exploring the latent space of this model. These images resemble the hand-drawn originals to varying degrees: the artists control the extent of resemblance by constraining the algorithm either to remain close to the training data or to venture into more remote regions of the latent space, fluctuating between recognition and surprise, between memory and discovery.
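The exploration described above can be sketched, in highly simplified form, as interpolating between latent codes and perturbing them away from the training data. This is a toy illustration only: the vectors below are random stand-ins for real VQ-VAE encodings (an actual VQ-VAE latent space is a grid of discrete codebook indices, not a continuous vector), and the `explore` function with its `drift` parameter is a hypothetical device, not the artists' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the latent codes of two digitized drawings.
# In the actual work these would come from the trained VQ-VAE encoder.
z_drawing_a = rng.normal(size=64)
z_drawing_b = rng.normal(size=64)

def explore(z_a, z_b, alpha, drift=0.0, rng=rng):
    """Blend two latent codes and optionally drift away from both.

    alpha -- interpolation weight between the two source drawings
    drift -- 0 keeps the result near the training data; larger values
             venture into more remote regions of the latent space
    """
    z = (1 - alpha) * z_a + alpha * z_b
    return z + drift * rng.normal(size=z.shape)

# Close to the originals: a recognizable blend of the two drawings.
z_near = explore(z_drawing_a, z_drawing_b, alpha=0.5, drift=0.0)

# Further afield: a code likely to decode into a more surprising image.
z_far = explore(z_drawing_a, z_drawing_b, alpha=0.5, drift=2.0)
```

Decoding `z_near` and `z_far` through the generative model would then yield the images that fluctuate between memory and discovery.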
Shown at: Art Machines Past and Present, curated by Richard Allen and Jeffrey Shaw, Indra and Harry Banga Gallery, City University of Hong Kong, 24/11/2020 – 2/5/2021
https://www.hkact.hk/act9
EXHIBITION RECORD
2020/11/24 - 2021/05/02 : Art Machines Past/Present, Indra and Harry Banga Gallery, City University of Hong Kong, Hong Kong, China
2019/11/12 - 2020/06/21 : HKACT!, Osage Hong Kong, Hong Kong, China