360 Ings
Video performance, 360 machine-learning-enhanced rotoscoping, Python
Carnegie Mellon University
2020
In March of 2020, a stay-at-home order was enforced in Allegheny County. Though I was residing in Pittsburgh at the time, my country was neither the United States nor Spain. My apartment circumscribed a temporary nation. What used to happen outside my walls was, in short, happening virtually all over my place. My house became the control room for the production, consumption, and biopolitics of my life, “the heart of virtual consumption and telecommuting” (Paul B. Preciado, 2020).
Returning home indefinitely produces an unsettling sense of oddness. For some, this felt identical to the surprise of the man in Chantal Akerman’s Le déménagement, who stands bewildered by the asymmetry of his apartment. For others, it was a perpetual suite of small happenings, similar to the piano pieces that the composer Henry Cowell titled Nine Ings in the early 20th century, all gerunds:
floating, frisking, fleeting, scooting, wafting, seething, whisking, sneaking, swaying.
In 360 Ings, those small gestures and their variations are surveilled in the 360 degrees where they happen, then isolated, cropped, and overlapped automatically to portray the strangeness of the continuous and anxious patterns of the self fading in one room.
This video performance was recorded with a 3.5K 360° camera at 15 fps, segmented and rotoscoped using CUDA 10.0, PyTorch, and TensorFlow with a deep neural network that predicts a high-quality matte from a single frame, and artisanally assembled into an interactive spherical composite.
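For readers curious about the rotoscoping stage, the sketch below is a minimal, hypothetical stand-in rather than the production pipeline: it assumes frames exported from the 360° recording as equirectangular images, uses torchvision's DeepLabV3 person segmentation in place of the single-frame matting network used in the piece, and alpha-composites a few isolated gestures onto one spherical frame. The file names and the person-class index are illustrative assumptions.

```python
# Sketch only: per-frame matte prediction and equirectangular compositing.
# Assumptions: torchvision DeepLabV3 stands in for the matting network;
# frames are pre-extracted equirectangular PNGs with hypothetical names.
import cv2
import numpy as np
import torch
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet50

device = "cuda" if torch.cuda.is_available() else "cpu"
# Newer torchvision prefers weights="DEFAULT"; pretrained=True still works.
model = deeplabv3_resnet50(pretrained=True).eval().to(device)
PERSON_CLASS = 15  # Pascal VOC "person" index used by this model

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def predict_matte(frame_bgr: np.ndarray) -> np.ndarray:
    """Return a soft 0..1 matte for the figure in a single frame."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    batch = preprocess(rgb).unsqueeze(0).to(device)
    with torch.no_grad():
        logits = model(batch)["out"][0]            # (classes, H, W)
    probs = torch.softmax(logits, dim=0)[PERSON_CLASS]
    return probs.cpu().numpy().astype(np.float32)

def composite(base: np.ndarray, layer: np.ndarray, matte: np.ndarray) -> np.ndarray:
    """Alpha-blend an isolated gesture layer over the base equirectangular frame."""
    alpha = matte[..., None]                       # (H, W, 1)
    return (alpha * layer + (1.0 - alpha) * base).astype(np.uint8)

# Overlap several rotoscoped gestures onto one spherical frame.
base = cv2.imread("frames/empty_room.png")         # hypothetical paths
out = base.copy()
for path in ["frames/f0120.png", "frames/f0480.png", "frames/f0900.png"]:
    frame = cv2.imread(path)
    out = composite(out, frame, predict_matte(frame))
cv2.imwrite("composite_equirect.png", out)
```

The sketch only illustrates the predict-then-composite loop per frame; in the piece itself the mattes came from a dedicated single-frame matting network and the layers were assembled by hand into the interactive spherical composite.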
– Awarded the Frank-Ratchye STUDIO for Creative Inquiry's Residency-In-Your-Room Fellowship.
– Showcased in The Piper, CMU's community news, as part of the course Experimental Capture, led by Golan Levin and Nica Ross in 2020.