I wanted to make a machine learning installation that felt accessible to all backgrounds and friendly to all ages. The Butterfly Room is an interactive projection installation in which hand-drawn butterflies are animated to fly around the room. At a drawing station, visitors draw their own butterfly and photograph it with a camera. Computer vision reads in the image and animates it, making the butterfly flap around. Swarms of butterflies accumulate on the projection, interacting with each other in a display of automated technology and organic forms. The work was a way to prioritize organic, hand-drawn imagery in the context of advanced ML, and to challenge the prevailing aesthetics of ML art, which often feel austere and mechanical.
In the program, the image is first fed into a neural network trained to recognize its orientation (the rotation of the butterfly). I trained this image classifier on a dataset I created by rotating stock drawings of butterflies. Next, the image is passed to a contour-finding step that identifies its perspective (front view versus side view) and its number of wings (either 2 or 4). I then manipulate the wing contours to simulate flapping, offsetting the timing of the top and bottom wings for 4-wing drawings.
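The flapping step can be sketched roughly like this. This is a hypothetical simplification of the approach described above, not the installation's actual code: the function names (`wing_scale`, `flap_contour`), the specific scale range, and the quarter-cycle phase offset between top and bottom wings are all my assumptions for illustration.

```python
import math

def wing_scale(t, flap_hz=2.0, phase=0.0):
    """Vertical scale factor for a wing contour at time t.

    Oscillates between 0.2 (wings folded) and 1.0 (wings flat),
    faking a flap by squashing the contour toward the body axis.
    All constants here are illustrative, not from the installation.
    """
    s = 0.5 * (1.0 + math.sin(2.0 * math.pi * flap_hz * t + phase))
    return 0.2 + 0.8 * s

def flap_contour(points, t, n_wings=4):
    """Squash wing contour points vertically to simulate flapping.

    `points` are (x, y) pairs relative to the butterfly's body
    axis (y = 0). For a 4-wing drawing, the top wings (y > 0)
    lead the bottom wings (y < 0) by a quarter-cycle offset,
    giving the varied timing described above.
    """
    top = wing_scale(t, phase=0.0)
    bottom = wing_scale(t, phase=-math.pi / 2) if n_wings == 4 else top
    return [(x, y * (top if y > 0 else bottom)) for x, y in points]
```

Rendering each frame then amounts to calling `flap_contour` with the current time and redrawing the warped contour at the butterfly's position on the projection.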
While working on this project, I also experimented with alternative ways to generate AI-based animations using latent-space interpolation; a post about that research can be read here:
Shown at The Midway (San Francisco).