The BUTTERFLY ROOM is an interactive projection installation in which hand-drawn butterflies are animated to fly around the room. The setup includes a camera: visitors draw their own butterfly, then take a picture of it. The project uses computer vision to read in the image and animate it, allowing the butterfly to flap around. Swarms of butterflies accumulate on the projection, interacting with each other in a display of automated technology and organic forms.
This documentation describes the iteration of the project developed in August 2016; a new iteration, currently underway, will incorporate more complex animation and a more developed visual design.
The original implementation allowed users to draw a butterfly digitally using a paint program. A sample of the freely-drawn butterfly images in the dataset is shown here:
The image outline is identified using edge detection via OpenCV. The features of the butterfly are identified using a neural network classifier together with a heuristic that calculates the image's greatest axis of symmetry (the central axis between the two wings). This allows us to divide the image into the butterfly's two wings and generate the following animations:
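The symmetry-axis heuristic can be sketched with a brute-force search: mirror the silhouette around each candidate vertical axis and keep the axis with the fewest disagreeing pixels. This is a minimal illustration assuming the drawing has already been reduced to a binary silhouette; the function name `find_symmetry_axis` is hypothetical, not the project's actual code.

```python
def find_symmetry_axis(grid):
    """Return the column index whose mirror plane best splits the binary
    silhouette into mirror-image halves (illustrative sketch only)."""
    height, width = len(grid), len(grid[0])
    best_axis, best_score = 0, float("inf")
    for axis in range(1, width):       # mirror plane between columns axis-1 and axis
        half = min(axis, width - axis)  # overlap width on each side of the plane
        mismatches = 0
        for row in grid:
            for d in range(half):
                if row[axis - 1 - d] != row[axis + d]:
                    mismatches += 1
        # normalize so narrow overlaps are not unfairly favored
        score = mismatches / (half * height)
        if score < best_score:
            best_axis, best_score = axis, score
    return best_axis

# A toy silhouette: two triangular "wings", symmetric about the plane
# between columns 2 and 3.
wings = [
    [1, 0, 0, 0, 0, 1],
    [1, 1, 0, 0, 1, 1],
    [1, 1, 1, 1, 1, 1],
]
print(find_symmetry_axis(wings))  # → 3
```

Once the axis is found, everything left of it is one wing and everything right of it is the other, which is what the animation step operates on.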
Side views or other rotations of the butterfly can be created by overlapping individual wing segments.
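A flap cycle can be approximated by squashing both wings toward the symmetry axis as a function of a phase angle, which is also how foreshortened side views fall out of the same transform. The sketch below assumes a binary grid and a known axis column; `flap_frame` is an illustrative helper, not the installation's renderer.

```python
import math

def flap_frame(grid, axis, phase):
    """Render one flap frame by scaling wing pixels toward the symmetry
    axis; |cos(phase)| = 1 means fully spread wings (sketch only)."""
    height, width = len(grid), len(grid[0])
    scale = abs(math.cos(phase))
    frame = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if grid[y][x]:
                # move each wing pixel toward the axis by the scale factor
                nx = axis + round((x - axis) * scale)
                if 0 <= nx < width:
                    frame[y][nx] = 1
    return frame

wings = [
    [1, 0, 0, 0, 0, 1],
    [1, 1, 0, 0, 1, 1],
    [1, 1, 1, 1, 1, 1],
]
spread = flap_frame(wings, 3, 0.0)          # phase 0: wings fully spread
folded = flap_frame(wings, 3, math.pi / 2)  # phase pi/2: wings folded onto the axis
```

Cycling `phase` over successive frames produces the flapping motion; holding it at an intermediate value gives a partially rotated view.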
The generated animations were clustered onto a projection in real time:
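The accumulating swarm can be sketched as a list of sprites that grows with each captured drawing and drifts every frame. The `Butterfly` class, the 1280x720 projection size, and the drift amounts below are all illustrative assumptions, not the installation's actual API.

```python
import random

class Butterfly:
    """One projected butterfly sprite (illustrative sketch)."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def step(self, width, height):
        # random drift each frame, wrapped at the projection edges
        self.x = (self.x + random.uniform(-5, 5)) % width
        self.y = (self.y + random.uniform(-5, 5)) % height

swarm = []
for _ in range(3):   # three visitors submit drawings
    swarm.append(Butterfly(random.uniform(0, 1280), random.uniform(0, 720)))
for _ in range(60):  # one second of frames at 60 fps
    for b in swarm:
        b.step(1280, 720)
```

New butterflies are appended as they are captured, so the swarm only grows over the course of an evening, which is what produces the accumulating effect on the projection.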
The initial implementation was performed in black and white, although I have experimented with techniques for automatic color generation. With edge detection, pen-and-ink drawings may also be colored and then animated. Here are sample images that have been automatically colored, using either (left) random colors via Python or (right) the digital application Paints Chainer (which uses neural networks to generate a watercolor effect):
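The random-color approach can be sketched as a flood fill: each enclosed white region of the line drawing gets its own randomly chosen RGB color. This is a minimal pure-Python illustration (not Paints Chainer, and not the project's exact code), where 1 marks ink outline and 0 marks uncolored white.

```python
import random

def color_regions(grid):
    """Flood-fill each enclosed white region of a line drawing with a
    random RGB color (illustrative sketch of random colorization)."""
    height, width = len(grid), len(grid[0])
    colors = [[None] * width for _ in range(height)]
    for sy in range(height):
        for sx in range(width):
            if grid[sy][sx] == 0 and colors[sy][sx] is None:
                fill = tuple(random.randrange(256) for _ in range(3))
                stack = [(sy, sx)]
                while stack:  # iterative 4-connected flood fill
                    y, x = stack.pop()
                    if (0 <= y < height and 0 <= x < width
                            and grid[y][x] == 0 and colors[y][x] is None):
                        colors[y][x] = fill
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return colors

# A tiny outline with two separate white regions; each region is filled
# with its own independently chosen color.
drawing = [
    [1, 1, 1, 1, 1],
    [1, 0, 1, 0, 1],
    [1, 1, 1, 1, 1],
]
filled = color_regions(drawing)
```

Because each region's color is drawn independently, the two wing interiors of a drawing usually end up with different colors, giving the patchwork look of the randomly colored samples.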