( See videos of the machine in action, as well as photos of more paintings at http://bengrosser.com/projects/interactive-robotic-painting-machine/ )
Our everyday interactions are increasingly mediated by technology, be it mobile phones, chat systems, or social networking sites. These systems are designed to anticipate and support our needs and desires while facilitating those interactions. As these systems grow in complexity, or in intelligence, how does that intelligence change what passes through them? Further, how might that intelligence evolve to make work of its own, for its own needs?
This last question served as the launching point for my Interactive Robotic Painting Machine. Does an art-making machine of my design make work for me or for itself? How does machine vision differ from human vision, and is that difference visible in the machine's output? Is my own consciousness reinforced by the system, or does it become lost within it? In other words, is this machine alive, with agency as yet another piece of the technium, or is it our own anthropomorphization of the system that makes us think about it in these ways?
What I’ve built to consider these questions is an interactive robotic painting machine that uses artificial intelligence to paint its own body of work and to make its own decisions. While doing so, it listens to its environment and treats what it hears as input to the painting process. In the absence of someone or something else making sound in its presence, the machine, like many artists, listens to itself. But when it does hear others, it changes what it does, just as we are subtly (or not so subtly) influenced by what others tell us.