Precisely Imprecise // Hyperreal Material Aesthetics  

An award-winning entry in the RobotArt.org Mechanical Brush Competition of 2017  

Team Leader: Austin Samson (WIT ’12)  

With: Stefan Burnett, Lindsay Dumont, and Peter Comeau  

Supported by the Center for Applied Research in the Department of Architecture

The Center for Applied Research (CfAR) in the Department of Architecture at Wentworth Institute of Technology is a student-focused collaborative environment for investigating emerging fabrication technologies and methods. CfAR supports the department’s core principle of thinking through making by providing a dynamic network of spaces for prototyping and applied research. Students have access to equipment, expertise, and guidance across many areas of fabrication, including CNC milling, 3D printing, woodworking, laser cutting, and robotics.  

Precisely Imprecise // Hyperreal Material Aesthetics 

With the introduction of robotic technology into the art and design realm, artists and designers alike can utilize the subtleties of unforeseen material properties as a driver for developing different aesthetic techniques and workflows. This has the potential to create new hyperreal images specific to the translation of discovered material properties into techniques with a high degree of virtuosity. 

Robotic systems are often assumed to be perfect building machines capable of high precision, speed, and repetition. While this is true in many of their usual applications, when a robot is forced to interact with tools or processes that are anything but precise, its ability to be precise begins to fall apart. Take a paintbrush, for example. Many of the properties involved in painting are far from exact: how the bristles disperse when they hit the page can depend on speed, pressure, distance, angle, time, the type and density of the bristles, and numerous other factors. Beyond that, the amount of paint, the type of paint, and the surface it is being applied to all come into play. When you try to account for all of these variables, it becomes very difficult to repeat any one stroke with a high level of precision. 

This is where robotic systems offer a huge advantage. Because the robot can repeat the same movement over and over with such subtlety, many of the variables listed previously are no longer variables but become controlled aesthetic techniques. Repetition is the key term here when exploring the physical properties of the brush. By using a dab painting technique, we can turn the unpredictability of how the bristles disperse when they hit the page into a hyperreal painting technique with a high degree of resolution. 

The robotic system we used consists of a KUKA Agilus KR6 R900 6-axis robotic arm and a custom 3D printed tool holder combined with a standard paintbrush. Rhino3D and Grasshopper3D, along with KUKA|prc (developed by the Association for Robots in Architecture), were used to simulate and develop the code to run the arm. We also used Autodesk Remake to 3D scan students’ heads for the final series of paintings. The custom 3D printed paintbrush holder was modeled in Rhino3D and consists of a few parts: first, the two-piece holder itself; second, the two magnetic disks that allow for a quick disconnect between the tool and the arm. This magnetic disconnect is important for safety, as it allows the tool to give way in a collision rather than transferring force to the arm. It also allows for a quick tool exchange if necessary. 
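To make the dab motion concrete, the sketch below shows how a single dab could be described as a short sequence of target positions (approach, press, retract). In the project these targets were generated in Grasshopper3D and simulated with KUKA|prc; the Python function and its parameter names here are ours, offered only as an illustration of the idea, not the project’s actual definition.

```python
# A minimal sketch: one dab expressed as approach / press / retract waypoints.
# The function name, parameters, and default values are illustrative assumptions.
def dab_targets(x, y, press_depth, safe_height=30.0, paper_z=0.0):
    """Return (x, y, z) waypoints in mm for one dab at location (x, y)."""
    return [
        (x, y, paper_z + safe_height),   # hover above the dab location
        (x, y, paper_z - press_depth),   # press the bristles into the page
        (x, y, paper_z + safe_height),   # lift back to the safe height
    ]

# Example: a 12 mm press at a point 150 mm across and 220 mm up the page.
# waypoints = dab_targets(150.0, 220.0, 12.0)
```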

Different series of images were used to explore the newfound aesthetics of dab painting. Gradients were developed to test the limits of the brush, from the lightest touch to the most extreme pressure that could be applied. Portraits were used to test clarity and to understand how the technique could be used to re-create hyperreal versions of certain images. Alberto Seveso’s Paint Drop photographs were used to test how far we could push the resolution of the technique and how much detail we could really achieve. Some of the images were photoshopped to achieve certain effects. Andy Warhol served as a precedent for using subtlety to explore or perfect a technique by studying the minutiae of repetitive technique and imagery. 

To produce the tool paths for the robot, each image was applied to a sampling grid. The grid can be edited to lower or higher densities to reduce or add resolution (a 10x10 grid samples 100 points, while a 100x100 grid samples 10,000 points). Each point on the grid represents where the brush presses against the page. The amount it presses depends on the grayscale value of the image at that point, white being the hardest press and black the lightest. A multiplication factor determines the overall ratio of lightest press to hardest press, controlling the resolution of each image. For example, a range of 1mm to 10mm presses multiplied by 2 becomes a range of 2mm to 20mm presses, widening the spread between the smallest and largest dabs. An addition factor is then applied to shift where that range begins and ends (offset by 1mm it runs from 3mm to 21mm; offset by 5mm, from 7mm to 25mm).
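The sketch below illustrates that grid-sampling logic, assuming Python with the Pillow imaging library. The function name, parameters, and defaults are our own stand-ins, not the project’s actual Grasshopper definition.

```python
# A minimal sketch of sampling an image on a grid and mapping grayscale
# values to brush press depths. Names and defaults are assumptions.
from PIL import Image

def sample_dab_depths(image_path, cols, rows, multiplier=2.0, offset=1.0,
                      min_press=1.0, max_press=10.0):
    """Sample an image on a cols x rows grid and return (col, row, depth_mm)."""
    img = Image.open(image_path).convert("L")  # grayscale: 0 = black, 255 = white
    w, h = img.size
    dabs = []
    for j in range(rows):
        for i in range(cols):
            # Sample the pixel at the centre of each grid cell.
            px = int((i + 0.5) * w / cols)
            py = int((j + 0.5) * h / rows)
            value = img.getpixel((px, py)) / 255.0
            # White presses hardest, black presses lightest, as described above.
            base = min_press + value * (max_press - min_press)
            depth = base * multiplier + offset   # multiplication and addition factors
            dabs.append((i, j, depth))
    return dabs

# Example: a 100 x 100 grid yields 10,000 dab targets.
# dabs = sample_dab_depths("portrait.jpg", cols=100, rows=100)
```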

The final series of images, however, seeks to break from the grid approach. In this case, we 3D scanned students’ heads using Autodesk Remake. Once the heads were scanned into a 3D model, we contoured the heads into a series of 2D curves. The curves were then divided into points at a set length, and those points were used as the locations where the brush presses against the page. 
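As an illustration of that division step, the sketch below splits a contour, represented as a 2D polyline, into points at a fixed arc-length spacing. The helper is our own approximation of the operation, not the original Grasshopper definition.

```python
# A minimal sketch: divide a polyline (list of (x, y) vertices) into points
# at a fixed spacing along its length. Names and spacing are assumptions.
import math

def divide_polyline(points, spacing):
    """Return points along the polyline at a fixed arc-length spacing."""
    result = [points[0]]
    carried = 0.0  # distance covered since the last emitted point
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if seg == 0.0:
            continue
        d = spacing - carried
        while d <= seg:
            t = d / seg
            result.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            d += spacing
        carried = seg - (d - spacing)
    return result

# Example: dab locations every 5 mm along one head contour.
# dab_points = divide_polyline(contour, spacing=5.0)
```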

The decision to use only black paint was made because it allowed us and the viewers to focus more closely on the material effects of the paint and the brush rather than on color, which would have been an unnecessary distraction in these tests. The application of paint to the brush was left to the user, opening the possibility of disrupting the process at any moment for a more real-time effect. As we ran the script, we began to understand how the amount of paint on the brush affected the look of the painting. As the process moved forward, we could manually interject our own intuition into each painting by deciding when and where to add paint. 

One of the major concerns with robotic systems is the notion that they could replace humans in many tasks, causing fear of job loss or of the loss of human intuition. By strategically leaving certain variables up to real-time user decision making, we hope to overcome those fears by showing that robots will not replace humans but will heighten their skills and abilities, creating new and improved processes and techniques. 

Learn more about this project and the creation of Portrait 02:  https://www.youtube.com/watch?v=4EnGnoEMLBI