Dynamics-Aware Trajectory Generation for Artistic Painting using Diffusion
Gerry Chen, Frank Dellaert, and Seth Hutchinson
2024 RSS Workshop on Generative Modeling meets HRI (GenAI-HRI), in press
Click here for Live Demo!
Draw your own shapes and watch the model modify them!
Note: This live demo may be unavailable depending on the current state of the server. Please reach out if it is not working and you would like me to turn it on.
Click here for the paper PDF
Click here for the poster PDF
Abstract
In this work, we seek to generate robot trajectories for artistic painting that exploit the dynamics unique to a robot embodiment. Denoising Diffusion Probabilistic Models (DDPMs) have been shown to be effective at generating not only images, but also many other continuous signals, including robot trajectories and stroke-based drawing paths. While existing works generating stroke-based art using DDPMs produce computer renderings of drawings, many roboticists and artists have previously identified the value in creating physical artwork with an embodied AI. One notable quality of artwork is the particular character imparted by the medium and tools used. Therefore, we seek to combine artistic stroke generation and dynamics-aware trajectory generation using DDPMs to generate strokes that capture the artistic qualities of both the training data and the robot embodiment. We compare several approaches to extending stroke-generation DDPMs to respect robot dynamics, including alternative parameterizations, training on modified data, classifier guidance, and classifier-free guidance. We qualitatively show that classifier-free guidance most effectively exploits the robot embodiment to generate visually pleasing yet dynamically feasible painting trajectories.
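To make the classifier-free guidance approach concrete, the sketch below shows a generic DDPM sampling loop in which conditional and unconditional noise predictions are blended at each denoising step. This is a minimal illustration, not the paper's implementation: the `model` callable, the trajectory shape, the noise schedule, and the guidance weight `w` are all placeholder assumptions.

```python
import numpy as np

def cfg_epsilon(eps_cond, eps_uncond, w):
    """Classifier-free guidance: extrapolate from the unconditional
    prediction toward the conditional one by guidance weight w."""
    return (1.0 + w) * eps_cond - w * eps_uncond

def ddpm_sample(model, cond, T=50, traj_shape=(32, 2), w=2.0, rng=None):
    """Toy DDPM ancestral sampler with classifier-free guidance.

    model(x, t, cond) -> predicted noise; cond=None gives the
    unconditional prediction. All details here are illustrative.
    """
    rng = np.random.default_rng(rng)
    betas = np.linspace(1e-4, 0.02, T)          # linear noise schedule (assumed)
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)
    x = rng.standard_normal(traj_shape)          # start from pure noise
    for t in reversed(range(T)):
        eps_c = model(x, t, cond)                # conditional prediction
        eps_u = model(x, t, None)                # unconditional prediction
        eps = cfg_epsilon(eps_c, eps_u, w)       # guided noise estimate
        # Standard DDPM posterior mean update using the guided noise.
        coef = betas[t] / np.sqrt(1.0 - alpha_bars[t])
        x = (x - coef * eps) / np.sqrt(alphas[t])
        if t > 0:                                # add noise except at final step
            x += np.sqrt(betas[t]) * rng.standard_normal(traj_shape)
    return x
```

With `w = 0` the sampler reduces to ordinary conditional sampling; larger `w` pushes samples more strongly toward the conditioning signal, which in this setting would trade stroke fidelity against dynamic feasibility.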
Please note: some of these links may be broken; the links on the main publications page should always work.