Learning to Rasterize Differentiable

Chenghao Wu, Zahra Montazeri, Tobias Ritschel

Research output: Working paper › Preprint



Differentiable rasterization changes the common formulation of primitive rasterization -- which has zero gradients almost everywhere, due to discontinuous edges and occlusion -- to an alternative one that is not subject to this limitation and has similar optima. These alternative formulations are, in general, "soft" versions of the original. Unfortunately, it is not clear which exact way of softening performs best, i.e., converges most reliably to a desired goal. Previous work has analyzed and compared several combinations of softening operations. In this work, we take this a step further and, instead of making a combinatorial choice of softening operations, parametrize the continuous space of all softening operations. We study meta-learning a parametric S-shaped curve as well as an MLP over a set of inverse rendering tasks, so that the learned softening generalizes to new and unseen differentiable rendering tasks with optimal softness.
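The core idea of softening can be illustrated with a minimal sketch: the hard, zero-gradient coverage test of a rasterizer (is a pixel inside the primitive?) is replaced by a smooth S-shaped function of the signed distance to the edge. The sketch below uses a fixed sigmoid with a `sharpness` parameter standing in for the softness that the paper proposes to meta-learn; the function names and the particular sigmoid shape are illustrative assumptions, not the authors' method.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def soft_coverage(signed_dist, sharpness=10.0):
    """Soft replacement for the hard coverage test step(d >= 0).

    signed_dist: signed distance of pixel centers to a primitive edge
                 (positive = inside). As sharpness -> infinity, this
                 recovers the hard, non-differentiable rasterizer.
    """
    return sigmoid(sharpness * signed_dist)

# Signed distances of a few pixel centers to an edge (assumed geometry).
d = np.array([-0.5, -0.1, 0.0, 0.1, 0.5])

hard = (d >= 0).astype(float)       # zero gradient almost everywhere
soft = soft_coverage(d)             # smooth, nonzero gradient everywhere
```

A learned softening would replace the fixed sigmoid with a parametric S-curve or a small MLP whose parameters are optimized across many inverse rendering tasks.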
Original language: English
Publication status: Published - 23 Nov 2022


  • cs.GR
  • cs.CV


