Neural Appearance Model for Cloth Rendering

Guan Yu Soh, Zahra Montazeri

Research output: Contribution to journal › Article › peer-review


Abstract

The realistic rendering of woven and knitted fabrics has posed significant challenges for many years. Fiber-based micro-appearance models have previously achieved considerable success in attaining high levels of realism. However, rendering such models remains complex due to the intricate internal scattering among the hundreds of fibers within a yarn, which demands vast amounts of memory and time. In this paper, we introduce a new framework that captures aggregated appearance by tracing many light paths through the underlying fiber geometry. We then employ lightweight neural networks to accurately model the aggregated BSDF, which allows precise modeling of a diverse array of materials while offering substantial improvements in speed and reductions in memory. Furthermore, we introduce a novel importance sampling scheme to further accelerate convergence. We validate the efficacy and versatility of our framework through comparisons with preceding fiber-based shading models as well as the most recent yarn-based model.
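The core idea described above — replacing costly per-fiber scattering with a learned aggregated BSDF — can be illustrated with a minimal sketch. The snippet below is not the paper's actual architecture: the layer sizes, the 6-D direction-pair input, and the softplus output activation are all illustrative assumptions. It merely shows the shape of a lightweight network mapping an (incoming, outgoing) direction pair to a non-negative RGB reflectance.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes):
    """Random weights for a fully connected network with the given layer sizes.

    A trained model would load learned weights here; random initialization
    is used only to make the sketch self-contained.
    """
    return [(rng.standard_normal((m, n)) * np.sqrt(2.0 / m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def softplus(x):
    # Smooth non-negative activation, keeping reflectance values >= 0.
    return np.log1p(np.exp(x))

def eval_bsdf(params, omega_i, omega_o):
    """Evaluate the surrogate aggregated BSDF for unit directions omega_i, omega_o."""
    h = np.concatenate([omega_i, omega_o])  # 6-D input feature
    for W, b in params[:-1]:
        h = np.maximum(h @ W + b, 0.0)      # ReLU hidden layers
    W, b = params[-1]
    return softplus(h @ W + b)              # non-negative RGB reflectance

# Hypothetical tiny network: 6 inputs -> two hidden layers of 32 -> RGB output.
params = init_mlp([6, 32, 32, 3])
wi = np.array([0.0, 0.0, 1.0])   # incoming direction (unit vector)
wo = np.array([0.6, 0.0, 0.8])   # outgoing direction (unit vector)
rgb = eval_bsdf(params, wi, wo)
```

In a renderer, such a network would be queried once per shading point in place of tracing hundreds of fiber-level scattering events, which is the source of the speed and memory savings the abstract claims.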

Original language: English
Article number: e15156
Journal: Computer Graphics Forum
Volume: 43
Issue number: 4
DOIs
Publication status: Published - 24 Jul 2024

Keywords

  • CCS Concepts
    • Computing methodologies → Reflectance modeling

