A Multi-scale Yarn Appearance Model with Fiber Details

Apoorv Khattar, Junqiu Zhu, Emiliano Padovani, Jean-Marie Aubry, Marc Droske, Ling-Qi Yan, Zahra Montazeri

Research output: Working paper › Preprint


Abstract

Rendering realistic cloth has always been a challenge due to its intricate structure. Cloth is made up of fibers, plies, and yarns; previous curve-based models, while detailed, were computationally expensive and inflexible for large pieces of cloth. To address this, we propose a simplified approach. We introduce a geometric aggregation technique that reduces ray-tracing computation by using fewer curves, tracing only yarn curves. Our model generates ply and fiber shapes implicitly, compensating for the lack of explicit geometry with a novel shadowing component. We also present a shading model that simplifies light interactions among fibers by categorizing them into four components, accurately capturing specular and scattered light in both forward and backward directions. To render large cloth efficiently, we propose a multi-scale solution based on pixel coverage. Our yarn shading model outperforms previous methods, achieving rendering speeds 3-5 times faster with less memory in near-field views. Additionally, our multi-scale solution offers a 20% speed-up when the cloth is viewed from a distance.
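The multi-scale selection described in the abstract is driven by pixel coverage, i.e. how many pixels a yarn spans on screen. A minimal sketch of such a criterion, assuming a pinhole camera; the helper names (`yarn_pixel_coverage`, `select_shading_scale`), the two-level switch, and the one-pixel threshold are illustrative assumptions, not details given in the abstract:

```python
import math

def yarn_pixel_coverage(yarn_radius, distance, fov_y, image_height):
    """Approximate on-screen width, in pixels, of a yarn of the given
    world-space radius viewed at `distance` under a pinhole camera with
    vertical field of view `fov_y` (radians)."""
    # World-space height of the view frustum at this distance.
    frustum_height = 2.0 * distance * math.tan(fov_y / 2.0)
    pixels_per_unit = image_height / frustum_height
    # A yarn projects to roughly its diameter times the pixel density.
    return 2.0 * yarn_radius * pixels_per_unit

def select_shading_scale(coverage, threshold=1.0):
    # Hypothetical switch: use a detailed near-field yarn model when a
    # yarn spans more than `threshold` pixels, and an aggregated
    # far-field model otherwise.
    return "near-field" if coverage > threshold else "far-field"
```

For example, a 1 mm-radius yarn one unit from the camera covers a few pixels and would be shaded with the near-field model, while the same yarn ten units away falls below one pixel and would switch to the aggregated model.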
Original language: Undefined
Publication status: Published - 23 Jan 2024

Keywords

  • cs.GR
