Abstract
Neural reflectance models are capable of reproducing the spatially-varying appearance of many real-world materials at different scales. Unfortunately, existing techniques such as NeuMIP have difficulties handling materials with strong shadowing effects or detailed specular highlights. In this paper, we introduce a neural appearance model that offers a new level of accuracy. Central to our model is an inception-based core network structure that captures material appearances at multiple scales using parallel-operating kernels and ensures multi-stage features through specialized convolution layers. Furthermore, we encode the inputs into frequency space, introduce a gradient-based loss, and apply it adaptively according to the progress of the learning phase. We demonstrate the effectiveness of our method using a variety of synthetic and real examples.
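The abstract mentions three concrete ingredients: parallel-operating (inception-style) convolution kernels, a frequency-space encoding of the inputs, and a gradient-based loss. The Python sketch below is an illustrative interpretation only, assuming PyTorch; all module names, channel counts, and the number of frequency bands are placeholders and do not reflect the paper's actual architecture or hyperparameters.

```python
import torch
import torch.nn as nn

def frequency_encode(x, num_bands=6):
    """Encode inputs (e.g. UV coordinates or directions) into frequency space
    using sine/cosine bands, as in common positional encodings.
    `num_bands` is an illustrative choice, not the paper's setting."""
    bands = [x]
    for i in range(num_bands):
        freq = 2.0 ** i
        bands.append(torch.sin(freq * torch.pi * x))
        bands.append(torch.cos(freq * torch.pi * x))
    return torch.cat(bands, dim=-1)

class InceptionBlock(nn.Module):
    """Parallel convolution branches with different kernel sizes, so one block
    can respond to appearance features at multiple scales.
    Channel counts are placeholders."""
    def __init__(self, in_ch, branch_ch=16):
        super().__init__()
        self.branch1 = nn.Conv2d(in_ch, branch_ch, kernel_size=1)
        self.branch3 = nn.Conv2d(in_ch, branch_ch, kernel_size=3, padding=1)
        self.branch5 = nn.Conv2d(in_ch, branch_ch, kernel_size=5, padding=2)
        self.act = nn.ReLU()

    def forward(self, x):
        # Concatenate the parallel branches along the channel dimension.
        return self.act(torch.cat(
            [self.branch1(x), self.branch3(x), self.branch5(x)], dim=1))

def gradient_loss(pred, target):
    """One possible gradient-based image loss: penalize differences between
    finite-difference gradients of prediction and target, which emphasizes
    sharp features such as shadow edges and specular highlights."""
    dx = (pred[..., :, 1:] - pred[..., :, :-1]) - (target[..., :, 1:] - target[..., :, :-1])
    dy = (pred[..., 1:, :] - pred[..., :-1, :]) - (target[..., 1:, :] - target[..., :-1, :])
    return dx.abs().mean() + dy.abs().mean()
```

In practice such a gradient term could be blended with a standard pixel loss using a weight that changes over training, which is one plausible reading of "adaptive to the progress of the learning phase"; the exact schedule used in the paper is not given here.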
| Original language | English |
|---|---|
| Article number | e15116 |
| Journal | Computer Graphics Forum |
| Volume | 43 |
| Issue number | 6 |
| Early online date | 15 May 2024 |
| DOIs | |
| Publication status | Published - 24 Sept 2024 |
Keywords
- BTF
- appearance modelling
- multiresolution
- neural networks
- neural rendering