GGX-GAN: A Generative Adversarial Network for Single-Image Material Appearance Editing with Physical Parameters

Shengyao Wang, Hongsong Li*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Material appearance editing methods based on generative adversarial networks (GANs) use perceptual parameters to control material appearance but are limited by the rating quality of their training data: data annotators are often confused by ambiguous perceptual parameters, leading to poor labeling or rating results. Instead, we choose physical parameters to achieve more interpretable and measurable material appearance editing. An image dataset rendered by a physically based renderer was used for GAN training and model validation, and the GGX BRDF model was chosen to control the glossiness of the rendered objects. We then embedded the BRDF model into the latent space of the GAN and established a physical-parameter control space for editing, allowing continuous material appearance tuning. Inverse rendering results of the images generated by the proposed GAN were analyzed to illustrate and evaluate its editing performance. It is shown that the proposed GAN provides accurate, intuitive material appearance editing on both real-world photographs and rendered images.
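For context, the following is a minimal Python sketch of the standard GGX (Trowbridge-Reitz) normal distribution function, whose roughness parameter is the kind of physical, measurable quantity the abstract refers to; the function name and the sample values are illustrative and are not taken from the paper itself.

import math

def ggx_ndf(n_dot_h, alpha):
    # GGX / Trowbridge-Reitz normal distribution function D(h).
    # n_dot_h: cosine between the surface normal and the half vector.
    # alpha:   roughness parameter; smaller values give glossier, sharper highlights.
    a2 = alpha * alpha
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# Sweeping alpha is the kind of continuous, physically meaningful
# glossiness edit described in the abstract (values here are illustrative).
for alpha in (0.05, 0.2, 0.5):
    print(f"alpha={alpha}: D={ggx_ndf(0.99, alpha):.3f}")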

Original language: English
Host publication title: Eighth International Conference on Computer Graphics and Virtuality, ICCGV 2025
Editor: Haiquan Zhao
Publisher: SPIE
ISBN (electronic): 9781510689213
DOI
Publication status: Published - 2025
Externally published: Yes
Event: 8th International Conference on Computer Graphics and Virtuality, ICCGV 2025 - Chengdu, China
Duration: 21 Feb 2025 → 23 Feb 2025

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 13557
ISSN (print): 0277-786X
ISSN (electronic): 1996-756X

Conference

Conference: 8th International Conference on Computer Graphics and Virtuality, ICCGV 2025
Country/Territory: China
City: Chengdu
Period: 21/02/25 → 23/02/25
