Please use this identifier to cite or link to this item:
https://elib.vku.udn.vn/handle/123456789/2729
Title: | Few-Shots Novel Space-Time View Synthesis from Consecutive Photos |
Authors: | Mai, Van Quan; Nguyen, Duc Dung |
Keywords: | NeRF; View synthesis; Few-shot view reconstruction |
Issue Date: | Jul-2023 |
Publisher: | Springer Nature |
Abstract: | Despite the remarkable results of Neural Scene Flow Fields [10] in novel space-time view synthesis of dynamic scenes, the model performs poorly when only a few input views are provided. To enable few-shot novel space-time view synthesis of dynamic scenes, we propose a new approach that extends the model architecture with shared priors learned across scenes to predict appearance and geometry in static background regions. Throughout the optimization, our network learns to rely either on image features extracted from the few input views or on the learned priors to reconstruct unseen regions, depending on the camera view direction. Multiple experiments on the NVIDIA Dynamic Scenes Dataset [23] demonstrate that our approach achieves better rendering quality than prior work when only a few input views are available. |
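The abstract describes a NeRF-style network that is conditioned either on image features extracted from the few input views or on priors learned across scenes, with the choice driven by the camera view direction. The PyTorch sketch below is only an illustrative reading of that idea under stated assumptions; the module names, dimensions, and the view-direction-gated blending rule are hypothetical and are not taken from the authors' implementation.

```python
# Hypothetical sketch: a NeRF-style MLP for the static background that blends
# pixel-aligned image features from a few input views with a prior learned
# across scenes, weighted by the camera view direction.
# All names and dimensions are assumptions for illustration only.
import torch
import torch.nn as nn


class FewShotStaticNeRF(nn.Module):
    def __init__(self, feat_dim=256, hidden=256):
        super().__init__()
        # Shared prior vector, assumed to be learned across scenes.
        self.prior = nn.Parameter(torch.zeros(feat_dim))
        # View-direction-dependent gate: 1 -> trust image features, 0 -> trust prior.
        self.gate = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid()
        )
        self.backbone = nn.Sequential(
            nn.Linear(3 + feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.sigma_head = nn.Linear(hidden, 1)           # volume density
        self.rgb_head = nn.Sequential(                   # view-dependent color
            nn.Linear(hidden + 3, hidden // 2), nn.ReLU(),
            nn.Linear(hidden // 2, 3), nn.Sigmoid(),
        )

    def forward(self, xyz, view_dir, image_feat):
        # xyz: (N, 3) sample points, view_dir: (N, 3) unit view directions,
        # image_feat: (N, feat_dim) features aggregated from the input views.
        w = self.gate(view_dir)                          # (N, 1) blend weight
        prior = self.prior.expand_as(image_feat)
        cond = w * image_feat + (1.0 - w) * prior        # observed vs. learned prior
        h = self.backbone(torch.cat([xyz, cond], dim=-1))
        sigma = torch.relu(self.sigma_head(h))
        rgb = self.rgb_head(torch.cat([h, view_dir], dim=-1))
        return rgb, sigma
```

In such a sketch, the returned color and density per sample would be composited along each ray with standard NeRF volume rendering; the gate simply lets poorly observed directions fall back on the cross-scene prior rather than on sparse image evidence.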
Description: | Lecture Notes in Networks and Systems (LNNS, volume 734); CITA: Conference on Information Technology and its Applications; pp. 240-249. |
URI: | https://link.springer.com/chapter/10.1007/978-3-031-36886-8_20; http://elib.vku.udn.vn/handle/123456789/2729 |
ISBN: | 978-3-031-36886-8 |
Appears in Collections: | CITA 2023 (International) |