5 Apr 2024 | Albert J. Zhai, Yuan Shen, Emily Y. Chen, Gloria X. Wang, Xinlei Wang, Sheng Wang, Kaiyu Guan, Shenlong Wang
The paper "NeRF2Physics: Dense Prediction of Physical Properties from Language-Embedded Feature Fields" presents a novel approach to estimating the physical properties of objects from a collection of images. Inspired by human visual reasoning, the method combines object-level semantic reasoning with point-level appearance reasoning. It leverages large language models (LLMs) to propose candidate materials for each object and constructs a language-embedded point cloud. The physical properties of each 3D point are then estimated using a zero-shot kernel regression over the candidate materials. The method is annotation-free and applicable to arbitrary objects in the open world. Experiments demonstrate its effectiveness on a variety of physical property reasoning tasks, such as estimating mass, friction, and hardness. Code is available at <https://ajzhai.github.io/NeRF2Physics>.
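To make the kernel-regression step concrete, here is a minimal sketch, not the authors' implementation: it assumes each 3D point carries a language-embedded (CLIP-like) feature, that the LLM has proposed a handful of candidate materials with scalar property values, and that a point's property is predicted as a softmax-kernel-weighted average of those values. All names, dimensions, and the temperature parameter below are illustrative.

```python
import numpy as np

def kernel_regress_properties(point_feats, material_feats, material_values, temperature=0.1):
    """Estimate a physical property per 3D point via similarity-weighted
    (softmax-kernel) regression over LLM-proposed candidate materials.

    point_feats:     (N, D) language-embedded features of the point cloud
    material_feats:  (M, D) text embeddings of candidate material names
    material_values: (M,)   property value per candidate material
                            (e.g., midpoint of an LLM-proposed range)
    """
    # Cosine similarity between every point and every candidate material.
    p = point_feats / np.linalg.norm(point_feats, axis=1, keepdims=True)
    m = material_feats / np.linalg.norm(material_feats, axis=1, keepdims=True)
    sim = p @ m.T                                    # (N, M)

    # Softmax kernel: a sharper temperature approaches a nearest-material lookup.
    w = np.exp(sim / temperature)
    w /= w.sum(axis=1, keepdims=True)

    # Property of each point = kernel-weighted average of candidate values.
    return w @ material_values                       # (N,)

# Toy usage with stand-in embeddings (a real pipeline would use CLIP text/image features).
rng = np.random.default_rng(0)
point_feats = rng.normal(size=(1000, 512))           # fused features for 1000 points
material_feats = rng.normal(size=(3, 512))           # e.g., "ceramic", "steel", "rubber"
density_values = np.array([2400.0, 7850.0, 1100.0])  # candidate densities in kg/m^3
per_point_density = kernel_regress_properties(point_feats, material_feats, density_values)
print(per_point_density.shape)                        # (1000,)
```

Under these assumptions, object-level quantities follow naturally: for example, integrating such per-point densities over an estimated object volume would give a mass estimate, which is how dense point-level predictions connect to tasks like mass estimation mentioned above.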