[2025.03.18] * This website is still under construction. More information and videos will be added soon.
Obtaining high-quality, practically usable 3D models of biological plants remains a significant challenge in computer vision and graphics. In this paper, we present a novel method for generating realistic 3D plant models from single-view photographs. Our approach employs a neural decomposition technique to learn a lightweight hierarchical box representation from the input image, effectively capturing the structure and botanical features of the plant. This representation is subsequently refined by a shape-guided parametric modeling module to produce a complete 3D plant model. By combining hierarchical learning and parametric modeling, our method generates structured 3D plant assets with fine geometric details. Notably, by learning the decomposition at different levels of detail, our method adapts to two distinct plant categories: outdoor trees and houseplants, each with unique appearance features. Within the scope of plant modeling, our method is the first comprehensive solution capable of reconstructing both plant categories from single-view images.
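For readers who want a concrete picture of the representation described above, the following is a minimal sketch of how a hierarchical box decomposition could be stored and traversed. The class and field names (e.g., BoxNode, flatten) are hypothetical illustrations and do not correspond to the actual data structures in our implementation.

```python
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class BoxNode:
    """One oriented box in the hierarchy (hypothetical layout).

    center:   (3,) box center in world space
    size:     (3,) box extents along its local axes
    rotation: (3, 3) rotation matrix of the local frame
    children: finer boxes refining this node at the next level
    """
    center: np.ndarray
    size: np.ndarray
    rotation: np.ndarray
    children: List["BoxNode"] = field(default_factory=list)

def flatten(root: BoxNode, level: int = 0):
    """Traverse the hierarchy and yield (level, box) pairs,
    e.g. to pass the boxes of a chosen level of detail to a
    later parametric modeling stage."""
    yield level, root
    for child in root.children:
        yield from flatten(child, level + 1)
```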
These videos show how the boxes are progressively decomposed into the final 3D plant models.
a. Modeling Results.
b. Geometry Quality.
This figure further visualizes the underlying geometry produced by our method and by a recent state-of-the-art method (i.e., One-2-3-45++). Our method not only preserves significantly finer geometric details but also produces structured topology.
a. Animation in Games.
Our method produces CG-compatible structured geometry, enabling direct use in downstream applications such as games. For example, the following figure shows a dynamic wind-driven animation of our reconstructed geometry under strong wind, demonstrating the high practical applicability of our method.
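As a rough illustration of how such a structured mesh can be animated procedurally, the toy snippet below bends vertices along a wind direction with a height-dependent weight so the base stays anchored. This is only an assumed, simplified example and is not the simulation used to produce the figure.

```python
import numpy as np

def wind_sway(vertices: np.ndarray, t: float,
              strength: float = 0.15, direction=(1.0, 0.0, 0.0)) -> np.ndarray:
    """Toy procedural wind: displace vertices along the wind direction,
    weighted by height so the trunk base stays fixed (illustrative only).

    vertices: (N, 3) mesh vertices with +z up.
    t:        animation time in seconds.
    """
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    height = vertices[:, 2] - vertices[:, 2].min()
    h = height / (height.max() + 1e-8)                # 0 at the base, 1 at the top
    phase = np.sin(2.0 * np.pi * 0.5 * t + 3.0 * h)   # phase varies with height for a gusty look
    offset = strength * (h ** 2)[:, None] * phase[:, None] * d[None, :]
    return vertices + offset
```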
b. Shape Interpolation.
The box hierarchies are implicitly encoded in the latent feature space, enabling us to (a) obtain continuous variations between two given plant structures by directly performing linear interpolation on their feature vectors, and (b) produce new plants by sampling in the feature space. The models for the two figures are trained on the two datasets separately.
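To make the latent-space operation concrete, here is a small sketch of linear interpolation between two feature vectors. The encode/decode helpers referenced in the comment are hypothetical placeholders for the trained encoder and decoder.

```python
import numpy as np

def interpolate_latents(z_a: np.ndarray, z_b: np.ndarray, steps: int = 8) -> np.ndarray:
    """Linearly interpolate between two latent codes.

    z_a, z_b: feature vectors encoding two plant box hierarchies.
    Returns an array of shape (steps, dim) with intermediate codes.
    """
    t = np.linspace(0.0, 1.0, steps)[:, None]
    return (1.0 - t) * z_a[None, :] + t * z_b[None, :]

# Hypothetical usage: decode each intermediate code back into a box
# hierarchy / plant mesh with the trained decoder.
# for z in interpolate_latents(encode(plant_a), encode(plant_b)):
#     mesh = decode(z)
```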
More results and toolkits will be released.
1. Plant Dataset Generator: I will release the generation tool that I developed for automatically generating 3D plant datasets here. It will be released as a binary program that can be executed directly on a Windows PC.
2. Online Demo: A WebGL demo will launch soon, allowing you to use the method directly in the browser.
3. Code Release: The code is currently being prepared and will be available at this GitHub page.
@article{liu2025boxplant,
  author  = {Liu, Zhihao and Cheng, Zhanglin and Yokoya, Naoto},
  title   = {Neural Hierarchical Decomposition for Single Image Plant Modeling},
  journal = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year    = {2025},
}