IPENS Enables Rapid Label-Free 3D Plant Phenotyping

By combining radiance-field reconstruction with SAM2 segmentation, IPENS enables users to obtain precise organ-level geometry from ordinary multi-view images using only a few prompts. In rice and wheat, the system accurately measures voxel volume, leaf surface area, and leaf dimensions, operating non-destructively and at high speed.

Plant phenotyping technologies underpin the development of genotype-phenotype association models and guide trait improvement in modern breeding. Traditional 2D imaging methods struggle to capture complex plant structures, while field phenotyping often requires manual sampling and destructive testing. Recent advances in 3D reconstruction, including Neural Radiance Fields (NeRF) and 3D Gaussian Splatting, have demonstrated strong potential for non-invasive trait evaluation, but most models require large annotated datasets, perform poorly on occluded organs such as rice grains, or demand repeated user interaction for each target. Unsupervised approaches lack precision at grain-scale resolution, and multi-target segmentation remains inefficient.

A study (DOI: 10.1016/j.plaphe.2025.100106) published in Plant Phenomics on 15 September 2025 by Youqiang Sun's team at the Chinese Academy of Sciences addresses these gaps, providing researchers and breeders with rapid, reliable phenotypic data to accelerate intelligent breeding and improve crop productivity.

To evaluate the performance of IPENS, the researchers first designed a quantitative segmentation experiment using the MMR (rice) and MMW (wheat) datasets, with 30% of the data reserved as a validation set and the remainder used to train the comparison algorithms. Segmentation was carried out by manually placing two positive and two negative prompts on both the first and last video frames, after which the model performed unsupervised 3D instance segmentation guided by those prompts. Segmentation quality was assessed using IoU, precision, recall, and F1 score, and results were compared with mainstream algorithms: the unsupervised CrossPoint, the supervised interactive Agile3D, and the fully supervised state-of-the-art OneFormer3D. A time-performance evaluation measured segmentation time in single- and multi-target scenarios, benchmarking IPENS against SA3D and analyzing how efficiency scales with the number of targets. Beyond segmentation, phenotypic accuracy was verified through voxel volume estimation and leaf-trait measurement, examining how multi-stage point cloud processing (convex hull → mesh → mesh subdivision) influences error and model stability.

IPENS achieved IoU scores of 61.48%, 69.54%, and 60.13% for rice grain, leaf, and stem, and 92.82%, 86.47%, and 89.76% for wheat panicle, leaf, and stem, respectively. Its mean IoU far surpassed that of the unsupervised CrossPoint (23.41% on rice, 16.50% on wheat) and exceeded Agile3D's first-interaction performance, demonstrating competitive accuracy without labeled data. Time analysis revealed roughly 3.3× acceleration over SA3D, with single-organ segmentation taking about 70 seconds and multi-organ inference scaling linearly with the number of targets.

Trait estimation further confirmed the model's reliability: rice grain voxel volume reached R² = 0.7697 (RMSE 0.0025) and wheat panicle voxel volume R² = 0.9956. Leaf area accuracy improved progressively after subdivision (rice R² = 0.84; wheat R² = 1.00), and leaf length/width estimation maintained millimeter-level errors (rice R² = 0.97/0.87; wheat R² = 0.99/0.92), validating that higher segmentation quality directly supports stable phenotypic prediction.
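The segmentation scores reported above (IoU, precision, recall, F1) are standard overlap metrics. As a toy illustration only, with hypothetical masks that are not data from the study, they can be computed for a predicted versus ground-truth set of occupied voxel coordinates:

```python
# Toy illustration (not the authors' code): overlap metrics for a predicted
# vs. ground-truth binary voxel mask, each given as a set of voxel coords.

def segmentation_metrics(pred, gt):
    tp = len(pred & gt)   # voxels correctly labeled as the organ
    fp = len(pred - gt)   # background voxels wrongly included
    fn = len(gt - pred)   # organ voxels that were missed
    iou = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"iou": iou, "precision": precision, "recall": recall, "f1": f1}

# Hypothetical 1D "masks" standing in for 3D voxel grids.
pred = {(i,) for i in range(0, 8)}   # predicted organ voxels
gt   = {(i,) for i in range(2, 10)}  # ground-truth organ voxels
m = segmentation_metrics(pred, gt)   # IoU 0.6; precision/recall/F1 0.75
```

The same counting applies unchanged to real 3D voxel coordinates, since only set intersection and difference are used.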
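The multi-stage point-cloud processing described above (convex hull → mesh → mesh subdivision) can be sketched in miniature. The snippet below is an illustrative pure-Python sketch under simplifying assumptions, not the authors' implementation: it computes a triangle mesh's surface area and applies one 4-to-1 midpoint-subdivision pass. In IPENS the subdivided mesh is refined against the organ point cloud; here, subdividing a flat patch simply preserves its area.

```python
# Illustrative sketch (not the paper's pipeline): mesh surface area and one
# 4-to-1 midpoint-subdivision pass on a triangle mesh.
import math

def tri_area(a, b, c):
    # Half the magnitude of the cross product of two edge vectors.
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    cx = u[1] * v[2] - u[2] * v[1]
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    return 0.5 * math.sqrt(cx * cx + cy * cy + cz * cz)

def mesh_area(verts, faces):
    return sum(tri_area(verts[i], verts[j], verts[k]) for i, j, k in faces)

def subdivide(verts, faces):
    """One midpoint pass: each triangle becomes four; midpoints are shared."""
    verts = list(verts)
    cache = {}
    def midpoint(i, j):
        key = (min(i, j), max(i, j))
        if key not in cache:
            verts.append([(verts[i][k] + verts[j][k]) / 2 for k in range(3)])
            cache[key] = len(verts) - 1
        return cache[key]
    new_faces = []
    for i, j, k in faces:
        a, b, c = midpoint(i, j), midpoint(j, k), midpoint(k, i)
        new_faces += [(i, a, c), (a, j, b), (c, b, k), (a, b, c)]
    return verts, new_faces

# Hypothetical flat "leaf patch": a unit square split into two triangles.
verts = [[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]]
faces = [(0, 1, 2), (0, 2, 3)]
v2, f2 = subdivide(verts, faces)   # 8 triangles, area still 1.0
```

In practice the extra vertices matter because they are fitted to the curved organ surface, which is why the study observes leaf-area accuracy improving after subdivision.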

IPENS offers a scalable, non-invasive, and label-free tool for field and greenhouse phenotyping. By rapidly generating accurate 3D trait data, it provides breeding programs with efficient support for yield-related evaluations, genomic selection, and organ-level trait screening. The method improves throughput for grain counting, biomass measurement, and plant architecture assessment while reducing reliance on expert annotators. With strong cross-species generalization demonstrated in rice and wheat, IPENS has potential for integration into automated phenotyping chambers, robotic imaging platforms, and future smart-agriculture pipelines. Its capacity to link phenotype data to genomic models may significantly accelerate trait improvement and breeding decision-making.

Journal reference:

Song, W., et al. (2025). IPENS: Interactive unsupervised framework for rapid plant phenotyping extraction via NeRF-SAM2 fusion. Plant Phenomics. doi: 10.1016/j.plaphe.2025.100106. https://www.sciencedirect.com/science/article/pii/S2643651525001128
