
Published March 23, 2021 | Version v1

Local Z Projector - 3D projection method comparison toolbox

  • Institut Pasteur

Description

A toolbox to compare projection method performance.

Purpose.

We document here the set of practices and scripts that enabled us to compare several projection methods, as published in:

Herbert, S., Valon, L., Mancini, L., Dray, N., Caldarelli, P., Gros, J., Esposito, E., Shorte, S. L., Bally-Cuif, L., Levayer, R., Aulner, N., & Tinevez, J.-Y. (2021). DProj: A toolbox for local 2D projection and accurate morphometrics of large 3D microscopy images. BioRxiv, 2021.01.15.426809. https://doi.org/10.1101/2021.01.15.426809

The 8 methods we tested are:

  • Local-Z-Projector

  • Stack Focuser

  • PreMosa

  • Extended-Depth-Of-Field

  • Min-Cost-Z-Surface

  • Fast SME

  • SurfCut

  • Maximum Intensity Projection (MIP)

Each method exposes parameters that can be tuned to achieve a desirable result. When possible, we ran a systematic parameter sweep for each method in order to obtain the absolute best projection for the comparison. By 'absolute best', we mean the closest to the ground truth in the root-mean-square-error (RMSE) sense.
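The core of the sweep is simple: score every parameter combination against the ground truth and keep the one with the lowest RMSE. A minimal Python sketch, where `project` and the parameter grid stand in for any of the tested methods (this is an illustration, not the toolbox's actual code):

```python
import itertools
import numpy as np

def rmse(projection, ground_truth):
    """Root-mean-square error between a candidate projection and the ground truth."""
    diff = projection.astype(np.float64) - ground_truth.astype(np.float64)
    return np.sqrt(np.mean(diff ** 2))

def sweep(project, param_grid, stack, ground_truth):
    """Try every parameter combination; keep the projection with the lowest RMSE.

    `project(stack, **params)` stands in for any of the eight tested methods;
    `param_grid` maps parameter names to the list of values to try.
    """
    best = (np.inf, None, None)
    keys = sorted(param_grid)
    for values in itertools.product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        proj = project(stack, **params)
        err = rmse(proj, ground_truth)
        if err < best[0]:
            best = (err, params, proj)
    return best  # (lowest RMSE, winning parameters, best projection)
```

The exhaustive grid is affordable here because each method has only a few tunable parameters; with more, a coarse-to-fine sweep would be preferable.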

Running this comparison takes significant time. You will have to generate a ground truth for the projection, the height-map and the segmentation results, and running the parameter sweeps on all methods takes a long time.

We measure the following four metrics:

  • Projection RMSE - How close are we to the projection ground-truth?

  • Height-map RMSE - How close are we to the height-map ground-truth?

  • Segmentation Object-Consistency-Error (OCE) - How does the quality of the projection affect subsequent analysis?

  • Timing - How long does it take to get the projection?

See the associated article and in particular its Supplementary Material for details.
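As an illustration of the simplest of the four metrics, timing, here is a hedged Python sketch; the best-of-`repeats` policy is our assumption, not necessarily how the toolbox times each method:

```python
import time

def time_projection(project, stack, repeats=3):
    """Wall-clock one projection method, keeping the best of `repeats` runs.

    `project` stands in for any of the eight tested methods. Taking the
    minimum over a few runs reduces noise from caching and the OS scheduler.
    """
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        project(stack)
        best = min(best, time.perf_counter() - t0)
    return best  # seconds
```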

Organization.

This toolbox is organized in several subfolders, one for each step of the comparison. Each subfolder contains a README.md file where the comparison process is documented. You just have to follow the instructions they contain.

The recommended order of processing is the following:

  1. GroundTruth - Generate ground-truth projection and height-map files.

  2. Methods - Run the parameter sweep for all methods and get the absolute best projection and height-map for each. Then measure the RMSE value for the projection and time each method. This folder has one subfolder per method, documented individually.

  3. HeightMap - Measure the RMSE metric for the height-maps.

  4. Segmentation - Measure the segmentation performance metrics.

  5. Collate all metrics in the MATLAB script PlotMetrics.m, present in this top-level folder. Run it to generate the 4-panel figure with the comparison results.
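The collation itself is done in MATLAB by PlotMetrics.m. Purely as an illustration of what "collate" means in step 5, here is a Python sketch; the metric names and the per-row layout are assumptions, not the toolbox's actual file format:

```python
# Assumed metric names, one per comparison step (illustrative only).
METRICS = ("projection_rmse", "heightmap_rmse", "segmentation_oce", "timing_s")

def collate(rows):
    """Merge per-step measurements into one record per projection method.

    `rows` is an iterable of dicts, each carrying a 'method' key plus one
    or more of the four metric keys (e.g. parsed from each step's output).
    """
    table = {}
    for row in rows:
        rec = table.setdefault(row["method"], {})
        for m in METRICS:
            if m in row:
                rec[m] = float(row[m])
    return table
```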

Requirements.

This toolbox requires familiarity with MATLAB and Python scripting. The scripts should run turnkey, but you will have to edit the file names in them for your own comparison.

You also need the following software:

The test image.

We distribute this toolbox along with a test image, used as Supplementary Data in the article. It is a single 3D image (one channel, one time-point) of a Drosophila pupa notum, captured with a laser-scanning confocal microscope (LSM 880, Zeiss). Pixel size 0.188 µm, z step 1 µm. The data was acquired by Leo Valon in Romain Levayer's lab, Institut Pasteur.
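Given the stated voxel size, converting the stack's pixel dimensions to physical extent is straightforward; a small sketch (the example shape below is hypothetical, not the test image's actual dimensions):

```python
# Voxel size stated for the test image: 0.188 µm in XY, 1 µm z step.
PIXEL_SIZE_UM = 0.188
Z_STEP_UM = 1.0

def physical_extent(stack_shape):
    """Return the (z, y, x) extent in micrometres of a single-channel
    3D stack whose shape is (n_slices, height, width)."""
    nz, ny, nx = stack_shape
    return (nz * Z_STEP_UM, ny * PIXEL_SIZE_UM, nx * PIXEL_SIZE_UM)
```

Note the strong anisotropy (z step about 5x the XY pixel size), which is typical of confocal stacks and is one reason height-map errors are best compared in physical units.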

This specific image was used to investigate how the projection methods fare against an image of high quality, with few spurious structures (the cuticle is faint and there are almost no dead bodies). This is an image of exceptional quality. For Table 1 in the article we used an image with more spurious structures.

Result files and data for the comparison on this image are already included in this toolbox.


Files

ProjectionMethodComparisonToolBox.zip (65.0 MB)
md5:e6406027e1af9bf4176d5dbf61a13a4a

Additional details

Funding

  • MechDeath – Study of the mechanical cues driving cell competition and its role in pretumoral cell expansion (grant 789573), European Commission

  • CoSpaDD – Competition for Space in Development and Diseases (grant 758457), European Commission