ZENODO
Dataset . 2023
License: CC BY
Data sources: Datacite; ZENODO
View all 3 versions

Phase unwrapping using Deep Learning in Holographic Tomography - dataset.

Authors: Gontarz, Michał; Dutta, Vibekananda; Kujawińska, Małgorzata; Krauze, Wojciech

Abstract

This dataset contains two types of data: phase images and trained model files.

Real phase images. These phase images are stored in the files whose names carry the prefix "real_". The files are in ".npz" format and are loaded with NumPy (np.load()) as a dictionary; the data is stored under the key ["arr_0"]. The images depict cells [1], organoids [2], phantoms [3-4] and regular 3D-printed structures with high scattering properties [5]. The images have been augmented to expand the training dataset, and all images have shape (256, 256, 1). The dataset contains 27,189 images of each of the following types for training the unwrapping model:
- unwrapped: continuous phase distribution (float32)
- wrapped: phase wrapped into mod \(2\pi\) (float32)
- wrapcount: wrap-count phase maps coded as integers (0, 1, 2, ...) (uint8)

Synthetic phase images. The phase images in these files were generated algorithmically in MATLAB. The files containing this dataset carry the prefix "synthetic_". The files are in ".npz" format and are loaded with NumPy (np.load()) as a dictionary; the data is stored under the key ["arr_0"]. The synthetic phase images fall into three types: spherical distributions, simulated cells with a spherical background, and simulated cells with an introduced linear tilt. All images have shape (256, 256, 1). The dataset contains 10,000 images of each of the following types for training the unwrapping and denoising models:
- unwrapped: continuous phase distribution (float32)
- wrapped: phase wrapped into mod \(2\pi\) (float32)
- wrapcount: wrap-count phase maps coded as integers (0, 1, 2, ...) (uint8)
- noised: wrapped phase images with synthetic noise (float32)

Trained models. The model files are in ".h5" format, which contains both the model architecture and the weights. They were developed and saved with the Keras library and are loaded with the keras.models.load_model() function. The models are:
- Unet_Denoising.h5: a U-Net model used for denoising as an image-translation task. The input is a wrapped phase image with noise and the output is the same wrapped phase distribution, denoised. The model is trained on the synthetic phase dataset.
- Attn_Unet_Unwrapping.h5: a U-Net model with attention gates and residual blocks trained for a semantic segmentation task. The input is a wrapped phase image and the output is the wrap-count map. The model is trained on the real phase dataset.

References

[1] M. Baczewska, W. Krauze, A. Kuś, P. Stępień, K. Tokarska, K. Zukowski, E. Malinowska, Z. Brzózka, and M. Kujawińska, "On-chip holographic tomography for quantifying refractive index changes of cells' dynamics," in Quantitative Phase Imaging VIII, vol. 11970, Y. Liu, G. Popescu, and Y. Park, eds., International Society for Optics and Photonics (SPIE, 2022), p. 1197008.
[2] P. Stępień, M. Ziemczonok, M. Kujawińska, M. Baczewska, L. Valenti, A. Cherubini, E. Casirati, and W. Krauze, "Numerical refractive index correction for the stitching procedure in tomographic quantitative phase imaging," Biomed. Opt. Express 13, 5709–5720 (2022).
[3] M. Ziemczonok, A. Kuś, P. Wasylczyk, and M. Kujawińska, "3D-printed biological cell phantom for testing 3D quantitative phase imaging systems," Sci. Reports 9, 1–9 (2019).
[4] M. Ziemczonok, A. Kuś, and M. Kujawińska, "Optical diffraction tomography meets metrology — measurement accuracy on cellular and subcellular level," Measurement 195, 111106 (2022).
[5] W. Krauze, A. Kuś, M. Ziemczonok, M. Haimowitz, S. Chowdhury, and M. Kujawińska, "3D scattering microphantom sample to assess quantitative accuracy in tomographic phase microscopy techniques," Sci. Reports 12, 1–9 (2022).
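The following is a minimal loading and inference sketch in Python, not part of the dataset description. The .npz file names are placeholders (the record only specifies the "real_"/"synthetic_" prefixes and the key "arr_0"), and the final step assumes the standard reconstruction unwrapped = wrapped + \(2\pi\) * wrapcount, which is implied but not spelled out above.

```python
# Minimal sketch, assuming the archive layout described in the abstract.
# File names other than the .h5 model files are placeholders.
import numpy as np
from tensorflow import keras

# --- Phase images -----------------------------------------------------------
# Each .npz stores one array under the key "arr_0" with shape (N, 256, 256, 1).
wrapped = np.load("real_wrapped.npz")["arr_0"]      # float32, phase mod 2*pi
unwrapped = np.load("real_unwrapped.npz")["arr_0"]  # float32, continuous phase
wrapcount = np.load("real_wrapcount.npz")["arr_0"]  # uint8, integer wrap counts

# --- Trained models ---------------------------------------------------------
# The .h5 files bundle architecture and weights; compile=False avoids needing
# the training configuration for inference. Custom layers (attention gates,
# residual blocks) may additionally require a custom_objects mapping.
unwrap_model = keras.models.load_model("Attn_Unet_Unwrapping.h5", compile=False)

# Predict wrap-count maps for a small batch of wrapped phase images. If the
# network outputs per-class probabilities (semantic segmentation), take the
# argmax over the channel axis; otherwise round a single-channel regression.
pred = unwrap_model.predict(wrapped[:8])
if pred.shape[-1] > 1:
    pred_counts = np.argmax(pred, axis=-1)
else:
    pred_counts = np.rint(pred[..., 0])

# Reconstruct the continuous phase, assuming the usual relation
# unwrapped = wrapped + 2*pi*wrapcount.
reconstructed = wrapped[:8, ..., 0] + 2 * np.pi * pred_counts.astype(np.float32)
```

The denoising model, Unet_Denoising.h5, is loaded the same way; its input and output are both wrapped phase images (noisy in, denoised out).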

Keywords

QPI, convolutional neural networks, phase unwrapping, deep learning, holographic tomography

Metrics (Impact by BIP!; Usage by OpenAIRE UsageCounts)
  • Citations: 1 (an alternative to the "Influence" indicator; the overall/total impact of the article in the research community at large, based on the underlying citation network, diachronically)
  • Popularity: Average (the "current" impact/attention, the "hype", of the article in the research community at large, based on the underlying citation network)
  • Influence: Average (the overall/total impact of the article in the research community at large, based on the underlying citation network, diachronically)
  • Impulse: Average (the initial momentum of the article directly after its publication, based on the underlying citation network)
  • Views: 49
  • Downloads: 121