-
In this section, we discuss a number of application domains that are in need of efficient hologram representation and compression technologies, starting with biomedical imaging, which has already seen widespread deployment of holographic imaging techniques.
Quantitative phase imaging (QPI)6 is a label-free approach to imaging cells and tissues that combines qualities found in microscopy, holography and light scattering techniques: nanoscale sensitivity to morphology and dynamics; 2D, 3D and 4D non-destructive imaging of transparent structures; and quantitative signals based on intrinsic (phase) contrast.
DHM and HT are the most popular and powerful techniques applied in QPI7-9. It should be noted that microscopy, which deals with small objects, is the application best suited to digital holography, as the space bandwidth product (SBP) of these objects can easily be matched to the SBP of available detectors. These facts are confirmed by the success of many commercial DHM (e.g. Lyncée Tec and Ovizio) and HT systems (Nanolive10 and Tomocube11). It should also be noted that the major players in the optical microscopy market (e.g. Carl Zeiss AG, FEI Company, Hitachi High Technologies Corporation, JEOL Ltd, Leica Microsystems, Nikon Corporation, Philips Healthcare, Olympus Corporation) are moving towards digital and quantitative microscopy, in which digital holography is a natural future choice.
DHM can efficiently overcome the most severe limitations of conventional microscopy, namely its very limited depth of field and its inability to image phase (transmissive) micro-objects. Holographic microscopy records the light wavefront originating from an object instead of the projected image of the object recorded in common microscopy. The viewable image and quantitative (phase) representation of a micro-object are created using a numerical reconstruction algorithm12. The numerical reconstruction of digitally recorded holograms makes it possible to obtain the whole complex diffracted wavefront, i.e. both its amplitude and phase, thus offering the possibility of propagating it further. Such numerical propagation provides many advantages, including accurate measurement of the integrated phase of an object (the dry mass of a cell), autofocusing, extended depth of focus and correction of any kind of aberration. Examples of life science applications of DHM include cell dry mass extraction and monitoring13, cell culture inspection14, automated cell counting, recognition and classification for diagnostic purposes with the use of artificial intelligence15, digital phase histopathology16, assessing cellular responses induced by new drugs17, and label-free high content screening18, to name just a few.
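The numerical propagation mentioned above is commonly implemented with the angular spectrum method. The sketch below (the function name and parameter choices are illustrative, not taken from the cited works) propagates a sampled complex wavefront between parallel planes using NumPy; evanescent components are suppressed:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pitch, distance):
    """Propagate a sampled complex wavefront by `distance` metres using
    the angular spectrum method; `pitch` is the sampling interval."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)          # spatial frequencies (1/m)
    fy = np.fft.fftfreq(ny, d=pitch)
    f2 = fx[np.newaxis, :] ** 2 + fy[:, np.newaxis] ** 2
    arg = 1.0 / wavelength ** 2 - f2          # squared longitudinal frequency
    fz = np.sqrt(np.maximum(arg, 0.0))
    # Unitary transfer function; evanescent waves (arg <= 0) are dropped.
    transfer = np.where(arg > 0, np.exp(2j * np.pi * distance * fz), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * transfer)
```

Propagating forward and then backward over the same distance recovers the input field up to numerical precision when no evanescent components are present, which is the property exploited for refocusing and extending the depth of focus.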
Due to the integrated character of the data delivered by digital holography (each lateral phase value represents the phase accumulated along all points on the optical axis), QPI realized through DHM has limited capabilities for providing full 3D structural information about cells or tissues. To overcome this limitation, tomographic versions of QPI are applied6. Among these methods, optical diffraction tomography based on holographic projections9 demonstrates a wealth of possibilities for 3D imaging in biomedical applications9,19 that are not possible with conventional forms of tomography such as computed tomography (CT)20. CT scan techniques make use of X-rays to provide an accurate representation of a 3D object. However, since CT is based on ray optics, it assumes non-scattering objects whose features are larger than the imaging wavelength. Holographic tomography (HT), on the other hand, is based on the diffraction of light and can capture very fine details while enabling radiation-free biomedical imaging. Furthermore, in HT the reconstruction obtained from measurements of multiple 2D holograms (projections) is the 3D refractive index (RI) distribution of the weakly scattering sample under study.
The optical setup for holographic tomography consists of a DHM equipped with a unit controlling the illumination beams impinging on a sample by means of one of two alternative schemes: object rotation (Full Projection Angle HT, FAHT) or illumination beam rotation (Limited Projection Angle HT, LAHT)9. In LAHT, the specimen is stationary, while holographic projections are acquired for different object illumination directions. This makes it a perfect tool for measuring single biological cells, cell cultures and tissues directly in Petri dishes or on microscope slides. After all data is captured, specialized tomographic reconstruction algorithms are employed to retrieve the 3D refractive index distribution21. Because RI values depend on the number of intracellular biomolecules, LAHT allows label-free, high-resolution quantitative 3D morphological mapping of biological specimens, including fixed and live cells19,22, tissues23 and even organoids and small organisms24.
The most popular optical configuration of research and commercial digital holographic microscopes applied for transparent objects is a Mach-Zehnder interferometer with an additional microscope imaging setup in the object arm, to image the object on the image sensor with high magnification. The output hologram is captured as the result of interference between reference and object beams.
Depending on the detector location within the object's diffraction field, the recorded digital hologram is an image plane hologram (IPH) or a Fresnel hologram (FH). Capturing an IPH yields several benefits: a) it avoids the need for numerical propagation, thereby simplifying QPI processing and allowing easy switching to modalities that do not allow numerical refocusing, such as white light or fluorescent imaging; b) it enables a compact design by shortening the object arm25; c) it optimally concentrates the bandwidth of the individual diffraction orders26. Also, in the case of IPHs, the amplitude of an object can be captured solely by recording the object beam intensity in the absence of the reference beam. On the other hand, if the object plane cannot be well identified or optical refocusing is required, the registration of a Fresnel hologram is necessary and the complex amplitude of the object is reconstructed by numerical propagation of the captured wavefront to the object plane12.
IPHs and FHs can be captured in two alternative ways, depending on whether or not there is an angular difference between the propagation directions of the object and reference wavefronts: as on-axis or off-axis holograms. The space-bandwidth products of these two types of holograms vary significantly and so impose different requirements on the space-bandwidth of the detector. In the on-axis case, the detector and object SBPs are comparable, while off-axis holograms require a significant carrier frequency, which increases the necessary detector pixel count at least four times. The advantage of off-axis holograms, however, is that the carrier frequency introduced by the off-axis tilt spatially separates the +1, −1 and 0 diffraction orders in the Fourier domain, thus easily allowing reconstruction of the object wavefront. In the case of full spectral separation, a fast and robust Fourier filtering method27,28 can be applied to a single hologram.
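This Fourier filtering can be sketched as follows (the function name, the integer-bin carrier and the circular window radius are illustrative assumptions): the desired order is windowed in the Fourier domain and shifted back to the centre to remove the carrier.

```python
import numpy as np

def extract_object_wave(hologram, carrier, radius):
    """Fourier transform method: isolate one diffraction order of an
    off-axis hologram. `carrier` is the (row, col) offset of that order
    from the spectrum centre in FFT bins; `radius` is the half-width of
    the circular band-pass window."""
    ny, nx = hologram.shape
    spectrum = np.fft.fftshift(np.fft.fft2(hologram))
    cy, cx = ny // 2 + carrier[0], nx // 2 + carrier[1]
    Y, X = np.ogrid[:ny, :nx]
    mask = (Y - cy) ** 2 + (X - cx) ** 2 <= radius ** 2
    # Shift the selected order back to the centre to remove the carrier.
    selected = np.roll(np.roll(spectrum * mask, -carrier[0], axis=0),
                       -carrier[1], axis=1)
    return np.fft.ifft2(np.fft.ifftshift(selected))
```

With full spectral separation, the windowed order contains the entire object-wave spectrum, so the complex amplitude is recovered from a single captured frame.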
For on-axis holograms, or holograms with an insufficient carrier frequency, the desired diffraction order cannot be separated in the Fourier domain. This is acceptable for applications that only require intensity reconstructions, such as the three-dimensional localization of particles29. To obtain quantitative phase or full complex reconstructions from an on-axis configuration, the concept of phase shifting digital holography (PSDH) can be used30. PSDH requires capturing at least three phase-shifted holograms of the same object wavefront (although more improve robustness). This redundancy of information enables numerical elimination of the twin-image and zero-order terms and provides an accurate object phase reconstruction. Unfortunately, PSDH cannot be applied (or is difficult to apply) to measure dynamically changing objects, in unstable environments, or during long time-lapse sessions, as it requires capturing multiple holograms for a single measurement. These are the reasons why off-axis holograms and the Fourier transform method (FTM) of phase retrieval are applied in most research and commercial DHM systems31.
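As a minimal sketch of the four-step variant of PSDH (the function name is ours; the formula is the standard four-frame combination, assuming reference phase shifts of 0, π/2, π and 3π/2):

```python
import numpy as np

def psdh_four_step(i0, i90, i180, i270):
    """Combine four holograms recorded with reference phase shifts of
    0, pi/2, pi and 3*pi/2 into the product of the object wave and the
    conjugate reference wave. For a unit-amplitude plane-wave reference
    this is the object wave itself, up to a constant phase."""
    return ((i0 - i180) + 1j * (i90 - i270)) / 4.0
```

The subtraction of opposite phase shifts cancels both the zero-order intensity terms and the twin-image term, leaving only the desired complex cross-term.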
Regardless of the phase retrieval method, the output data and results from DHMs are delivered in two main formats: a) holograms (interference patterns), i.e. real-valued greyscale images containing high spatial frequencies (dense fringes, for off-axis holograms) or relatively low spatial frequencies (for on-axis and quasi on-axis holograms); and b) complex amplitudes (or real/imaginary components) of the object field. The phase is calculated by FTM or PSDH, while the amplitude is captured additionally (an image recorded without the reference wave). In FTM, the bandwidth of the first-order term that encodes the received wavefield is limited to a well defined region in the Fourier domain26. These complex amplitudes represent the object directly (in the case of image plane holograms) or must be numerically propagated to the object plane (in the case of Fresnel holograms).
For holographic tomography it is obligatory to deliver a set of complex amplitudes acquired within a range of viewing angles, which are later processed with one of the tomographic solvers1,21. HT setups are similar to their DHM counterparts, but utilize multiple consecutive oblique illumination beam incidences instead of the single normal incidence used in DHM. The most common optical architecture for HT is a Mach-Zehnder interferometer with an illumination beam scanning module (LAHT configuration) and in-plane off-axis capture of holographic projections. Such configurations have already been used in commercial solutions such as the NanoLive10 and TomoCube11 products. The basic output data from HT are multiple holograms per single measurement, followed by retrieval of the complex amplitudes (CA) of the holographic projections, which are gathered into a CA sinogram. However, the final HT output is a 3D distribution of refractive indices reconstructed in the measurement volume. The most basic tomographic reconstruction algorithm taking diffraction into account, namely direct inversion, utilizes the Fourier Diffraction Theorem. It assumes either the Born or the Rytov approximation of single scattering; the latter places no restrictions on sample thickness and is thus more suitable for the analysis of biological specimens.
Direct inversion allows for the reconstruction of a 3D scattering potential from multiple complex-valued projections. However, when this method is applied to data acquired with a LAHT system, highly distorted results are obtained21. Thus, numerous iterative algorithms with constraints applied in each iteration have been reported in the literature1. The approach most often used at present is the Gerchberg-Papoulis (GP) method with total variation minimization or object support constraints21. The tomographic reconstruction algorithms for LAHT are still not fully mature and remain the object of extensive research by many groups, including the application of artificial intelligence methods1. Therefore, a compression type that preserves the original complex-valued object wavefield, permitting future changes in the processing pipeline used by reconstruction algorithms, is most suitable (Table 1).
System: DHM
Recording set-up: single DH measurement; PSDH (multiple DHs); multiplexed; time-lapse; large-sample (large FoV); time-lapse + large FoV
Representation: hologram(s) (intensity images), with or without carrier frequency; complex amplitude (both real and imaginary terms)
Information output: 2D map and its changes over time at multiple depths; phase/dry mass; movement trajectories in three dimensions; visualization

System: HT
Recording set-up: single HT measurement (projections differing through scanning illumination or wavelength); thick sample (multiple depths); time-lapse + large volume
Representation: hologram (intensity image), with or without carrier frequency; complex amplitude sinogram (or real/imaginary terms)
Information output: 2D RI (at best focal plane); 3D RI; changes of 2D or 3D RI over time; visualization

Table 1. Overview of the most common hologram recording setups, their preferred representations, and typical information extracted in biomedical applications.
-
Another important use case for holography is the field of 3D visualization. Indeed, by embedding all the information describing the light waves scattered by a scene in 3D space, the holographic signal contains all the data needed to provide the most convincing depth illusion to the human eye. When looking at a properly displayed and illuminated hologram, viewers perceive the recorded 3D scene as if it were physically present, with proper shade, texture and geometry. More specifically, holography reproduces all the depth perception mechanisms, also called Human Visual System (HVS) depth cues, occurring in natural vision, and solves the vergence-accommodation conflict of stereoscopic systems32.
To display a hologram, the typical approach is to use a beam shaping device called Spatial Light Modulator (SLM). These are electrically or optically addressed panels which reproduce the recorded object wave by modulating the amplitude or the phase of the illuminating light beam. Several SLM technologies can be used for holographic displays, including Liquid Crystal on Silicon (LCoS)33, Digital Micro-mirror Displays (DMD)34,35 and Acousto-Optics Modulators (AOM)36.
Current commercially available SLMs have resolutions typically limited to 1920 × 1080 or 3840 × 2160 and pixel pitches of about 4 µm; hence commercial holographic displays using SLM technology are now available on small devices such as Head-Mounted Displays37,38 or automotive windshields. To develop larger holographic TV screens, it is possible to spatially tile multiple SLMs, increasing the size and/or FoV of the displayed hologram39. However, building a system of seamlessly tiled SLMs is very challenging, and the resolution of such systems is still insufficient for very large holograms. To plot large holograms with a pixel size in the order of the wavelength of visible light, it is possible to print them onto a photosensitive plate using a wavefront printer40, whose general setup is shown in Fig. 1. These printers typically comprise a laser controlled by an electronic shutter, an SLM, a demagnifying lens system, and a photosensitive plate mounted on an X-Y moving stage. The high-resolution hologram is segmented into several rectangular tiles, which are displayed one after the other on the SLM. For each displayed tile, the control system opens the shutter to illuminate the SLM with the laser beam. The reconstructed object wave is then demagnified by the lens system and exposed onto the photosensitive plate, which is also illuminated by the laser beam from the other side. After exposure, the shutter is closed and the plate is translated by the X-Y stage to the next exposure position, while the next segment of the hologram is displayed on the SLM. This process is repeated until the entire hologram is exposed. In addition to enabling the display of large holograms with a wide FoV, wavefront printers can also print Holographic Optical Elements (HOE) that combine several custom optical functions to enable extreme miniaturization of complex optical systems41.
As explained previously, data transmission for holography is a major issue, requiring novel compression algorithms able to significantly reduce the memory and bandwidth consumption of holographic signals. From this observation, one could argue that it would be more efficient to compress and transmit the data representative of the scene – such as multiview data or 3D meshes – and convert it to a hologram on the display side using Computer-Generated Hologram (CGH) synthesis methods42-44. Nevertheless, in the case of biomedical holograms, the 3D information of the scene is not available. Moreover, in this transmission scheme, the client terminal would have to convert the compressed 3D scene data to hologram sequences in real time, which is still very challenging using state-of-the-art calculation methods, especially for holograms containing billions of pixels. This is particularly problematic in broadcast scenarios, i.e. one-to-many transmissions where one attempts to globally optimize the system; repeating the CGH calculation N times is then highly undesirable. This holds, of course, on the condition that the decoders have low complexity – which is usually the case – and that the achieved compression bitrate meets the channel bandwidth capacity. Holographic data coding is therefore required for both biomedical and visualization scenarios. Table 2 synthesizes the most common hologram visualization methods.
Use case: Display; visualization device: Spatial Light Modulator (SLM); representation: complex amplitude or real/imaginary terms; resolution: 2 × 10^6 to 3 × 10^7 pixels.
Use case: Printing; visualization device: photosensitive plate; representation: complex amplitude or bilevel phase; resolution: over 10^10 pixels.

Table 2. Overview of the most common hologram visualization methods, their preferred representations and resolutions.
-
Accurate visual quality analysis and reliable perceptual quality prediction of holograms are in high demand, owing to the rapid advances in the processing modules of end-to-end holographic visualization pipelines. For example, a number of fast numerical methods for CGH have been proposed recently94-99. Several approximations and compromises are required in their operation to ease the highly demanding computations involved, and the visual impact of these approximations needs to be analyzed and tested by conducting subjective experiments. Feedback loops – without the need to reconstruct the object wavefronts – can be created by utilizing high-performing objective quality measures to help steer and optimize the performance of these numerical operations. Another example is the need for systematic VQA of compression artifacts on holograms and for objective measures to predict their visual impact on the reconstructed 3D scenes. Furthermore, high-end solutions for holographic compression can perform more efficiently by embedding customized evaluation criteria that control their rate-distortion optimization process based on the predicted visual quality of the reconstructed 3D scene after decoding. In this section, we subsequently address subjective quality assessment procedures, objective quality assessment and, finally, metrological quality assessment. In the latter category, which is particularly useful for the addressed biomedical use case, the impact of compression, and hence the performance of a particular compression technique, is assessed by evaluating the performance of a particular automated task – e.g. phase or refractive index retrieval, or segmentation – after compressing the hologram.
-
Conducting systematic subjective experiments on holographic data is the first step in the VQA process. To do so, open-access data sets of holograms recorded or numerically generated from various 3D models and real objects with varied scene compositions are required. Over the past few years, more than a handful of such public data sets have been created72,100-104. The JPEG Pleno database105 hosts an updated portal with direct links to these data sets. In106, a subset of holograms from the JPEG Pleno database was chosen and, for each one, several 2D reconstructions were produced.
Next, the testing procedure has to be identified (illustrated for JPEG Pleno Holography in Fig. 3). This mainly depends on the application use case(s) and, more importantly, on the type of display technology used to visualize the holograms for the human observers. Ideally, the reference holograms and their distorted versions should be visualized on holographic displays. However, holographic displays with acceptable visual characteristics are still rare107,108, and configuring and operating them requires advanced technical skills. One alternative is to use analogue techniques to print them, e.g. on glass plates109. This way, very high definition holograms of deep scenes can be printed to illustrate the full potential of holography for the recreation of a 3D scene110-115. However, using hard-copy holograms for subjective experiments faces serious challenges. Such holograms cannot be evaluated by objective quality metrics. From a practical point of view, flawless printing of a large number of holograms is required – normally several times more than the final hologram set used in the subjective test, because preliminary experiments and mock-up tests are needed to shortlist the right scenes and objects, scene depths and distortion levels to span the full visual quality range. Furthermore, the holograms need to be swapped and optically aligned multiple times per test subject, which prolongs the test far beyond the recommended duration.
Fig. 3 Pipelines for the assessment of hologram compression methods utilized in the context of JPEG Pleno Holography93. The anchor codecs are tested in two pipelines, one performing the encoding in the hologram plane, the other in the object plane. Visual quality assessment is performed in both planes, except for the subjective visual quality assessment, which is solely performed in the object plane. Metrological quality is measured directly on the metrological data extracted from the uncompressed (original) and compressed holograms.
Another alternative is an indirect assessment method, displaying a single or limited set of reconstructed views and depths on a non-holographic display. This is the most popular option for subjective testing of holograms. Lehtimäki et al.116 pioneered such efforts by conducting a limited subjective experiment utilizing an auto-stereoscopic display to visualize the holographic reconstructions. In this experiment, they evaluated depth perception, where front and back reconstructions of each test hologram were generated with multiple aperture resolutions. Subsequently, in117, the same display technology was utilized to test the perceived quality of holograms produced with different aperture sizes. In this test, subjects were asked to choose the hologram reconstruction having the lowest distortion in terms of noise, blur (caused by defocus) and depth perception. Next, a 15 inch 2D LCD screen was utilized in118 to evaluate the relation between numerical error and the perceived quality of the hologram reconstructions. In this test, two simple cases of lossy compression were considered: in one case, uniform quantization was applied to the hologram, and in the other, the same quantization was applied to the Fourier transform of the hologram. Later, in119, a methodology was proposed to evaluate the visual quality of compressed holograms via partial reconstructions shown on a UHD 2D display. The holograms were compressed using the real and imaginary parts of the reconstructed holograms in the hologram plane. Conventional codecs, namely JPEG, JPEG 2000 and H.265/HEVC intra coding, were utilized to introduce compression artifacts at 5 different bitrates. In120, a subjective experiment using a 2D screen was conducted to investigate the impact of several speckle removal methods, and the best performing methods were identified based on the visual appearance of the denoised hologram reconstructions.
Nonetheless, in all of these experiments, an important concern regarding the validity of results obtained on non-holographic displays remained unresolved: whether subjective test results achieved by visualizing holograms on non-holographic displays can replicate at least the overall trend in the distribution of quality scores gathered from a subjective test conducted on an actual holographic display. Ahar et al.121 conducted the first subjective experiment using a holographic display. They also used the same test stimuli to conduct two separate subjective tests utilizing high-end light field and 2D displays, employing a multi-perspective and multi-depth testing procedure to investigate the compatibility of the quality scores gathered from the three displays. Their analysis demonstrated that the scores gathered from the non-holographic displays followed an overall trend very close to that of the holographic display; however, observers gave slightly higher scores to the same compressed holograms when displayed on the holographic display than on the other two.
All of the experiments using non-holographic displays only considered reconstructed holograms at the in-focus position, and a single or very limited set of 2D reconstructions was visualized, which obviously does not provide information about other 3D aspects of the captured scenes, e.g. parallax. A dynamic testing procedure was proposed in54 to address this issue. In this experiment, pseudo-video sequences were generated from the reconstructed views along a hologram-specific scan path that involved changes in focus and viewing angle, and were displayed on a 4K 2D screen. This method not only allowed watching the reconstructed scenes from several angles and focal distances, but also provided a better grasp of the parallax and its potential degradation by compression artifacts (see Fig. 4). A similar procedure was also utilized in122 for evaluating object plane compression artifacts.
Fig. 4 Representation of JPEG 2000 compression artifacts on the Dices16K hologram102. Looking at the center view reconstructions: for hologram plane compression, the aliasing effect is visible at both high (3 bpp) and low (0.75 bpp) bitrates; also note the combination of aliasing with heavy distortion at the lower bitrate. For object plane compression, smoothing of the corners of the checkerboard is visible even at the high bitrate (3 bpp). The top-left views for both object plane and hologram plane compression demonstrate significant loss of information and limitation of the FoV, especially at the low bitrate.
-
Currently, no objective quality measure specifically designed for evaluating digital holograms has been introduced. Therefore, hologram quality measurements are carried out by means of currently available measures originating mainly from the digital imaging domain. In terms of functionality, objective quality measures are typically classified into two main groups: methods that essentially measure mathematical fidelity, and algorithms designed to predict the perceptual quality of the visualized content. From the first group, Mean Squared Error (MSE), Signal to Noise Ratio (SNR) and Peak Signal to Noise Ratio (PSNR) are among the most utilized measures for evaluating the quality of distorted holograms118-120,122-124. From the second group, the Structural SIMilarity (SSIM) quality measure has commonly been deployed for evaluating digital holograms as well. Since holograms are mostly processed in the form of complex-valued wavefields, these quality measures are applied separately to the Cartesian components. It is also possible to use the polar components; however, due to the non-linear response of the wrapped phase, it was shown that measurements using Cartesian components are more accurate72. The Versatile Similarity Measure (VSM) is a complex-valued fidelity measurement framework introduced in125. Not only does it provide an adaptable, bounded measurement for complex-valued data, it also solves the issue of comparing wrapped-phase data without unwrapping. The Sparseness Significance Ranking Measure (SSRM)126 is a perceptual quality predictor that initially proved its efficiency for digital imaging but, because it operates solely in the Fourier domain, can be utilized for complex-valued data without any modification. Both methods were shown to be more accurate than MSE, PSNR and SSIM when tested on a limited set of CGHs127,128.
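Several conventions exist for extending such fidelity measures to complex-valued fields; one possibility (sketched below with an illustrative function name, not a standardized definition) is to compute a single PSNR jointly over the real and imaginary (Cartesian) components:

```python
import numpy as np

def complex_psnr(reference, distorted):
    """PSNR computed jointly over the real and imaginary parts of a
    complex-valued wavefield, with the peak taken as the maximum
    amplitude of the reference. One possible convention among several."""
    # |a - b|^2 sums the squared errors of both Cartesian components,
    # so dividing by 2 gives the per-component MSE.
    mse = np.mean(np.abs(reference - distorted) ** 2) / 2.0
    peak = np.max(np.abs(reference))
    return 10.0 * np.log10(peak ** 2 / mse)
```

Other choices, such as reporting separate PSNR values per component or using a fixed peak, change the absolute numbers but not the ranking behaviour for small distortions.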
The prediction accuracy of the available objective measures can be analyzed by comparison with ground-truth visual quality scores obtained from subjective tests. Nonetheless, their performance is influenced by the position in the processing pipeline at which the prediction of visual quality is required (i.e. quality assessment can be performed in the hologram domain, after numerically reconstructing the captured scene, or even after post-processing the reconstructions to remove speckle noise). A comprehensive performance evaluation of some of the most prominent fidelity and image quality measures was performed in129. In this experiment, Fourier holograms of the DHMulti database105 were utilized. All holograms were distorted with compression artifacts and later visually annotated from different viewing angles and focal distances. The quality measures were first tested in the hologram domain, right after compression (QA_1). Additionally, to ensure that the test results were not biased by the characteristics of Fourier holograms, a lossless numerical conversion to the more conventional Fresnel hologram type was performed and the evaluation repeated (QA_2). Furthermore, all measures were tested on the numerically reconstructed scenes and objects (for the same 2D views accompanied by the subjective quality scores) before (QA_3) and after applying a speckle denoising procedure (QA_4). Multiple figures of merit were utilized to evaluate the prediction accuracy, monotonicity and consistency of the tested quality measures. Based on the results for all evaluation criteria and testing points, a usage guideline was provided to help interested users choose the most appropriate method for holographic VQA tasks until more hologram-oriented quality measures are designed (see Table 3).
Experimental results showed that, for each test track, a couple of quality measures correlate significantly with the subjectively attained visual scores. Nevertheless, no single measure is robust enough to reliably achieve the best results across all four testing points. Therefore, none of the available options for objective quality measurement is at a level that confidently alleviates the strong need for the design and development of dedicated VQA algorithms for holographic data. For more detail on the testing procedure and an analysis of the performance of each tested quality measure, the interested reader is referred to the original publication129.
Table 3. Usage recommendation of image quality measures for digital holograms, colour coded based on overall performance in each test-track. The metrics MSE, NMSE, PSNR, SSRM, SSIM, IW-SSIM, MS-SSIM, UQI, GMSD, FSIM, NLPD and VIFp are rated at the four testing points (QA_1 and QA_2 in the hologram plane; QA_3 and QA_4 in the object plane). Colour legend: advised to be utilized; neutral, or results not conclusive within the bounds of this research; usage is discouraged.
Special attention in quality assessment is needed in the case of QPI. Techniques such as DHM and HT address strict metrological requirements. Popular visual quality assessment metrics such as SSIM are not applicable here, and the impact of compression needs to be measured such that the induced differences are meaningfully expressed in terms of the final required values, such as the RMSE calculated on phase/RI values. Presently, in QPI applications of DHM, a user is most often interested in the quantitative phase retrieved from a hologram, which is used for analysis as shown in Fig. 5a. Similarly, in the case of HT, it is more appropriate to measure the error in the refractive index distribution, as shown in Fig. 5b.
This requires that the assessment be done on the finally processed signal after decompression. In DHM, phase unwrapping performed on the complex amplitude signal is a highly non-linear process; it was shown in31 that even small compression errors in the hologram can lead to large unwrapping errors. Some applications require even more robust phase unwrapping algorithms130, and the effects of compression should be evaluated in the appropriate framework to ensure relevant results. In the case of HT, too, a plethora of reconstruction methods exists, depending on the type of approximation (Born/Rytov) and the regularization being performed. Procedures such as phase unwrapping and tomographic reconstruction algorithms are still being actively developed, posing an additional challenge, since each algorithm requires separate characterization. Therefore, for measuring archival quality, metrics like the SNR calculated on the full complex amplitude information may be more relevant.
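As an illustration (the function name is ours), one way to express compression error directly on phase values while sidestepping the 2π ambiguity, at least when the local phase error stays well below π, is to evaluate the RMSE on the wrapped phase difference of the two complex fields:

```python
import numpy as np

def phase_rmse(reference_field, distorted_field):
    """RMSE between two phase maps, evaluated on the wrapped phase
    difference so that integer multiples of 2*pi cancel. Valid while
    the local phase error stays well below pi; for larger errors,
    full 2D phase unwrapping is unavoidable."""
    diff = np.angle(reference_field * np.conj(distorted_field))
    return np.sqrt(np.mean(diff ** 2))
```

For larger distortions, or when the unwrapped phase itself is the quantity of interest (e.g. dry mass), the error must instead be evaluated after the same unwrapping algorithm used in the target application, for the reasons discussed above.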
On the other hand, assessment based on other metrics, such as the performance drop in a specific classification problem, would be suitable for practitioners applying the methods to a particular use case. However, such an approach is not general enough to be considered in research focused solely on compression, due to the vastness of applications of QPI methods and the large variety of features that may be extracted from the images.
Compression strategies for digital holograms in biomedical and multimedia applications
- Light: Advanced Manufacturing 3, Article number: (2022)
- Received: 15 September 2021
- Revised: 30 June 2022
- Accepted: 01 July 2022
- Published online: 20 July 2022
- doi: https://doi.org/10.37188/lam.2022.040
Abstract: While 60 years of successful application of holography are celebrated in this special issue, the efficient representation and compression of holographic data have received relatively little attention in research. Notwithstanding this observation, and particularly due to the digitization that is also penetrating the holographic domain, interest is growing in how to compress holographic data efficiently, such that both interactive exchange of content and digital storage can be facilitated proficiently. This is a particular challenge, not only because of the interferometric nature of the data and its various representation formats, but also because of the often extremely large data volumes involved in pathological, tomographic, or high-end visualization applications. In this paper, we provide an overview of the state of the art in compression techniques and corresponding quality metrics for various practical applications in digital holography. We also look to the future by analyzing the emerging trends for addressing the key challenges in this domain.
Research Summary
Efficient representation and compression of holographic data are essential in multiple emerging applications. Holograms can easily result in large volumes of data, amounting to tens or even hundreds of gigabytes. Compression methods with efficient data representation models can facilitate cost-effective storage and simplify data sharing between systems, thereby improving the adoption of microscopic, tomographic, or high-end visualization applications.
The JPEG standardization committee has identified this need, and the JPEG Pleno standardization initiative is currently defining an interoperable and efficient compression solution. This paper provides an overview of the state of the art in the compression of digital holograms and the quality evaluation models needed for selecting suitable compression technology.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.