X-ray-based non-destructive 3D grain mapping techniques are well established at synchrotron facilities. To facilitate everyday access to grain mapping instruments, laboratory diffraction contrast tomography (LabDCT), using a laboratory-based conical polychromatic X-ray beam, has been developed and commercialized. Yet the currently available LabDCT grain reconstruction methods are either ill-suited for handling a large number of grains or require a commercial licence bound to a specific instrument. To promote the availability of LabDCT, grain reconstruction methods have been developed with multiple reconstruction algorithms based on both forward and back calculations. The different algorithms are presented in detail and their efficient implementation using parallel computing is described. The performance of different reconstruction methods is assessed on synthetic data. The code to implement all the described algorithms has been made publicly accessible with the intention of fostering the development of grain mapping techniques on widely available laboratory instruments.
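Although the abstract does not give implementation details, a forward-calculation grain reconstruction broadly works by simulating the diffraction spots a candidate grain (orientation plus centroid) would produce and scoring the candidate against the measured projections. The sketch below is a hypothetical, minimal version of such a completeness score; the function name, data layout, and acceptance cutoff are assumptions, not the released code.

```python
import numpy as np

def completeness(predicted_spots, measured_images, signal_threshold=0.0):
    """Fraction of forward-simulated diffraction spots that coincide with
    measured intensity above a threshold (hypothetical scoring function).

    predicted_spots : list of (projection_index, row, col) integer tuples
                      from a forward model of a candidate grain.
    measured_images : ndarray of shape (n_projections, n_rows, n_cols)
                      holding background-subtracted detector images.
    """
    hits = 0
    for proj, r, c in predicted_spots:
        if measured_images[proj, r, c] > signal_threshold:
            hits += 1
    return hits / max(len(predicted_spots), 1)

# A candidate grain would be accepted when its completeness exceeds a chosen
# cutoff, e.g. completeness(spots, images) > 0.75, and rejected otherwise.
```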
In this study, we compared the repeatability and reproducibility of radiomic features obtained from positron emission tomography (PET) images according to the reconstruction algorithm used, either advanced reconstruction algorithms such as HYPER iterative (IT), HYPER deep learning reconstruction (DLR), and HYPER deep progressive reconstruction (DPR), or traditional ordered subset expectation maximization (OSEM), to understand the potential variations and implications of using advanced reconstruction techniques in PET-based radiomics. We used a heterogeneous phantom with acrylic spherical beads (4- or 8-mm diameter) filled with 18F. PET images were acquired and reconstructed using OSEM, IT, DLR, and DPR. Original and wavelet radiomic features were calculated using SlicerRadiomics. Radiomic feature repeatability was assessed using the coefficient of variation (COV) and intraclass correlation coefficient (ICC), and inter-acquisition-time reproducibility was assessed using the concordance correlation coefficient (CCC). For both the 4- and 8-mm bead phantoms, the proportion of radiomic features with a COV < 10% was equivalent to or higher for the advanced reconstruction algorithms than for OSEM. ICC indicated that the advanced methods generally outperformed OSEM in repeatability, except for the original features of the 8-mm bead phantom. In the inter-acquisition-time reproducibility analysis, the combination of 3- and 5-min acquisitions exhibited the highest reproducibility in both phantoms, with IT and DPR showing the highest proportion of radiomic features with CCC > 0.8. Advanced reconstruction methods provided enhanced stability of radiomic features compared with OSEM, suggesting their potential for optimal image reconstruction in PET-based radiomics, with potential benefits for clinical diagnostics and prognostics.
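For reference, the repeatability and reproducibility metrics used here can be computed directly from repeated feature measurements. A minimal sketch, assuming each feature is a 1-D array of values across repeated acquisitions (illustrative only, not the study's code; the example values are hypothetical):

```python
import numpy as np

def coefficient_of_variation(x):
    """COV (%) of one radiomic feature across repeated acquisitions."""
    x = np.asarray(x, dtype=float)
    return 100.0 * x.std(ddof=1) / abs(x.mean())

def concordance_correlation(x, y):
    """Lin's concordance correlation coefficient between two acquisition times."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    return 2.0 * sxy / (x.var(ddof=1) + y.var(ddof=1) + (x.mean() - y.mean()) ** 2)

# A feature is flagged as repeatable if COV < 10% and as reproducible between
# two acquisition times if CCC > 0.8, matching the thresholds quoted above.
feature_run1 = [105.2, 103.9, 106.1, 104.4]   # hypothetical feature values
feature_run2 = [104.8, 104.2, 105.7, 104.9]
print(coefficient_of_variation(feature_run1),
      concordance_correlation(feature_run1, feature_run2))
```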
Purpose: Radiomics analysis of oncologic positron emission tomography (PET) images is an area of significant activity and potential. The reproducibility of radiomics features is an important consideration for routine clinical use. This preliminary study investigates the robustness of radiomics features in PSMA-PET images across penalized-likelihood (***) and standard ordered subset expectation maximization (OSEM) reconstruction algorithms and their setting parameters in a phantom and in prostate cancer (PCa) patients. Method: A NEMA image quality (IQ) phantom and 8 PCa patients were selected for the phantom and patient analyses, respectively. PET images were reconstructed using the *** method (reconstruction beta-value: 100-700, at intervals of 100, for both the NEMA IQ phantom and patients) and OSEM (durations of 15 s, 30 s, 1 min, 2 min, 3 min, 4 min and 5 min for the NEMA phantom and 30 s, 1 min and 2 min for patients). Subsequently, 129 radiomic features were extracted from the reconstructed images. The coefficient of variation (COV) of each feature across reconstruction methods and their parameters was calculated to determine feature robustness. Results: The extracted radiomics features showed different ranges of variability depending on the reconstruction algorithm and setting parameters. Specifically, 23.0% and 53.5% of features were robust against beta-value variations in *** and against different durations in OSEM, respectively. Taking into account the two algorithms and their parameters, eleven features (8.5%) showed COV <= 5% and eighteen (14%) showed 5% < COV <= 20%. The mean COVs of the extracted radiomics features were significantly different between the two reconstruction methods (p < 0.05) except for the phantom mo
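As a concrete illustration of the robustness criterion described above, one can compute the COV of each feature across all reconstruction settings and bin features by the reported thresholds. This is a hedged sketch under that assumption; the feature names, data layout, and bin labels are hypothetical, not the study's code.

```python
import numpy as np

def robustness_bins(feature_table):
    """Classify features by COV across reconstruction settings.

    feature_table : dict mapping feature name -> sequence of values, one value
                    per reconstruction setting (beta-value or duration).
    Returns a dict mapping feature name -> 'robust' (COV <= 5%),
    'intermediate' (5% < COV <= 20%) or 'non-robust' (COV > 20%).
    """
    bins = {}
    for name, values in feature_table.items():
        v = np.asarray(values, dtype=float)
        cov = 100.0 * v.std(ddof=1) / abs(v.mean())
        bins[name] = ("robust" if cov <= 5 else
                      "intermediate" if cov <= 20 else
                      "non-robust")
    return bins

# e.g. robustness_bins({"glcm_Contrast": [1.10, 1.12, 1.09, 1.21, 1.30]})
```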
ISBN (Digital): 9798350375442
ISBN (Print): 9798350375459
With the rapid development of computer vision technology, automatic image detection has become increasingly important in various fields. This paper focuses on the reconstruction algorithm in automatic computer image detection and proposes solutions to the limitations of existing techniques. The SRCNN (super-resolution convolutional neural network) algorithm is selected and compared with bilinear interpolation and the SRGAN (super-resolution generative adversarial network) algorithm. The experimental results show that the reconstruction quality of the SRCNN algorithm is markedly higher than that of the other two algorithms, with a maximum PSNR of 60 and a minimum of 41, and its computational efficiency is also higher. These results open new research possibilities for image reconstruction in fields such as medical imaging and industrial inspection.
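For context, SRCNN is a three-layer convolutional network applied to an interpolated low-resolution input, and PSNR is the quality metric quoted above. The PyTorch sketch below follows the standard SRCNN layout (9-1-5 kernels, 64/32 filters); the exact configuration and training setup of the paper are not reproduced, so treat this as an assumption-laden illustration.

```python
import torch
import torch.nn as nn

class SRCNN(nn.Module):
    """Classic three-layer SRCNN: patch extraction, non-linear mapping, reconstruction."""
    def __init__(self, channels=1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 64, kernel_size=9, padding=4), nn.ReLU(inplace=True),
            nn.Conv2d(64, 32, kernel_size=1),                  nn.ReLU(inplace=True),
            nn.Conv2d(32, channels, kernel_size=5, padding=2),
        )

    def forward(self, x):          # x: bicubic-upscaled low-resolution image batch
        return self.body(x)

def psnr(reference, reconstruction, max_value=1.0):
    """Peak signal-to-noise ratio (dB) between two images with values in [0, max_value]."""
    mse = torch.mean((reference - reconstruction) ** 2)
    return 10.0 * torch.log10(max_value ** 2 / mse)
```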
Terahertz (THz) imaging has been regarded as a cutting-edge technology in a wide range of applications due to its ability to penetrate opaque materials, its non-invasive nature, and its large bandwidth capacity. Recently, THz imaging has been extensively researched for security, driver-assistance technology, non-destructive testing, and medical applications. The objective of this review is to summarize the selection criteria for current state-of-the-art THz image reconstruction algorithms developed for short-range imaging applications over the last two decades. Moreover, we summarize the selected algorithms' performance and their implementation process. This study provides an in-depth understanding of the fundamentals of image reconstruction algorithms for THz short-range imaging and of future aspects of algorithm processing and selection. (C) 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement
The sparsity in levels model recently inspired a new generation of effective acquisition and reconstruction modalities for compressive imaging. Moreover, it naturally arises in various areas of signal processing such as parallel acquisition, radar, and the sparse corruptions problem. Reconstruction strategies for sparse-in-levels signals usually rely on a suitable convex optimization program. Notably, although iterative and greedy algorithms can outperform convex optimization in terms of computational efficiency and have been studied extensively in the case of standard sparsity, little is known about their generalizations to the sparse-in-levels setting. In this paper, we bridge this gap by showing new stable and robust uniform recovery guarantees for sparse-in-levels variants of the iterative hard thresholding and CoSaMP algorithms. Our theoretical analysis generalizes recovery guarantees currently available in the case of standard sparsity and compares favorably to sparse-in-levels guarantees for weighted l1 minimization. In addition, we propose and numerically test an extension of the orthogonal matching pursuit algorithm for sparse-in-levels signals.
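To make the algorithmic idea concrete: a sparse-in-levels variant of iterative hard thresholding replaces the usual best s-term thresholding with a level-wise operator that keeps the s_l largest-magnitude entries within each level. The following is a minimal sketch under that assumption (generic gradient step, fixed step size), not the paper's implementation.

```python
import numpy as np

def threshold_in_levels(x, levels, sparsities):
    """Keep the s_l largest-magnitude entries within each level, zero the rest.

    levels     : list of index arrays partitioning range(len(x)) into levels.
    sparsities : list of per-level sparsity budgets s_l.
    """
    out = np.zeros_like(x)
    for idx, s in zip(levels, sparsities):
        idx = np.asarray(idx)
        if s <= 0:
            continue
        keep = idx[np.argsort(np.abs(x[idx]))[-s:]]
        out[keep] = x[keep]
    return out

def iht_in_levels(A, y, levels, sparsities, n_iter=100, step=1.0):
    """Iterative hard thresholding in levels: x <- H_levels(x + step * A^T (y - A x))."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = threshold_in_levels(x + step * A.T @ (y - A @ x), levels, sparsities)
    return x
```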
ISBN (Digital): 9798350375909
ISBN (Print): 9798350375916
The theory of compressive sensing (CS) has been widely used in the field of image signal processing, and CS can improve the reconstruction quality of images. Whereas traditional iterative algorithms waste resources through excessive iterations, this paper proposes a deep reconstruction network, CSResNet, to reduce the amount of computation. Firstly, network training is used to replace the fixed sampling matrix. Secondly, the structure is optimized by adding a residual network to form a deep network that recovers the image. Finally, experimental results demonstrate the advantages of the proposed algorithm.
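The abstract gives only the high-level design: a learned sampling stage followed by a residual reconstruction network. The PyTorch sketch below is a hypothetical rendering of that structure (block size, measurement ratio, and layer counts are assumptions), not the authors' CSResNet.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.conv(x)          # residual connection

class LearnedCSReconstructor(nn.Module):
    """Learned sampling (a strided convolution stands in for the sampling matrix),
    an initial reconstruction, and a residual refinement network."""
    def __init__(self, block_size=32, ratio=0.25, n_blocks=4):
        super().__init__()
        m = max(1, int(ratio * block_size * block_size))
        self.sample = nn.Conv2d(1, m, block_size, stride=block_size, bias=False)
        self.init_recon = nn.ConvTranspose2d(m, 1, block_size, stride=block_size, bias=False)
        self.refine = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1),
            *[ResidualBlock(64) for _ in range(n_blocks)],
            nn.Conv2d(64, 1, 3, padding=1),
        )

    def forward(self, x):
        y = self.sample(x)               # block-wise compressed measurements
        x0 = self.init_recon(y)          # coarse initial reconstruction
        return x0 + self.refine(x0)      # residual refinement
```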
ISBN (Digital): 9798350365504
ISBN (Print): 9798350365511
In the era of the Internet of Medical Things (IoMT), modern healthcare devices generate vast amounts of data, necessitating enhanced data processing, storage, and transmission bandwidth along with increased power consumption, especially in sensing applications. Compressive Sensing (CS) addresses these challenges by enabling signal acquisition with fewer samples than the traditional Nyquist rate requires, thereby conserving power. This is particularly beneficial for mobile healthcare applications, such as electrocardiogram (ECG) monitoring, which require continuous monitoring and substantial power and bandwidth for signal transmission and reconstruction. Despite the various CS strategies and reconstruction algorithms explored for ECG signals, achieving high accuracy at a high compression ratio remains a challenge. This research analyzes six sensing strategies, two sparse bases, and nine reconstruction algorithms to identify the most efficient method for ECG signal processing within the CS framework. Our analysis shows that RL1 with the db2 sparse basis delivers the best average performance with the RGM, RBMB, RBMS, and RLDPC sensing matrices, while CVX-L1 performs well with the SFM sensing matrix.
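As background for the comparison above, the CS acquisition step reduces an N-sample ECG window to M < N measurements via a sensing matrix. The sketch below uses a generic random Gaussian matrix purely as a placeholder for the matrices evaluated in the paper (RGM, RBMB, RBMS, RLDPC, SFM), whose constructions are not reproduced here.

```python
import numpy as np

def compress_ecg(window, compression_ratio=4, seed=0):
    """Compress one ECG window x (length N) to y = Phi @ x with M ~= N / CR rows.

    A random Gaussian Phi stands in for the sensing matrices compared in the paper.
    """
    x = np.asarray(window, dtype=float)
    n = x.size
    m = max(1, int(round(n / compression_ratio)))
    phi = np.random.default_rng(seed).standard_normal((m, n)) / np.sqrt(m)
    return phi @ x, phi

# Example: y, phi = compress_ecg(np.sin(np.linspace(0, 8 * np.pi, 256)))
```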
ISBN (Digital): 9798350371154
ISBN (Print): 9798350371161
For low-power, energy-efficient wearable edge health devices and health monitoring systems, compressed sensing (CS) achieves significant advances by sampling sparse signals such as electrocardiograms (ECG) at sub-Nyquist rates. This paper explores the best sparse reconstruction algorithm when paired with a sparsifying matrix composed of a discrete cosine (C) and a discrete sine (S) basis [C S] and a deterministic binary block diagonal (DBBD) sensing matrix. The six sparse reconstruction algorithms used for the recovery of CS ECG are orthogonal matching pursuit (OMP), approximate message passing (AMP), L1-minimization (L1-min), compressive sampling matching pursuit (CoSaMP), iterative hard thresholding (IHT), and iterative soft thresholding (IST). ECG signals from the 48 records of the MIT-BIH arrhythmia database (mitdb) were tested for CS reconstruction at compression ratios (CR) of 2.4, 3, 4, 4.8, 6 and 8. The results showed that the combination of the OMP algorithm, the DBBD sensing matrix, and the [C S] basis surpassed the other recovery algorithms in terms of lower average percentage root mean square difference (PRD). This combination achieved the lowest average PRD of 1.04 for mitdb record 213 at a CR of 2.4. Although L1-min exhibits performance competitive with OMP in terms of PRD, OMP is computationally simpler than L1-min and has a shorter recovery time.
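The building blocks named above can be sketched compactly: a [C S] dictionary formed by concatenating orthonormal DCT and DST bases, a simple binary block-diagonal sensing matrix, a plain OMP recovery loop, and the PRD metric. This is a minimal sketch under assumed constructions (e.g. equal-sized blocks of ones in the DBBD matrix), not the paper's exact setup.

```python
import numpy as np
from scipy.fft import dct, dst

def cs_dictionary(n):
    """Concatenated orthonormal DCT and DST bases: the [C S] sparsifying dictionary."""
    eye = np.eye(n)
    return np.hstack([dct(eye, norm="ortho", axis=0), dst(eye, norm="ortho", axis=0)])

def dbbd_matrix(m, n):
    """Deterministic binary block-diagonal sensing matrix (assumed n/m ones per row)."""
    phi = np.zeros((m, n))
    block = n // m
    for i in range(m):
        phi[i, i * block:(i + 1) * block] = 1.0
    return phi

def omp(A, y, n_nonzero):
    """Orthogonal matching pursuit: greedily pick atoms, re-fit support by least squares."""
    residual, support = y.copy(), []
    x = np.zeros(A.shape[1])
    for _ in range(n_nonzero):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x[support] = coeffs
    return x

def prd(x, x_hat):
    """Percentage root-mean-square difference between original and reconstruction."""
    return 100.0 * np.linalg.norm(x - x_hat) / np.linalg.norm(x)

# Recovery pipeline: y = Phi @ x, solve (Phi @ Psi) s = y for sparse s with omp(),
# then reconstruct x_hat = Psi @ s and score it with prd(x, x_hat).
```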
ISBN (Digital): 9798350388152
ISBN (Print): 9798350388169
In the context of Dismantling and Decommissioning activities, the establishment of a precise radiological map is essential, as it facilitates the detection of contaminated areas. This task mainly relies on manual labor executed by radiation protection operators. Nonetheless, this conventional approach is prone to human error, physically taxing for operators, and exposes them to potentially hazardous environments. Consequently, the literature is exploring alternative methodologies, such as integrating radiological measurements with Simultaneous Localization and Mapping (SLAM) techniques, enabling concurrent mapping of the environment while determining the sensor location, independent of GPS or GNSS systems, which are typically ineffective inside indoor nuclear facilities. However, existing solutions in the literature often have limitations, such as focusing on specific types of nuclear measurements, being cumbersome in design, and necessitating post-processing steps. To address these limitations, we developed a new modular device for real-time 3D environment reconstruction and localization of radioactivity measurements, based on the Visual Inertial Semi-direct Visual Odometry (VI-SVO) algorithm, which recent works in the literature have shown to be the most suitable algorithm for operating in a nuclear facility undergoing dismantling. To evaluate the ability of the VI-SVO-based device to correctly localize radioactivity measurements, we acquired new datasets with the device, both in the laboratory with point γ and β sources and in a real environment undergoing dismantling. To the best of our knowledge, this is the first time in the literature that a single device has correctly localized different types of radiation; thanks to its modularity, it can be coupled to different radiological sensors. The algorithm takes into account real-world constraints and was able to correctly identify γ and β count rates from different radiological sources in a real and
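While the abstract does not detail the fusion step, localizing radiological measurements with a SLAM pose estimate ultimately amounts to associating each count-rate sample with the pose whose timestamp is closest. The sketch below is a hypothetical version of that association step (function name, tolerances, and data layout are assumptions), not the authors' implementation.

```python
import numpy as np

def georeference_counts(pose_times, poses, count_times, count_rates, max_dt=0.05):
    """Attach each count-rate sample to the nearest-in-time SLAM pose.

    pose_times  : sorted 1-D array of pose timestamps (s)
    poses       : array of shape (len(pose_times), 3) with x, y, z positions (m)
    count_times : 1-D array of detector timestamps (s)
    count_rates : 1-D array of count rates (counts/s)
    max_dt      : reject samples whose nearest pose is farther than this in time (s)
    """
    idx = np.searchsorted(pose_times, count_times)
    idx = np.clip(idx, 1, len(pose_times) - 1)
    left_closer = (count_times - pose_times[idx - 1]) < (pose_times[idx] - count_times)
    idx = np.where(left_closer, idx - 1, idx)
    ok = np.abs(pose_times[idx] - count_times) <= max_dt
    # Each returned row is (x, y, z, count_rate): a localized radiological sample.
    return np.column_stack([poses[idx[ok]], count_rates[ok]])
```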