We perform a series of calculations using simulated QPUs, accelerated by the NVIDIA CUDA-Q platform, focusing on a molecular analog of an amine-functionalized metal-organic framework (MOF), a promising class of materials...
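The preview omits the computational details, but a minimal CUDA-Q sketch of the kind of simulated-QPU workflow it describes might look as follows. The toy two-qubit Hamiltonian and single-parameter ansatz are illustrative stand-ins adapted from CUDA-Q's introductory material, not the MOF-analog model used in the study.

```python
import cudaq
from cudaq import spin

# GPU-accelerated state-vector simulator target; other simulator targets work as well.
cudaq.set_target("nvidia")

# Toy two-qubit Hamiltonian standing in for a molecular operator
# (illustrative only, not the amine-functionalized MOF analog).
hamiltonian = (5.907 - 2.1433 * spin.x(0) * spin.x(1)
               - 2.1433 * spin.y(0) * spin.y(1)
               + 0.21829 * spin.z(0) - 6.125 * spin.z(1))

@cudaq.kernel
def ansatz(theta: float):
    q = cudaq.qvector(2)
    x(q[0])
    ry(theta, q[1])
    x.ctrl(q[1], q[0])

# Sweep the variational parameter and report the lowest energy found.
best = min(((t / 100.0, cudaq.observe(ansatz, hamiltonian, t / 100.0).expectation())
            for t in range(-314, 315, 5)), key=lambda e: e[1])
print(f"theta = {best[0]:.2f}, energy = {best[1]:.4f}")
```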
To achieve long-pulse steady operation, the physical mechanisms of boundary turbulence need further investigation. We employ the two-fluid model with flute reduction in BOUT++ to simulate the boundary plasma in tokamaks. The space and time scales of the turbulence reproduced by our simulations are closely tied to the spatial mesh size and the time step, respectively. As an inherent time scale, the Alfvén time is sufficient to resolve MHD instabilities. The spatial scale can be refined by increasing the mesh resolution, which requires larger-scale parallel computing resources. We have conducted nonlinear simulations with more than 33 million spatial mesh points on 16,384 CPU processors in parallel. The results indicate that although parallel efficiency decreases as the number of cores grows, so that adding cores does not necessarily shorten runtimes, higher computational complexity improves parallel efficiency for a given core count. In addition, the mesh resolution required for convergence differs between linear and nonlinear simulations, with nonlinear simulations demanding higher resolution. Besides resolving finer structures, the density-fluctuation characteristics similar to the WCM, which agree better with experimental observations, also underline the need for high-resolution meshes and large-scale computing in the future.
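As a side note on the parallel-efficiency bookkeeping discussed above, the quantity is conventionally defined relative to a reference run; a small illustrative helper, with hypothetical runtimes not taken from the paper, is:

```python
def parallel_efficiency(t_ref: float, n_ref: int, t_n: float, n: int) -> float:
    """Strong-scaling efficiency E = (t_ref * n_ref) / (t_n * n) relative to a reference run."""
    return (t_ref * n_ref) / (t_n * n)

# Hypothetical runtimes: doubling cores from 8,192 to 16,384 shortens the wall time,
# but not by half, so the efficiency drops below 1.
print(parallel_efficiency(t_ref=100.0, n_ref=8192, t_n=60.0, n=16384))  # ~0.83
```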
ISBN (print): 1424425328
We developed an automated continuous real-time QTc interval monitoring algorithm for the critical-care setting. The performance of the QT interval measurement algorithm was tested on the PhysioNet adult QT ECG dataset (n = 105), and on a pediatric ECG dataset (n = 20) and a neonatal dataset (n = 24) recorded from intensive care units. Algorithm performance is measured by sensitivity (the ability to measure the QT interval) and accuracy (the difference between the automated QT measurements and cardiologists' manual annotations). We obtained 92% sensitivity in the adult group, 85% in the pediatric group, and 75% in the neonatal group. On the 95 adult cases that had both an algorithm and a cardiologist measurement, the mean difference was 1 ms with a standard deviation of 35 ms. On the pediatric ECGs, the mean difference was -12 ms with a standard deviation of 20 ms. In the neonatal cases, the mean difference was -6 ms with a standard deviation of 12 ms.
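The accuracy figures reported above are the mean and standard deviation of paired differences between automated and manual measurements; a minimal sketch of that computation, with made-up numbers rather than the study's data, is:

```python
import numpy as np

def qt_agreement(auto_ms: np.ndarray, manual_ms: np.ndarray) -> tuple[float, float]:
    """Mean and standard deviation (ms) of automated-minus-manual QT differences."""
    diff = auto_ms - manual_ms
    return float(np.mean(diff)), float(np.std(diff, ddof=1))

# Hypothetical paired QT measurements in milliseconds.
auto = np.array([402.0, 388.0, 415.0, 396.0])
manual = np.array([400.0, 392.0, 410.0, 398.0])
print(qt_agreement(auto, manual))
```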
A number of cardiac conditions such as acute pericarditis (PC) and early repolarization (ER) cause ST elevation which mimics ST-segment Elevation Myocardial Infarction (STEMI). Current guidelines recommend analyzing S...
Sepsis is a serious medical condition caused by the body's response to an infection. Early prediction and treatment of sepsis are critical. In response to the PhysioNet/CinC Challenge 2019, we developed an algorit...
Without reference annotation, statistical metrics such as sensitivity and positive predictive value (PPV) cannot be calculated. Annotating a large ECG database may not be feasible, hence, the interest in developing an...
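For reference, the two metrics named in this preview are defined from true positives, false positives, and false negatives; a one-function sketch, not the paper's code, is:

```python
def sensitivity_and_ppv(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); PPV = TP / (TP + FP).
    Both require reference annotations to count TP, FP, and FN."""
    return tp / (tp + fn), tp / (tp + fp)

# Hypothetical detection counts scored against a reference annotation.
print(sensitivity_and_ppv(tp=950, fp=30, fn=50))  # (0.95, ~0.969)
```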
ISBN (print): 9781509008964
Parameters extracted from ECG recordings show different noise-tolerance levels. Some parameters may be only slightly affected by noise, while others could become inaccurate. Choosing a single noise threshold for all parameters may lead to adopting invalid results or discarding valid parameters. In this study, we develop a statistical model relating the Signal-to-Noise Ratio (SNR) to our Signal Quality Indicator (SQI) algorithm and determine noise-tolerance thresholds for several ECG parameters statistically. Our dataset was based on the STAFF-III database with added physical noise segments from the MIT-BIH database containing electrode motion, muscle artifact, and baseline wander. We generated 3193 noise-added 12-lead ECG signals with SNR varying from -6 dB to 24 dB. For each 10-second segment, the noise level was measured by our SQI algorithm and the ECG parameters were measured by the Philips DXL ECG algorithm, allowing us to derive a noise model and noise-tolerance thresholds for the ECG parameters. The varying thresholds suggest using parameter-specific thresholds to avoid a reduction in accuracy.
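One common way to produce such noise-added records is to rescale a noise segment so the mixture hits a target SNR before adding it to the clean signal; a hedged sketch of that step, not necessarily the exact procedure used in the study, is:

```python
import numpy as np

def add_noise_at_snr(clean: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Scale `noise` so the clean-to-noise power ratio equals `snr_db`, then add it."""
    p_signal = np.mean(clean ** 2)
    p_noise = np.mean(noise ** 2)
    scale = np.sqrt(p_signal / (p_noise * 10.0 ** (snr_db / 10.0)))
    return clean + scale * noise

# Example: contaminate a synthetic 10-second, 500 Hz lead at 0 dB SNR.
rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * 1.2 * np.arange(5000) / 500.0)
noisy = add_noise_at_snr(clean, rng.standard_normal(5000), snr_db=0.0)
```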
Allocating the cost of empty railcar miles to partners in a railcar pooling system is an important pricing problem in railway management. Recently, the authors of this paper proposed a cost allocation scheme for empty railcar movements based on game theory that explicitly considers the level of participation and contribution from each partner, the costs generated before and after cooperation, and the overall benefit obtained by each partner because of cooperation. This paper compares the performance of the model with three other cost allocation models with respect to fairness, stability, and computational efficiency. The comparison is made with two scenarios adapted from examples documented in the literature. The results indicate that the cost allocation scheme based on game theory outperforms other methods in ensuring fairness and enhancing stability in a coalition. Most remarkably, it yields reasonable results even in situations in which other models behave poorly. Computationally, it is manageable for practical problems.
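The paper's own allocation scheme is not reproduced here, but a standard cooperative-game baseline such as the Shapley value illustrates the general idea of splitting a joint empty-mileage cost by average marginal contribution; the coalition costs below are hypothetical.

```python
from itertools import permutations
from math import factorial

def shapley_values(players, cost):
    """Shapley value of a cost game: each player's share is its marginal cost,
    averaged over all orders in which the coalition could have formed."""
    values = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = set()
        for p in order:
            values[p] += cost(coalition | {p}) - cost(coalition)
            coalition.add(p)
    n_orders = factorial(len(players))
    return {p: v / n_orders for p, v in values.items()}

# Hypothetical empty-mileage costs (thousand railcar-miles) for a three-railroad pool.
costs = {frozenset(): 0, frozenset("A"): 10, frozenset("B"): 12, frozenset("C"): 8,
         frozenset("AB"): 18, frozenset("AC"): 15, frozenset("BC"): 17, frozenset("ABC"): 22}
print(shapley_values("ABC", lambda s: costs[frozenset(s)]))
```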